On Sunday the first known case of a self-driving car being involved in a fatal accident occurred.
Yes, there was an earlier fatality back in 2016, when a Tesla crashed while in autopilot mode and its ‘driver’ was killed; but that case was deemed to be only very partially Tesla’s fault, because the driver had ignored seven warnings immediately prior to the crash asking him to retake control of the car. Unlike the Tesla situation, this case involved a fully self-driving car, although it also had a human ‘safety driver’ in the vehicle.
Let’s first carefully state the facts, then ponder their meaning and the broader issues.
It seems the vehicle was proceeding north up a road that had two lanes of traffic going north, a wide median strip, and on the other side, a matching two lanes of traffic going south. The accident occurred at 10pm, when it was fully dark.
The vehicle was apparently driving at 38 mph, and the speed limit was 45 mph. Traffic was light, and the road ahead was clear. The vehicle was in the right-hand of the two northbound lanes and seemed to be proceeding in a proper and lawful manner in every respect.
Just past a cut in the median, as the car was proceeding, a woman in dark clothing appeared in the headlights, very close, walking across the road while pushing a bicycle. She had already crossed from the other side, gone over the median strip, through the left lane of traffic, and suddenly appeared in the middle of the right lane, directly in front of the vehicle, on her way to the other side of the road. It was not a crosswalk, although there was one approximately 400 ft away. There were occasional streetlights, but the woman chose a spot between streetlights that was poorly lit.
Neither the vehicle itself nor the safety driver had time to react. The woman died shortly thereafter from the injuries she suffered.
Video from the vehicle’s forward-facing camera has now been released. You can see what the car saw, yourself, here. Please keep in mind, while doing so, that you are staring intently at the video and straining to see the hidden pedestrian that you know is somewhere out there. Try to guess how you would have reacted when just driving along normally, with what appears to be an ordinary empty road stretching out in front of you.
There is also a second video clip taken from the vehicle’s dash, recording the ‘safety driver’. It shows the safety driver to have been inattentive, but by coincidence, looking up and out at the road just a second or so prior to the accident, and then a classic look of horror and surprise appears on the driver’s face. That can be seen here.
Two things happen at night, as we all know. The first is that cars become easier to see, because their bright headlights stand out from the darkness that is otherwise all around them. It actually becomes easier for pedestrians to see cars at night than during the day.
The second is that there is no reciprocity in this visual phenomenon – it becomes harder for drivers to see pedestrians, because they generally tend to fade into the half-light or total darkness outside of the main headlight beams. A person in dark clothing in particular can be very hard to see. The ‘invisible’ nature of the woman and the way she suddenly appeared as if from nowhere can be clearly seen in the video.
One is left with the uncomfortable realization that this woman was crossing four lanes of traffic, at night, in dark clothing, not at a cross-walk, and walked directly into the path of an oncoming vehicle that she surely must have clearly seen from some distance back. Perhaps she mistakenly thought she would be clearly seen, and perhaps she expected the vehicle to brake and give way to her. But she wasn’t seen, the vehicle didn’t stop, and she did not appear to take any desperate last-minute evasive action either.
The Tempe police chief has been quoted as saying that the accident would have been difficult to avoid, whether a person was driving or not, and that Uber’s self-driving vehicle is likely not at fault.
The Duty to Give Way
Most states make it plain that no matter who has right of way, all parties sharing the road have an overriding, shared obligation to avoid accidents with each other.
But it seems clear that the underlying cause of this accident was the woman’s deliberate decision to walk across four lanes of traffic at a dangerous location.
No-one will be surprised to learn that the woman’s friends are now beatifying her and calling for a total end to all self-driving testing. A cynic would wonder why tragedies apparently only ever occur to wonderful people, but that’s perhaps a topic for another time. The friends haven’t offered any explanations for her actions, preferring instead to anoint her with praise and heap ordure on all self-driving vehicle programs.
Some pedestrians and cyclists aggressively refuse to give way, passive-aggressively challenging vehicles to either hit them or yield. We’ve all seen pedestrians like that, haven’t we – people who deliberately pretend they don’t see our car, forcing us to give way to them. But such passive aggression usually occurs at slow speed in parking lots where they know they are visible, not at night on a dual carriageway with a posted 45 mph speed limit.
I have neighbors who view the public road as a children’s playground, and demand that their children have priority to play on the street, overriding the right of cars to proceed normally along the road (even though they live in single family dwellings with reasonable sized lots). In their opinion, placing a traffic cone on the street unilaterally changes it from a street to a children’s playground. And while that claim is totally wrong, the overriding duty to avoid accidents means I have to slow to a crawl or even stop while waiting for their children to grudgingly clear a space for me to then drive through.
Who Was at Fault?
In this case, it seems impossible to avoid the same conclusion the police have reached – the woman’s actions were the primary cause of this accident. She broke the law by not crossing at a cross-walk, and then earned a Darwin award when she voluntarily walked in front of an oncoming car.
But what about the duty of the car to avoid her, even when she was acting irresponsibly? It is possible a human driver would have had better night vision, seeing more detail in the dark than the camera shows; but balancing that, it is also probable that unless the driver was extraordinarily alert and anticipating such an event, the first faint hints of a person on the road would have been overlooked, and by the time the woman’s presence had been registered and responded to, it would have been too late to prevent the tragedy that followed.
It is more puzzling that the vehicle’s LIDAR sensors do not seem to have detected the woman (they work as well in the dark as in the daytime and can see very small things – even a pedestrian’s outstretched arm and hand). (Here’s a brief but interesting example of what cars ‘see’ when driving by themselves.)
More likely, the LIDAR did detect the woman, but perhaps due to her slow movement, improbable location, and lack of confirmation from other sensor systems, the software dismissed her as a sensor ‘artifact’ rather than a genuine hazard. Self-driving cars can have problems with stationary or slow-moving objects; they may sometimes ignore them as ‘noise’ and ‘errors’ rather than treating them as true objects to be responded to.
A perfect self-driving car would have done a better job of detecting and avoiding the woman, but before we start to demand perfection in our cars, isn’t it fair to first seek slightly more rational behavior on the part of pedestrians? What part of ‘don’t step out in front of a car, away from a cross-walk, away from street lights, in dark clothing, and on a multi-lane 45 mph highway, at night’ is hard to understand or comply with?
Would a full-time human driver in a car with no automation have done a better job of detecting and avoiding the woman? You might wish to think so, and perhaps if it was you at the wheel, fully alert and anxiously scanning the darkness for wayward pedestrians, that might have been the case. But in 2016, 5,997 pedestrians were killed by cars in the US (including many in daylight and even many on crosswalks), so the odds are clearly still very much against pedestrians in general. We are all taught to be defensive drivers; pedestrians need to be doubly schooled in the art of being defensive pedestrians.
Some Risk Statistics
To put this one accident in context, on the same day, probably 110 or more other people also died in traffic accidents on US roads, and another 12,600 were injured (2016 statistics). There are about 16,000 vehicle accident events every day, some involving only one vehicle, most involving two or more. This accident count is of course a rather conjectural number because many are unreported, and many are subjective – at what point does a trivial ‘fender bender’ become an ‘accident’?
This data needs to be viewed through two additional filters. First, it is fair to note that while the one death from a self-driving vehicle is less than 1% of that day’s total, the number of self-driving cars on the roads is way less than 1% of all vehicles. So that makes it a significant event in the annals of self-driving cars. Google’s Waymo self-driving car project claims to have amassed more than 5 million miles of self-driving experience, as well as billions of miles of less useful computer simulations, but even 5 million miles is a trivial distance on which to base any findings.
But that leads to the second point. In the entire history of self-driving vehicles, limited as it is, the total fatality count sits at one, whereas the 110 ‘ordinary’ fatalities is a daily count, increasing every day. In total, about 40,000 people are killed every year on US roads, a number which, after steadily declining (probably due to safety innovations), has now started to climb again (perhaps reflecting a lack of recent additional safety innovations, along with overall increases in cars on the roads and total miles driven).
To put this in further context, as deadly as motor vehicles are in total, the fatality rate is very low. Currently there are about 1.25 fatalities per 100 million vehicle miles traveled (2016 statistic). Most of us will go our entire lives without even driving 1 million miles, and so while we all know people who have had accidents or suffered deaths in their families from traffic accidents, the chances of it happening to us directly is clearly deemed by society to be acceptably low.
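To make the per-mile rate concrete, here is a minimal back-of-the-envelope sketch; the 600,000 lifetime-mile figure is purely an illustrative assumption (chosen to sit well under the 1 million miles mentioned above), not a statistic from any source:

```python
# Back-of-the-envelope lifetime risk from the quoted 2016 US rate of
# 1.25 fatalities per 100 million vehicle miles traveled.
FATALITIES_PER_100M_MILES = 1.25

def lifetime_fatality_odds(lifetime_miles):
    """Expected probability of dying in a crash over the given mileage."""
    return FATALITIES_PER_100M_MILES * lifetime_miles / 100_000_000

# Hypothetical driver covering 600,000 miles over a lifetime:
odds = lifetime_fatality_odds(600_000)
print(f"lifetime odds = {odds:.4f} (about 1 in {1 / odds:.0f})")
```

At that assumed mileage the lifetime odds work out to under one percent, which is one way of seeing why society treats the per-mile risk as acceptably low.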
It is interesting to compare that to the safety of flying. For every person who is scared of driving (or of being a passenger) there are probably 100 or 1000 people who are afflicted with a much greater fear of flying. But the comparative fatality rate for commercial passenger jets, per 100 million passenger miles in 2017 was zero, and the total fatality count was also zero (compared to about 40,000 on the roads). Longer term, accounting for years when there are plane crashes with fatalities, the risk still remains as nearly zero, and too small a number to be very statistically significant. Ask yourself – how many people do you know of who died or were injured in car crashes? Now, how many do you know who died or were injured in plane crashes?
Perhaps it is simplest just to say that air travel is clearly more than ten times as safe as road travel, and probably substantially more than 100 times as safe.
The Safety Implications of Self-Driving Technologies
Almost all vehicular accidents can trace their cause back to a poor decision or action on the part of the driver. Other factors may sometimes be present as well, but almost without exception, accidents evolve through a process, and at some point in that process a bad decision on the part of the driver causes the accident to occur rather than be avoided. If we can improve that decision-making process, we can reduce accidents.
Society seems to have reached a balancing point where it tolerates the annual road carnage as an acceptable price to pay in return for everyone having convenient access to driving licenses and the right to drive, even though requiring higher standards of driving competency would clearly lower the rate of accidents and deaths.
To illustrate that point, it is interesting to compare the chances of dying in a motor vehicle accident in ‘dangerous’ countries and in ‘safe’ countries. A commonly cited and several-times-updated study compares accident rates to population numbers. This is an imperfect measure, because risk is more a function of miles driven; but with a 20-fold difference between safe and dangerous countries, there are clearly significant differences in accident rates that point to human-factor variations and the differing degrees of anarchy or compliance with road traffic laws in different countries. Countries in Scandinavia and Western Europe are rated as very safe, and third-world countries as very dangerous, according to this article.
Vehicular accidents can be categorized several different ways, and that can help us guess at the degrees of improvement we might experience with self-driving vehicles. As stated above, most accidents are caused by an inadequacy on the part of the driver. Call it an error, a mistake, inattention, impairment due to drugs or drinking, driving too fast, or whatever you like, but still something directly attributable to the driver and something which a better driver could have avoided.
Some of these are classic ‘single vehicle accidents’ in which the car simply leaves the road, or collides with a stationary object. It seems that slightly more than 50% of all accidents are single-vehicle accidents, and most of those are clearly driver error.
Impaired driving seems to be a factor in 61% of driver fatalities (same source) and that’s obviously another way of saying driver error. Speed is considered the main reason in 27% of fatal crashes and that also involves a driver error in judging the safe speed to proceed at.
Note that it is possible for one fatality to be simultaneously due to speed, impaired driving, and a single vehicle accident, so these three numbers sum to more than 100%.
We can fairly say that in many, perhaps most, of the single-vehicle accidents, self-driving cars might reduce the risk. Similarly, in almost every impaired driving fatality, a self-driving car would eliminate that as a risk factor. And self-driving cars are unlikely to drive at unsafe speeds.
Other accidents point to a second category – accidents caused by another driver. You are driving perfectly in your car, but the actions of someone else create a situation where a reasonably skilled driver is unable to respond and prevent an accident from occurring. A self-driving car generally has better situational awareness, all around it, and so might detect an accident about to happen sooner than a human driver, and might then have the computer-speed reflexes to defensively maneuver and avoid or reduce the severity of the collision.
And, of course, if the other vehicle was a self-driving vehicle too, then the chances of it being driven unsafely would be reduced, and short-range vehicle-to-vehicle communication would help coordinate evasive action (sort of like “I’ll swerve to the right and slow down; you swerve to the left and accelerate to avoid me”).
A third category of accident would be true accidental events. This might be the failure of something in the car – the brakes fail, the steering wheel falls off, the accelerator jams, or whatever. A large animal runs onto the road. A boulder from high up on a cliff falls down and lands on a car roof, or in the road directly ahead. Self-driving cars would probably be no worse than human drivers at responding to such emergencies, and probably would be better. It takes an alert, attentive driver a minimum of 0.75 seconds to react to an unexpected event, and at 50 mph (about 73 ft per second) that means they’ve traveled some 55 ft before even deciding to take their foot off the gas pedal and shift to the brake pedal. It seems realistic to expect a computer to detect, analyse, decide how to respond, and actually do so in perhaps one tenth the time. That saves roughly 50 ft of travel at 50 mph, which many times will give a situation the edge it needs to resolve to a more minor outcome without fatalities.
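The reaction-distance arithmetic can be sketched in a few lines; the 0.75-second human reaction time is the figure quoted above, and the computer reacting in one tenth of that time is the same assumption, not a measured value:

```python
# Distance covered during reaction time, before any braking even begins.
# Assumptions: 0.75 s human reaction time; a computer reacting in one
# tenth of that (0.075 s) - an assumed figure, not a measurement.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def reaction_distance_ft(speed_mph, reaction_s):
    """Feet traveled between the unexpected event and the start of braking."""
    feet_per_second = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR
    return feet_per_second * reaction_s

human = reaction_distance_ft(50, 0.75)      # ~55 ft at 50 mph
computer = reaction_distance_ft(50, 0.075)  # ~5.5 ft
print(f"human: {human:.1f} ft, computer: {computer:.1f} ft, "
      f"distance saved: {human - computer:.1f} ft")
```

The roughly 50 ft saved is the margin that can turn a fatal impact into a near miss or a glancing blow.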
Most experts believe that self-driving cars will be substantially safer than cars we drive ourselves. There is no agreement on an exact number for how much safer, but clearly, just looking at the 61% of fatalities from impaired driving alone (and assuming that self-driving cars don’t create new risks) the improvement could be profound.
We’ve seen predictions of between 10 and 100 times improvements, and these of course depend on what the ‘base case’ that is being compared to might be, and also the degree of technological advancement assumed in the self driving vehicle. This sort of leads to our final point.
How Much Safer Should Self-Driving Cars Be?
Actually, this heading embodies an assumption right from the start. Why should a self-driving car be required to be any safer than a regular car? There are massive benefits already flowing through to society by freeing us from the chains of being unproductively stuck behind the wheels of our cars – wouldn’t simply maintaining the status quo be sufficient, in view of all the other benefits to self-driving cars?
Perhaps though, recognizing that there is a wide range of driving competencies and risks, it would be fair to require self-driving cars to be as safe as a skilled/experienced driver rather than as safe as an average or less skilled/experienced driver. We’re not quite certain of the range of accident rates between better and worse drivers, but would making self-driving cars twice as safe as average drivers be sufficient? That could see 20,000 fewer deaths each year, and 2.3 million fewer injuries. Surely that is an enormous improvement that we should be urgently rushing to implement.
Maybe you hold out for a higher level. How about four times safer than average? That is probably better than 95% of drivers currently on the roads, and would save 30,000 lives a year. With a four-fold reduction in accidents, we could also hope for a similar reduction in insurance premiums.
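These ‘N times safer’ figures fall straight out of the 40,000 annual deaths and 4.6 million annual injuries cited in this article; a quick sketch (the multipliers are illustrative):

```python
# Annual US casualties avoided if self-driving cars were N times safer
# than today's average driver (baseline figures as cited in this article).
ANNUAL_DEATHS = 40_000
ANNUAL_INJURIES = 4_600_000

def avoided(baseline, safety_multiplier):
    """Casualties avoided when the rate falls to baseline / multiplier."""
    return baseline - baseline / safety_multiplier

for n in (2, 4, 10):
    print(f"{n}x safer: {avoided(ANNUAL_DEATHS, n):,.0f} fewer deaths, "
          f"{avoided(ANNUAL_INJURIES, n):,.0f} fewer injuries")
```

At 2× the output matches the 20,000 deaths and 2.3 million injuries quoted above, and at 4× the 30,000 figure; note the diminishing returns, since going from 4× to 10× adds far fewer saved lives than going from 1× to 2×.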
Think also about the social improvements. Chances are that you know someone who has been affected (negatively) by an auto-accident, either directly or to a close family member. We know people who now have semi-permanent back/neck injuries, and of course, people who have been killed. The impacts of such things flow through our social structures. Eliminating or greatly reducing them would make many seemingly unrelated parts of our lives much better.
At a more trivial but still beneficial level, self-driving cars would brilliantly help with ‘fender bender’ type incidents that never even appear in formal accident statistics. Chances are you’ve been in many more fender benders that didn’t involve police and official accident reporting, and maybe sometimes didn’t even involve an insurance claim, but which did involve recriminations, regret, and hassles.
Self-driving cars can also help with congestion. Because they can react more quickly, they can safely follow more closely, and won’t irrationally slow down to look at ‘interesting things’, making freeways capable of handling more cars an hour. Fewer accidents also means fewer freeway blockages, making commuting and travel in general more predictable.
It is probably possible to make self-driving cars even safer still, but can we afford to wait for self-driving cars to be five or six times safer than average drivers before allowing their deployment? Shouldn’t we be ultra-urgently scrambling to release systems that are two, three, or four times safer than current average drivers, and then phase in further improvements as technology allows?
The US government has just unveiled a plan to spend $100 million on developing self-driving car technology. But this is a paltry sum, a drop in the bucket compared to the billions currently being spent by the private sector, and all around the world. To measure that by another standard, it is estimated that each and every vehicular fatality represents an overall cost and loss of income of several million dollars. So the $100 million represents the savings from perhaps only a couple of dozen averted fatalities. Shouldn’t we be adding another zero or two or three to this sum, to accelerate the R&D, testing, and implementation of new technologies?
We can’t afford to wait until self-driving cars are ten or one hundred times safer. We need to deploy this technology sequentially, and urgently. The lives that will undoubtedly be saved may reach directly into your family and friends.
This specific accident seems clearly to be the result of the woman’s puzzling decision to walk out in front of the vehicle. Whether the vehicle was driving itself or had a normal driver, it probably would have happened either way. While better self-driving technologies might protect against such things in the future, the current degree of self-driving capability does not seem to have contributed to this accident at all, and so provides no reason to pause or restrict ongoing self-driving development.
We need to understand and accept that no matter how good they already are, and how much better they become, self-driving cars will at best only reduce rather than eliminate accidents.
But the promise of reducing the 40,000 road deaths and the 4.6 million reported injuries every year (these numbers apply to the US only) by any amount is so extraordinarily beneficial that we need to see the accidents that will happen in the context of all the other accidents that didn’t happen.
Self-driving cars will revolutionise our society in many beneficial ways. We need to press forward with their development and urgent implementation.