
Waymo’s new safety data is impressive and instructive

Waymo, the robotaxi division of Alphabet/Google, has released a new analysis of its crash data. The Waymo study claims that after 22 million miles of driving with no human at the wheel, mostly in Phoenix but also in San Francisco, its vehicles have far lower rates of three key types of crashes than human drivers covering the same number of miles in the same regions and on the same roads.

Specifically, Waymo reports 84% fewer crashes involving airbag deployment, 76% fewer crashes involving injuries, and 48% fewer crashes reported to police. The company also notes that just under half of the crashes it recorded involved a speed differential of less than 1 mph with the other vehicle (such crashes are unlikely to involve airbags, injuries, or police).
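To make those percentages concrete, here is a minimal sketch of the arithmetic, assuming hypothetical human benchmark rates (placeholders, not figures from the study): an 84% reduction means the robotaxi rate is 16% of the human rate over the same mileage.

```python
# Illustrative arithmetic only: turning percentage-reduction claims into
# expected crash counts. The human benchmark rates here are hypothetical
# placeholders, NOT figures from the Waymo study.

MILLIONS_OF_MILES = 22  # rider-only miles reported by Waymo

# Hypothetical human crash rates, per million miles (assumed for illustration)
human_rate = {"airbag deployment": 1.0, "injury": 2.0, "police-reported": 4.0}

# Reductions reported by Waymo
reduction = {"airbag deployment": 0.84, "injury": 0.76, "police-reported": 0.48}

for kind in human_rate:
    expected_human = human_rate[kind] * MILLIONS_OF_MILES
    expected_waymo = expected_human * (1 - reduction[kind])
    print(f"{kind}: ~{expected_human:.0f} human-benchmark crashes vs "
          f"~{expected_waymo:.0f} Waymo crashes over 22M miles "
          f"({reduction[kind]:.0%} reduction)")
```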

They have published peer-reviewed research on this. It's not perfect, but it's a step forward from what we've seen in the past. There's a great temptation for every team in the self-driving space to present its safety statistics in the most positive light possible. Tesla has gone so far in this regard that it has published extremely misleading numbers. The gold standard would be an independent third-party review, but peer review is a step in the right direction.

It's about risk

Waymo's reporting throws down a challenge to all other operators, and I hope they embrace it. More to the point, if they don't embrace it and publish their own numbers with similar candor, it will create a strong impression that they have a reason not to publish: that their numbers are not as good, and potentially quite bad.

Regulators and the public need to understand these numbers, because to judge whether robocars are beneficial or dangerous, it is a mistake to look at individual incidents. Every incident (at least one where the self-driving system was at fault) represents a problem that should not be occurring, and this has led many to declare after a single incident that “these vehicles are not ready” or that they should be banned from the roads. No single incident, even a tragic one, can tell us that.

What we want to understand is risk. Every type of driving carries risk, and some types carry more than others. When we drive too fast, we take on risk (to others as well as to ourselves). When people drink and drive, they take on a great deal of risk (although most drunk drivers do make it home). Every new teen driver takes more risk on every trip than a typical mature driver. We accept this risk in order to be mobile.

So while it would be wrong to send a vehicle onto the road with a known serious defect, it is acceptable to send out a vehicle that poses some risk, as long as that risk stays below a certain level. Just as Waymo knows that if its fleet drives 10 million miles, it is quite likely that someone will get hurt, so do FedEx and Uber when they send their drivers onto the road.
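That intuition is simple expectation math. Here is a minimal sketch, assuming a hypothetical injury-crash rate (not a published figure) and a Poisson model of independent incidents:

```python
import math

# Sketch of the risk logic above: even a low per-mile injury rate makes
# at least one incident likely over enough fleet miles. The rate is a
# hypothetical assumption, not a published figure.
rate_per_million = 0.3      # assumed injury crashes per million miles
fleet_miles_millions = 10   # the 10 million miles mentioned above

expected = rate_per_million * fleet_miles_millions
# Under a Poisson model, P(at least one incident) = 1 - exp(-expected)
p_at_least_one = 1 - math.exp(-expected)

print(f"expected incidents: {expected:.1f}")
print(f"P(at least one incident over 10M miles): {p_at_least_one:.0%}")
```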

Ordinary citizens can't help but focus on individual incidents. It is the job of regulators to look at the bigger picture and assess the risk.

Waymo's statistics cover all crashes involving its vehicles, regardless of fault. Determining fault can be difficult, especially for a biased party, but police do it thousands of times every day. Insurance companies do it too, so Waymo asked Swiss Re, a large reinsurance company, to do a risk analysis of its record. This effectively assessed fault, since fault is what generates insurance liability claims. Impressively, the report found that Waymo was not liable for any of the injury crashes, meaning it was not at fault in any of them.

Although most of Waymo's miles are driven on the relatively easy streets of Phoenix, 22 million miles with no injury liability is a very impressive result, especially when it is assessed by a third party. While regulators and lawmakers puzzle over how to regulate autonomous driving, the answer probably lies here: let experts like insurers calculate the risk, and let the vehicles drive so they can learn and improve, as long as that risk isn't higher than that of human drivers. In the future, their risk levels will be even better, and we don't want to delay that day.
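One generic way to read “zero at-fault injury crashes over 22 million miles” is the statistician's rule of three: when zero events are observed over N trials, the 95% upper confidence bound on the true rate is roughly 3/N. This back-of-the-envelope sketch is mine, not part of the Swiss Re analysis:

```python
# Rule of three: zero events over N miles gives a ~95% upper confidence
# bound of 3/N on the true per-mile rate. Generic statistics, not Swiss Re's
# actual methodology.
miles = 22_000_000
upper_bound_per_mile = 3 / miles
per_million_miles = upper_bound_per_mile * 1_000_000

print(f"95% upper bound: ~{per_million_miles:.2f} at-fault injury crashes "
      f"per million miles")
```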

While Cruise hasn't released data as good as Waymo's, it has released an analysis comparing itself to ride-hail (Uber/Lyft) drivers, and it shows positive results. It's worth a closer look. Cruise, of course, had the famous and tragic incident in which a pedestrian was struck by a human-driven car and thrown into the path of the Cruise vehicle, which hit her and then, after stopping, attempted to pull over, dragging her and aggravating her injuries. Thankfully, she recovered, although it easily could have been worse. The California DMV responded by quickly shutting down Cruise in California and revoking its permits, citing two major issues. The first was that Cruise wasn't safe enough (due to this incident and possibly others). Perhaps even bigger was that it briefly attempted to hide key details. As usual, the cover-up becomes greater than the original sin.

The DMV should explain its reasoning. Currently, companies must fear that a serious incident (even one caused by a third party) could result in their fleet being taken off the roads, potentially derailing their entire project. Cruise has now been sidelined for over a year. Companies need to know where the bar is: was the problem the incident itself, the cover-up, or both? Cruise has settled the cover-up for fines (as cover-ups go, it was one of the shortest), but it remains off the roads. The DMV has not responded to multiple requests for comment on this issue.

All teams around the world should match or exceed Waymo in releasing data that helps us determine the risk they create. Since companies operate for profit rather than the public interest, the key to regulating them is finding ways to make them put the public interest above their financial interest. Every company declares that safety comes first, but data like this is the proof of the pudding. Regulators have a duty to make the roads safer for all users, and robocars offer a way to do that, as long as the focus stays on statistics like these and, difficult as it is, not on the horror of individual incidents.
