Waymo robo autos way mo' primo at avoiding-o wreck-os (yeah, yeah, we ran out of rhymes)

Humans in self-driving cars hit panic button far less in 2016

Self-driving cars may some day drive well enough to be deployed without human oversight, legislation and society permitting.

Waymo – Google's self-driving car project until it was spun out as a separate company last December – on Wednesday reported how its cars performed during California road tests in 2016. The results show the technology is improving.

The company's 2016 Disengagement Report, filed with the California Department of Motor Vehicles, documents incidents when the vehicle test driver had to disengage auto-driving systems either to avoid an anticipated safety risk or in response to a technology failure.

Disengagements were recorded for events like software discrepancies, perception discrepancies, unwanted maneuvering, reckless behavior of others on the road, the approach of emergency vehicles, incorrect prediction of the behavior of other traffic, construction zones, and roadway debris.

In 2016, Waymo vehicles (under the Google flag during the year) drove 635,868 miles, up from 424,331 in 2015, and had 124 reportable disengagements, down from 341 in 2015.

Overall, the rate of disengagements per 1,000 miles dropped from 0.80 to 0.20, a reduction of 75 per cent.

Disengagements specifically related to safety declined from 0.16 per 1,000 miles to 0.13, a 19 per cent reduction.

According to the National Highway Traffic Safety Administration, there were 78 injuries per 100 million vehicle miles of travel in 2015. At Waymo's current rate, that much travel would produce about 13,000 safety-related disengagements, though it's not clear what fraction of those incidents would have ended in injury without human intervention.
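The arithmetic behind that comparison is simple enough to check yourself. The short Python sketch below just rescales the rates quoted above; it's our back-of-envelope working, not anything Waymo publishes.

```python
# Back-of-envelope check of the figures above: Waymo's safety-related
# disengagement rate rescaled to 100 million miles, alongside NHTSA's
# 2015 injury rate. All inputs are the numbers quoted in this article.

SAFETY_RATE_2015 = 0.16   # safety-related disengagements per 1,000 miles
SAFETY_RATE_2016 = 0.13
NHTSA_INJURIES_PER_100M_MILES = 78

reduction = 1 - SAFETY_RATE_2016 / SAFETY_RATE_2015
print(f"Safety-related disengagement rate down {reduction:.0%}")    # ~19 per cent

# Scale the 2016 rate up to 100 million vehicle miles of travel.
disengages_per_100m_miles = SAFETY_RATE_2016 * 100_000_000 / 1_000
print(f"{disengages_per_100m_miles:,.0f} safety disengagements per 100M miles")  # 13,000
print(f"{NHTSA_INJURIES_PER_100M_MILES} injuries per 100M miles (NHTSA, 2015)")
```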

Dmitri Dolgov, head of Waymo's self-driving technology, attributes the company's progress in part to applying what it learns from disengagement events to driving simulations that lead to further refinements.

"We've been able to make dramatic improvements to our technology because we use each of these disengages to teach and refine our car (that's why we set our thresholds for disengages conservatively)," said Dolgov in a blog post. "For each event we can create hundreds – and sometimes thousands – of related scenarios in simulation, varying the parameters such as the position and speed of other road users in the area."

According to Dolgov, Waymo's simulated vehicles drove more than a billion miles last year.
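Dolgov's post doesn't describe the tooling behind those simulations, but the underlying idea – take one recorded disengagement and fuzz the positions and speeds of everyone else in the scene – can be sketched in a few lines of Python. Everything below (class names, parameter ranges) is our illustration of that idea, not Waymo's code.

```python
# Illustrative only: spawn many simulated variants of one recorded
# disengagement scene by perturbing other road users' positions and speeds.
# Names, fields and ranges here are hypothetical, not Waymo's.
import random
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class RoadUser:
    position_m: float   # distance from our car along the road, metres
    speed_mps: float    # speed, metres per second

@dataclass(frozen=True)
class Scenario:
    description: str
    others: tuple[RoadUser, ...]

def variants(base: Scenario, n: int, seed: int = 0):
    """Yield n randomised copies of a recorded scenario."""
    rng = random.Random(seed)
    for _ in range(n):
        perturbed = tuple(
            replace(u,
                    position_m=u.position_m + rng.uniform(-5.0, 5.0),
                    speed_mps=max(0.0, u.speed_mps + rng.uniform(-2.0, 2.0)))
            for u in base.others
        )
        yield replace(base, others=perturbed)

recorded = Scenario("cyclist overtaking parked van",
                    others=(RoadUser(position_m=20.0, speed_mps=5.0),))
for scenario in variants(recorded, 3):
    print(scenario)
```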

While autonomous vehicles may eventually offer safer travel than human-driven cars, safety alone will not make self-driving cars the preferred transportation option. The technology will have to work on legal, political, social and economic levels too. ®
