For once, Uber takes it up the tailpipe: Robo-ride gets rear-ended

First California accident, caused by inconsiderate human

The Uber self-driving program has had its first accident in California since regaining permission to experiment on the roads – and for a change, it wasn't Uber's fault.

According to an accident report [PDF] filed with the California Department of Motor Vehicles (DMV), the Uber car – a Volvo XC90 – was waiting to turn right off a busy road in San Francisco's Inner Richmond district when it was hit from behind.

The Uber vehicle had stopped on Geary Boulevard to turn onto 3rd Avenue and was waiting for pedestrians to cross – a common occurrence in bustling San Francisco. The Uber operator had even disengaged the self-driving system (presumably to avoid any risk of running down people) when a Toyota Tacoma rear-ended it.

It doesn't sound like a big collision: Uber reported only "minor damage" to the Volvo, though the jolt was enough to cause "wrist discomfort" to the co-pilot. Neither party called the police, and no injuries beyond that were reported. The crash happened on August 16 at 2:55pm and was reported to the DMV on Friday.

However, Uber has to be careful given its testing status with the DMV and the fact that people remain understandably cautious about huge chunks of metal being moved around at great speed by computers.

Uber is on thin ice with the California authorities, despite the state leading the way with self-driving technology. Back in December, the DMV revoked the registrations of 16 Uber self-driving cars after the company decided it didn't need to apply for a permit to let its computer drive around San Francisco. Turns out it did.

Not only that, but the company's cars were failing to recognize bike lanes and respond to them legally, presumably because the software engineers hadn't thought about them.

Freaking out

We also now know, from texts sent between former Uber CEO Travis Kalanick and its former head of self-driving tech Anthony Levandowski, that the city of Palo Alto in Silicon Valley was nervous enough about the self-driving truck tests being carried out by Levandowski's Ottomotto – prior to the company being bought by Uber – to ask the DMV to investigate.

"Just wrapped with the DMV, it was the city of Palo Alto freaking out about AV trucks testing and were asked to investigate. The guys were happy with our answers and we're in the clear," Levandowski texted.

Pictured: the intersection in San Francisco where the accident happened earlier this month

Despite claims that self-driving technology will ultimately save thousands of lives (because a lot of humans are, frankly, terrible drivers), and despite some lawmakers going to bat for the very large tech companies racing to perfect the technology, the fact remains that people are very skeptical about the idea.

And for good reason. When Uber was booted out of California, it moved its testing to Arizona. Soon after, one of its cars was T-boned at an intersection and ended up on its side.

Uber claimed it wasn't at fault – its car was driving under the speed limit and had done nothing wrong. Except an investigation into the crash revealed that the Uber Volvo SUV went through a yellow light at 38mph (in a 40mph zone) and was hit, as it crossed the intersection, by an oncoming car turning left.

That was legal, but one witness said the Uber car was at fault for "trying to beat the light and hitting the gas so hard." Another said the car had come "flying through the intersection."

Caution

And that is the key distinction: where a human driver would typically exercise caution, even if they were legally in the right, a computer follows the rules it has been programmed with.
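To make that distinction concrete, here is a minimal, purely illustrative Python sketch contrasting a policy that checks only legality with one that adds a human-style caution margin. Every name, threshold, and number below is invented for illustration – it bears no relation to Uber's actual software.

```python
# Illustrative only: a toy model of "legal" vs "cautious" driving policies.
# All names and thresholds here are hypothetical; real AV stacks are vastly more complex.

from dataclasses import dataclass

@dataclass
class Approach:
    speed_mph: float              # current speed of the car
    speed_limit_mph: float        # posted limit
    light: str                    # "green", "yellow", or "red"
    seconds_to_stop_line: float   # time until the car reaches the intersection


def legal_policy(a: Approach) -> str:
    """Proceed whenever the rules of the road technically allow it."""
    if a.light == "red":
        return "stop"
    if a.speed_mph <= a.speed_limit_mph:
        return "proceed"  # under the limit, light not red: legally fine
    return "slow"


def cautious_policy(a: Approach) -> str:
    """Add a human-style margin: treat a stale yellow as a warning, not an invitation."""
    if a.light == "red":
        return "stop"
    if a.light == "yellow" and a.seconds_to_stop_line > 1.0:
        # A yellow we can comfortably stop for: brake rather than try to beat the light.
        return "stop"
    return legal_policy(a)


# The Arizona-style scenario: 38mph in a 40mph zone, yellow light ahead.
scenario = Approach(speed_mph=38, speed_limit_mph=40,
                    light="yellow", seconds_to_stop_line=2.0)
print(legal_policy(scenario))     # "proceed" – legal, and how you get T-boned
print(cautious_policy(scenario))  # "stop"    – what many human drivers would do
```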

The hope is that with thousands of hours of testing, the self-driving programs will build up sufficient knowledge of different scenarios to drive more safely than human drivers.

The big question is: where do we set the cut-off? When a self-driving car is better than the average human driver? Or do we need a higher standard for computer-driven cars? And how do we measure that?
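One naive way to frame that measurement question is as a crash-rate comparison per million miles driven. The sketch below – with entirely made-up figures and a crude Poisson approximation, both our own assumptions – shows why small test fleets make the comparison statistically slippery.

```python
# Toy comparison of crash rates per million miles; all figures are invented.
import math

def crash_rate_ci(crashes: int, miles: float, z: float = 1.96):
    """Approximate 95% confidence interval for crashes per million miles,
    treating crash counts as Poisson-distributed."""
    rate = crashes / miles * 1_000_000
    se = math.sqrt(crashes) / miles * 1_000_000 if crashes else 0.0
    return rate, max(rate - z * se, 0.0), rate + z * se

# Hypothetical numbers: a small AV test fleet vs. a rough human baseline.
av = crash_rate_ci(crashes=4, miles=1_000_000)
human_rate = 4.2  # illustrative human crashes per million miles

print(f"AV: {av[0]:.1f} per million miles (95% CI {av[1]:.1f} to {av[2]:.1f})")
print(f"Human baseline: {human_rate} per million miles")
# With so few miles, the interval easily straddles the human baseline:
# the data can't yet say whether the computer is better or worse.
```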

Of course, while Uber is playing things safe and getting hit by other cars, Tesla is somewhat arrogantly insisting that the death of one driver, after its systems failed, wasn't its fault because the human driver didn't notice the truck either.

And then there was the curious case of the crash where the driver decided Autopilot wasn't actually on, despite having previously told the authorities it was. Nothing to do, surely, with the phone call he'd had with Tesla in between. ®
