Yeah, just to reiterate, simple division says the Uber cars are 25 times as likely to kill as human drivers.
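For anyone curious where a multiplier like that comes from, here's the simple division spelled out. The figures below are my own ballpark assumptions, not from the post (Uber had driven very roughly 3 million autonomous miles by early 2018; US human drivers average on the order of 1.16 deaths per 100 million vehicle miles), so treat the exact multiplier loosely:

```python
# Rough reconstruction of the "simple division" above.  The mileage and
# fatality-rate figures are ballpark assumptions, not from the post.
uber_av_miles = 3e6       # assumed: roughly what Uber had driven by early 2018
uber_fatalities = 1

human_rate = 1.16 / 1e8   # assumed: ~1.16 deaths per 100M miles (NHTSA-style figure)
uber_rate = uber_fatalities / uber_av_miles

print(f"Uber: {uber_rate * 1e8:.0f} deaths per 100M miles")
print(f"Ratio vs. human drivers: {uber_rate / human_rate:.0f}x")
```

With those assumptions you get a ratio around 29x; the 25x above presumably came from slightly different mileage figures. Either way, the sample size is one fatality, so the point is the order of magnitude, not the exact number.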
Turns out Uber’s cars are really crappy.
Cool, btw, that Arizona allows testing without restrictions, because autonomous vehicles are safer and the future. So they had no way to tell Uber’s cars were not working well. Great move, Arizona! Very forward looking!
Who knew that they were considered behind Waymo's technology? I guess that's just a bias on my part, since I see Uber's cars all the time and thus assume they are doing very well.
A few nuggets from the NYT article (which you should read):
Uber has been testing its self-driving cars in a regulatory vacuum in Arizona. There are few federal rules governing the testing of autonomous cars. Unlike California, where Uber had been testing since spring of 2017, Arizona state officials had taken a hands-off approach to autonomous vehicles and did not require companies to disclose how their cars were performing.
Uber’s goals in Arizona were mentioned in internal documents — Arizona does not have reporting requirements
Waymo[‘s …] cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per “intervention” in Arizona, […]
Cruise [GM] reported to California regulators that it went more than 1,200 miles per intervention
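To get a rough sense of the gap those quotes imply, here's a back-of-the-envelope comparison using the figures as quoted. Big caveat (which comes up later in the thread): each company may count "interventions" differently, so this is not an apples-to-apples comparison, and Uber's 13 is a target they were reportedly struggling to meet, so their actual number was worse.

```python
# Miles-per-intervention figures as quoted above.  Caveat: the companies
# may define an "intervention" differently, so this isn't apples-to-apples.
miles_per_intervention = {
    "Waymo": 5600,   # "nearly 5,600 miles" per takeover
    "Cruise": 1200,  # "more than 1,200 miles" per intervention
    "Uber": 13,      # Uber's own *target* in Arizona, not its actual figure
}

baseline = miles_per_intervention["Uber"]
for company, miles in miles_per_intervention.items():
    print(f"{company}: {miles} mi/intervention, "
          f"{miles / baseline:.0f}x Uber's target")
```

Even if the definitions only line up loosely, a two-orders-of-magnitude gap is hard to explain away entirely.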
I wonder what the Pittsburgh stats are like. Maybe we get to find out before testing resumes? To speculate, Uber only arrived in Arizona a year ago. The others have been in the West for a lot longer. The technology incorporates “deep neural net” (DNN) models, which require significant amounts of data for training. For all such approaches training data is key. I expect that this incident shows that data really matters: Arizona is very different from Pittsburgh and the Pittsburgh models don’t seem to generalize to Arizona all that well. If that’s the case then this whole autonomous car thing is going to be a lot harder to get working than it appeared to date.
That article scared me re Uber driving around Pittsburgh. I thought they had more experience and a better track record. Seems their goal is to rush a fully autonomous Uber into routine pickups and dropoffs to win, maybe, through first-mover effect… and save $$$ by getting rid of drivers.
The other autonomous car companies seem to be backed by Google or the car industry, and they seem to have a different goal in mind: safely delivering people from point to point.
I’d be inclined to believe that the Uber corporate culture is still pretty much what Kalanick forced it to be. But I’m also willing to believe that once you get down the chain far enough to encounter the auto-car people things are a bit more civilized. Not everyone strives to be a dick.
I expect things are about the same at the other companies, though the “it has to absolutely work by x deadline” attitude is simply a recipe for disaster. I’ve been there. Maybe the other companies just have more realistic expectations, or more money (assuming there’s a difference).
The NYT article is devastating. I don’t see how Uber’s autonomous vehicle project recovers. A new CEO, who wants to clean up and put his stamp on the company, will almost certainly kill the project, which has been so badly managed that it killed somebody.
BikePGH’s Eric Boerer spoke in this brief Marketplace radio story about autonomous cars, on 3/22:
I’ve been wondering lately how the pressure to increase their number of miles without intervention affects safety… this metric just encourages a riskier culture rather than a precautionary one.
In an earlier comment, https://www.bikepgh.org/message-board/topic/car-less-drivers/page/3/#post-352075, I said “if the car was using LIDAR, the LIDAR wasn’t working well”.
In this new article, http://www.theregister.co.uk/2018/03/27/uber_crash_safety_systems_disabled_aptiv/, we learn “Radar-maker says Uber disabled safety systems”. (I assume they mean LIDAR, not Radar).
I would like to see the original dashcam video without the text at the bottom of the screen blurred out. The text might say in bold letters, for example, “LIDAR OFF”.
The cars may be using radar as well as LIDAR. Certainly we experimented with both systems, back in the day.
(I assume they mean LIDAR, not Radar).
No, if you click through to the original article being quoted in the one you linked to, you’ll see they mean radar.
Volvos come standard with a collision-avoidance system that uses radar and cameras. That’s what Uber disabled. Presumably Uber didn’t want their own system getting overruled by the car’s built-in system during testing. If one safety system says to avoid the oncoming truck by veering left and the other one says to veer right, alternating between the two would be bad.
In this case, it appears the car’s own safety system might have done a better job than Uber’s, so perhaps their system should defer to the car’s built-in system until they can get it working better.
It’s not just the robots:
Uber Driver Blames GPS for Impromptu Stairs Incident
Listen to the 1A radio show (WESA 90.5 FM at 10am-12) on Thu 3/29 for discussion of the need to regulate self-driving cars. https://the1a.org/shows/2018-03-29/is-it-time-to-tap-the-brakes-on-self-driving-cars
That NYT article is loose with the facts and sensationalized in order to capitalize on Uber's current notoriety.
For example, when citing the difference between automated driving companies, they make it sound like Uber is way behind in the miles between interventions metric. But when investigated further, those numbers don’t represent the same thing. The CEO of Waymo even felt the need to publicly state that the NYT article used that stat in a misleading way. The Waymo stat only includes disengagements which were found in later analyses to have prevented an accident. Both are useful metrics but the article used them in a misleading, false comparison.
This isn’t meant as support of uber or an attempt to disregard safety concerns. Instead, just countering what resembles mob mentality among the press and public. The safety of autonomous driving is an important topic. However, reasoned analysis has been largely replaced by a litmus test of whether you approve of Uber’s corporate culture, etc. Hopefully we get past that and are able to deal with both topics separately.
Have all the companies released numbers measured in the same way, numbers that the Times should have used instead? (I’d have checked the Waymo CEO’s statement, but I couldn’t find it. Do you have a link for it?)
I think that’s a key thing government regulators should be doing: requiring all the players to measure their accomplishments in the same way.
Of course, even if the Times used some bad data to show Uber was way behind, that doesn’t mean they aren’t in fact way behind. It just means maybe we don’t know yet, absent better data.
I agree that there should be a standardized and mandatory reporting methodology for safety metrics. Currently there isn’t at the national level. Some states require reporting but the definitions are probably too ambiguous for self-reported data. These will hopefully evolve and be fleshed out such that we can perform meaningful analysis.
For example, urban driving is far more difficult and naturally has a lower miles between disengagement figure. Safety numbers should probably be reported in a minimum of two categories, controlled access roads and non-controlled access roads.
Another difficulty in comparisons is that the different brands are operating in different areas. Weather, upkeep of lane markings, average speed, lane width, sight-line obstructions and turn radius at intersections differ drastically between those regions. That's not an example of the stats being gamed, but it does demonstrate the difficulty of attempting to impose safety standards.
The NYT article presents the Uber and Waymo paragraphs back to back, suggesting that the stats are comparable. But the Waymo stats are for safety-related disengagements, situations when the driver takes over to prevent an accident. They don’t include situations where the vehicle gets confused by something like construction and needs the safety driver to take over even though there’s no immediate danger.
I can’t find the CEO quotes but here’s what Waymo had to say:
“This report covers disengagements following the California DMV definition, which means “a deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle.”
As part of testing, our cars switch in and out of autonomous mode many times a day. These disengagements number in the many thousands on an annual basis though the vast majority are considered routine and not related to safety. Safety is our highest priority and Waymo test drivers are trained to take manual control in a multitude of situations, not only when safe operation "requires" that they do so. Our drivers err on the side of caution and take manual control if they have any doubt about the safety of continuing in autonomous mode (for example, due to the behavior of the SOC or any other vehicle, pedestrian, or cyclist nearby), or in situations where other concerns may warrant manual control, such as improving ride comfort or smoothing traffic flow."
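That definitional difference is exactly why the headline metric moves so much depending on what you count. Here's a toy illustration with a completely made-up disengagement log (the events and mileages are invented, not anyone's real data):

```python
# Hypothetical disengagement log showing how the miles-per-disengagement
# metric depends on which disengagements you count.  All numbers invented.
disengagements = [
    {"miles_since_last": 40,  "safety_related": False},  # routine mode switch
    {"miles_since_last": 900, "safety_related": True},   # driver avoided a hazard
    {"miles_since_last": 15,  "safety_related": False},  # confused by construction
    {"miles_since_last": 700, "safety_related": True},
]

total_miles = sum(d["miles_since_last"] for d in disengagements)

def miles_per(events):
    return total_miles / len(events) if events else float("inf")

all_events = disengagements
safety_only = [d for d in disengagements if d["safety_related"]]

print(f"All disengagements:  {miles_per(all_events):.0f} mi each")
print(f"Safety-related only: {miles_per(safety_only):.0f} mi each")
```

Same car, same miles, and the safety-only figure comes out twice as good. So comparing Waymo's filtered number against Uber's raw one inflates the gap, even if a real gap exists.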
There is 29 minutes of audio from the 1A radio discussion of autonomous cars. I didn't hear anything particularly new. An amusing note: one call-in comment, from Pat in Pittsburgh: "I have every day interactions with the robot cars that are crawling all over the city in our neighborhood. Every time I can I flip off the LIDAR just to see if it gets in the algorithm". https://the1a.org/shows/2018-03-29/is-it-time-to-tap-the-brakes-on-self-driving-cars
I posted this in its own thread, but should mention here that Bike Pittsburgh was referenced in a national (Citilab) story on the pedestrian fatality in Arizona:
I was just listening (12:45 p.m.) to WESA and a story came on about the pedestrian who was killed by an Uber vehicle in self-driving mode in Arizona. About halfway through the story, they mentioned that Bike Pittsburgh had conducted a survey asking whether cyclists felt safe sharing the roads with self-driving cars. They mentioned the results, and made a comment or two about the findings. Nice national mention there, Bike Pittsburgh!
Erika Beras may have a new employer but is still based in Pittsburgh. IIRC she gets on two wheels from time to time, too. I don’t know if she follows the message board that closely, but it’s reasonable to assume that anything that makes biking in Pgh more pleasurable may have the benefit of being reported on at the national level.
Bike Snob NYC wrote about autonomous cars:
He says “as a cyclist I’ve long been leery of this “I, for one, welcome our new self-driving overlords” attitude. It’s not that I’m a technophobe, it’s just that I’m an automobophobe. At no point during my own lifetime or indeed the entire century-and-a-quarter of automotive history have cars or the companies that make them given us any reason to trust them.”
“… various companies are developing “bicycle-to-vehicle communications.” … Helmet laws will seem positively quaint once you’re legally required to use a GPS suppository.”
Uber acquires bike-share startup JUMP
Source says final price close to $200 million
Meanwhile, becoming a top urban mobility platform is part of Uber’s ultimate vision, Khosrowshahi told TechCrunch over the phone. As more people live in cities, there will need to be a broader array of mobility options that work for both customers and cities, he said.
“We see the Uber app as moving from just being about car sharing and car hailing to really helping the consumer get from A to B in the most affordable, most dependable, most convenient way,” Khosrowshahi said. “And we think e-bikes are just a spectacularly great product.”
The Uber detected the pedestrian, but ignored it.
This is how Big Oil will die
Uber shutting down AZ autonomous testing
…and Peduto is unhappy about their plans to start again here.
throwing it down
You never responded to our requirements. You never informed us of today’s announcement. You never followed up on my requirements after fatality in Arizona. Your PA lobbyist has ignored everything & instead has reached out to other electeds to cover your mistakes. Time to change! https://t.co/dIAtob9Z8O
— bill peduto (@billpeduto) May 23, 2018
…shutting down in AZ since someone was killed…but starting back up in Pgh et alia…until someone is killed in those places???
…driverless uber alles…
I guess the long and the short of it is that PA regulates this, not the city of Pgh. So Peduto can be angry and make demands, but Uber can do whatever it wants. Though I thought that this was the “new” Uber that was attempting to play nice…
From that Ars Technica article:
“According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.”
WOW! That’s not artificial intelligence, that’s real recklessness and stupidity on Uber’s part! Peduto’s demand that Uber limit autonomous testing to 25 mph or less, for now, is a good idea.
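To spell out how strange that design is, the quoted behavior boils down to something like the following. This is a hypothetical sketch of the logic as described in the quote, not Uber's actual code:

```python
# Sketch of the emergency-braking behavior described in the quote above.
# Function and return values are hypothetical, for illustration only.
def on_imminent_collision(under_computer_control: bool) -> str:
    if under_computer_control:
        # Per the quote: emergency braking is not enabled "to reduce the
        # potential for erratic vehicle behavior", and the system does
        # not even alert the human operator that it saw a hazard.
        return "rely on human operator (no alert issued)"
    return "brake"
```

The hazard detection worked; the response path was a human who had no way of knowing the system had detected anything.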
The Post-Gazette’s story is similar, but with a few more details:
Arizona Uber crash driver was ‘watching TV’
Why you have (probably) already bought your last car
Autonomous cars don’t have to be a panacea:
Number of cars on the road rises because cheaper, easier transportation enables people to live farther from work.
Roads become more dangerous because autonomous vehicle software & hardware is poorly regulated. Reckless bootleg AV systems abound. Grand Theft Auto jumps to real life.
Waymo underreported crashes (unfortunately the source uses the A word)
Why do self driving cars get into a lot of rear-end crashes?
Self-driving cars basically do things that human drivers don’t, namely follow the rules of the road. They don’t speed, they slow down and stop for yellow lights, they yield to pedestrians, etc.
The number and type of accidents these things get into is really good evidence, IMO, regarding how poorly drivers follow the law and how the expectation is actually to break the law. In short, it’s not just cyclists.