Self-Driving Uber Kills Pedestrian
We don't yet have enough information to assign blame here. Naturally, that's not stopping anyone.
FT (“Self-driving cars under scrutiny after pedestrian death”):
Car accidents kill more than 1m people every year around the world. In the US, one person dies for about every 100m miles driven and on any given day, an average of 16 pedestrians die on American roads.
Proponents of self-driving cars point to these figures as a justification for their multibillion-dollar efforts to replace flawed and easily distracted human drivers with fully automated computer systems. Yet even the most confident of engineers would have been forced to admit that one day a robot car would probably end up killing a person.
That day arrived much sooner than the automotive and technology industries reckoned. On Sunday night, an Uber test vehicle, with a human behind the wheel but under the control of its autonomous systems, killed a pedestrian as she was crossing the road in Tempe, Arizona.
Few of the dozens of tech companies, carmakers and start-ups working on autonomous systems have commented publicly since the incident but many privately worry that the first pedestrian death caused by a self-driving car will undermine — at least in the court of public opinion — their efforts to build what they see as a safer alternative.
“Thanks Silicon Valley, you just set us back at least a decade,” fumed one entrepreneur working in the sector.
Uber has said its self-driving vehicles have together driven more than 3m autonomous miles to date. Since the start of its first passenger-carrying pilot programme in Pittsburgh in September 2016, Uber has expanded its North American fleet to more than 200 vehicles equipped with an extensive array of cameras, sensors, mapping equipment and navigation systems.
Others have gone further. Alphabet-owned Waymo recently surpassed 5m autonomous road miles, with millions more in computer simulations. Nonetheless, the industry’s accumulated real-world experience falls far short of 100m miles — a symbolic milestone many developers of self-driving cars were quietly hoping would have been reached before any fatalities.
“The fact that this has happened well in advance of 100m miles does not tell us anything statistically,” said Bryant Walker Smith, assistant professor at the University of South Carolina’s law school and a legal expert on autonomous vehicles. “But it is early, particularly in light of everything that these systems already have going for them.”
Unlike most regular cars, autonomous vehicles are well maintained and closely supervised by their operators, he said. “That should stack the deck in the favour of these systems.”
Tempe police say Uber’s Volvo was travelling at about 40 miles per hour and did not slow before hitting 49-year-old Elaine Herzberg as she stepped into the road, pushing a bicycle. Uber’s human driver told investigators that his “first alert to the collision was the sound of the collision”, Tempe police chief Sylvia Moir told the San Francisco Chronicle.
It will fall to the county attorney’s office in Maricopa, Arizona, to determine who was at fault and whether to press charges against Uber or its driver. But other agencies are also poring over the incident.
Two US federal safety regulators, the National Transportation Safety Board and the National Highway Traffic Safety Administration, have sent their own investigators to Tempe. California’s Department of Motor Vehicles, which oversees autonomous testing in Uber’s home state, is also seeking information from the company about what happened.
Legal experts say their lines of inquiry are likely to focus on whether a faulty sensor or other system failure contributed to the accident; whether the car “saw” the pedestrian and how that person behaved; whether the automated driving system should or could have handed control to the human behind the wheel; and what kind of evasive action it took.
Any fatality is tragic but we don’t yet have enough information to assign blame here. Naturally, that’s not stopping anyone.
NYT (“Self-Driving Uber Car Kills Pedestrian in Arizona, Where Robots Roam”):
Arizona officials saw opportunity when Uber and other companies began testing driverless cars a few years ago. Promising to keep oversight light, they invited the companies to test their robotic vehicles on the state’s roads.
Then on Sunday night, an autonomous car operated by Uber — and with an emergency backup driver behind the wheel — struck and killed a woman on a street in Tempe, Ariz. It was believed to be the first pedestrian death associated with self-driving technology. The company quickly suspended testing in Tempe as well as in Pittsburgh, San Francisco and Toronto.
The accident was a reminder that self-driving technology is still in the experimental stage, and governments are still trying to figure out how to regulate it.
Uber, Waymo and a long list of tech companies and automakers have begun to expand testing of their self-driving vehicles in cities around the country. The companies say the cars will be safer than regular cars simply because they take easily distracted humans out of the driving equation. But the technology is still only about a decade old, and just now starting to experience the unpredictable situations that drivers can face.
It was not yet clear if the crash in Arizona will lead other companies or state regulators to slow the rollout of self-driving vehicles on public roads.
Much of the testing of autonomous cars has taken place in a piecemeal regulatory environment. Some states, like Arizona, have taken a lenient approach to regulation. Arizona officials wanted to lure companies working on self-driving technology out of neighboring California, where regulators had been less receptive.
But regulators in California and elsewhere have become more accommodating lately. In April, California is expected to follow Arizona’s lead and allow companies to test cars without a person in the driver’s seat.
Federal policymakers have also considered a lighter touch. A Senate bill, if passed, would free autonomous-car makers from some existing safety standards and pre-empt states from creating their own vehicle safety laws. Similar legislation has been passed in the House. The Senate version has passed a committee vote but hasn’t reached a full floor vote.
“This tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers who share America’s roads,” said Senator Richard Blumenthal, Democrat of Connecticut.
NPR (“Arizona Governor Helped Make State ‘Wild West’ For Driverless Cars”):
Arizona Gov. Doug Ducey began a push three years ago to attract makers of self-driving cars to the state and actively wooed Uber away from California as a venue for testing those vehicles.
Shortly after his election in 2015, the governor signed an executive order supporting the testing and operation of self-driving vehicles that he said was about “innovation, economic growth, and most importantly, public safety.”
Now the “public safety” part of that order has been thrown into question and Arizona’s willingness to become a testing ground for emerging driverless vehicles has come into sharp focus after Sunday’s incident in which a self-driving Volvo SUV operated by Uber struck and killed a 49-year-old woman who was walking her bicycle in Tempe.
As we reported Monday, the car was in autonomous mode at the time of the crash, around 10 p.m., but had a human riding in the passenger seat to take control if necessary. The incident is now under investigation.
The year after Ducey’s executive order, Arizona seized on a chance to steal a bit of driverless thunder from Silicon Valley. California had just ordered Uber to stop testing its autonomous vehicles on the streets of San Francisco until it obtained the proper testing permits. Uber balked and Arizona stepped in.
“Arizona welcomes Uber self-driving cars with open arms and wide open roads,” Ducey said in a statement in December 2016.
“While California puts the brakes on innovation and change with more bureaucracy and more regulation, Arizona is paving the way for new technology and new businesses,” the Republican governor said.
For its part, Uber pledged to expand its self-driving pilot program in Arizona and said it was “excited to have the support of Governor Ducey.”
Arizona has also welcomed other companies working on driverless technology, leading The Arizona Republic to proclaim last year that “With major testing by Waymo, Uber, General Motors, Ford and Intel, Arizona is more than holding its own in the race to attract the self-driving car industry.”
Uber is, quite naturally, suspending testing until more information is known.
One fatality is unlikely to change many minds. Instead, it’ll likely reinforce pre-existing beliefs about the technology.
UPDATE: Eric Paul Dennis makes a plausible argument that the real issue isn’t driverless vs. drivered car but our poor accommodation for pedestrians in traffic design.
https://twitter.com/EricPaulDennis/status/975889922413551616
https://twitter.com/EricPaulDennis/status/975896990214164480
https://twitter.com/EricPaulDennis/status/975898068976533504
All of this causes the lawyer in me to ponder the question of how we’ll handle issues such as liability for personal or property damage in the case of autonomous vehicles. Who gets the blame when a vehicle controlled by computers causes an accident, or when it’s involved in one in a state with a so-called “no-fault” insurance system? And who will be required to maintain insurance on the vehicle?
Obviously, the passenger(s) in the vehicle can’t be held liable unless it can be shown that they interfered with the operation of the vehicle in some way, but how do you hold an algorithm, or the company that created it, liable?
People driving cars are a danger to pedestrians. I was struck by a car late last year while out on my morning walk. The 28-year-old behind the wheel didn’t see me as I crossed straight in front of her. Aren’t you supposed to look in front of you when driving? I’m not a small person and I was about even with the middle of her car hood. She lurched forward, I banged my hands on her car hood to get the moron’s attention, then she lurched forward again, striking me and propelling me into Lantana Road flat on my back. Fortunately for me, no cars were coming at the time.
Before that I had two instances of drivers bumping me with their cars in similar fashion but without injury to me. You know what happened in those cases? The drivers took off.
Drivers pull into pedestrian cross walks or past the white line frequently and with little concern for pedestrians out there.
The articles don’t answer some basic questions. Did the vehicle have video recording? Did the police impound the vehicle or does Uber have it? Where on the vehicle was the impact? It may be that the victim suddenly veered or tripped into the path of the car. As James notes, we don’t have enough information to make judgments. But some general observations are fair.
I fail to see the difficulty of holding the company that created the code liable if the code fails to behave as promised. As far as I’m aware we have no established standards, government or industry, for autonomous vehicle performance. Perhaps we should, soon. Not hitting stationary or slow-moving objects would seem a pretty minimal expectation. Moving over in the lane to increase margin from a pedestrian wouldn’t seem much more demanding.
We will be seeing the AV companies lobbying for changes to roads to accommodate them. You can already see it in this incident. Locally we can’t get money to replace a worn-out bridge carrying multiples of the traffic it was designed for. Heck, we can’t even keep up with pothole repair. If Uber wants our crosswalks redesigned, Uber better pony up the money.
Of course I’d like to know more details of what happened, but the clues so far indicate the car didn’t even try to slow down. If that’s true, it doesn’t really matter what the pedestrian did – the AV definitely failed. It should be able to react to anything that suddenly jumps into the road. Deer don’t usually use crosswalks either.
And a single sensor failure doesn’t explain it. Any properly designed system would have redundancies, and if any critical sensor wasn’t working the vehicle should either force human intervention or simply shut down.
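To make that concrete, here’s a minimal sketch of the kind of health gate I mean. The sensor names, the “critical” flag, and the action strings are all invented for illustration; this is not anything Uber or anyone else has published.

```python
# Hypothetical sensor-health gate; a sketch, not any company's actual logic.
from dataclasses import dataclass

@dataclass
class SensorStatus:
    name: str
    healthy: bool   # produced a valid reading this cycle
    critical: bool  # the vehicle cannot operate safely without it

def autonomy_action(sensors: list[SensorStatus]) -> str:
    """Decide what the vehicle should do given current sensor health."""
    failed = [s for s in sensors if not s.healthy]
    if any(s.critical for s in failed):
        # A critical sensor is down: hand off to the human or stop.
        return "handoff_or_safe_stop"
    if failed:
        # Redundant coverage lost: degrade (e.g., reduce speed) and warn.
        return "degraded_mode"
    return "nominal"

# A LIDAR dropout should never result in "keep driving as normal".
assert autonomy_action([
    SensorStatus("lidar", healthy=False, critical=True),
    SensorStatus("radar", healthy=True, critical=True),
    SensorStatus("camera", healthy=True, critical=False),
]) == "handoff_or_safe_stop"
```

The point isn’t the particular thresholds; it’s that “critical sensor down” has exactly two acceptable outputs, and “keep going” isn’t one of them.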
@gVOR08:
I can answer that question: yes.
All that video is logged and tied to other car data.
Through professional connections I know people working on autonomous vehicle research at a couple major firms. Every vehicle rolling today is constantly recording (both for unfortunate situations like this, to record the performance of the car, and to capture additional data about the behavior of other cars and pedestrians).
To the broader point, the companies (and their research teams) are studying pedestrian behavior (including how people signal to do things like cross the road) as much as they are studying traffic patterns. Late last year, I attended a presentation where researchers shared some of the video they have of pedestrian behaviors from across the world — as anyone who has travelled knows, the way people on foot interact with traffic varies hugely from locale to locale.
Eric Paul Dennis is completely right that city design plays into these issues as well.
@Doug Mataconis: Most of the major transportation companies have had legal teams gaming out the questions you are raising for at least the last five years (and ramping up as these vehicles get on the road). I also believe there is now some existing precedent (in states like California) from accidents that happened during early tests.
Mr. Dennis is correct – this poor pedestrian was doomed to be hit by *somebody* with that kind of layout. It’s just bad luck it happened to be a car testing out new tech instead of your typical careless driver. That people are blaming the tech for not being perfect, as opposed to the roads we all have to share, goes to prove the public would rather have imperfect humans killing people left and right with their stupidity than a car with imperfect code or malfunctioning equipment.
The inevitable lawsuit should be against whomever designed that intersection, not Uber.
@KM: AV developers are using some of my code this very minute. I firmly believe AVs will be far safer than human drivers in the long run. But like I said above, I’m concerned about the sentence “Tempe police say Uber’s Volvo was travelling at about 40 miles per hour and did not slow before hitting 49-year-old Elaine Herzberg”. Detecting objects moving into the car’s path is the first goal. There are terrible intersections and jaywalkers and animal crossings everywhere in the world; an AV *must* be able to handle this very basic situation.
Unless I learn the above statement was incorrect, Uber’s developers failed badly. Perhaps they should let more ethical companies work on getting the basics right before attempting to profit wildly.
@Franklin: Yeah, the “did not slow” thing catches my attention, too. I wonder if that isn’t the judgement of an onlooker as opposed to the recordings in the vehicle. Also, “pushing a bicycle” seems to me to be a mode where the vehicle might have difficulty predicting both the near-future trajectory and speed of the pedestrian/cyclist. How to treat this combination, which I assume is rare?
@Franklin:
I second @Jay’s question, since shape detection seems to be the issue here. Does “detecting objects moving into the car’s path” mean “anything”, or just what is programmed to be recognized as a hazard? Personally, I want the car to slow or stop if it recognizes there’s any sort of blockage up ahead, but that may not be how the developers chose to proceed – after all, there will almost always be a car ahead of you, so you can’t just say “something up ahead in x feet, stop!”.
Can you comment further on how the logic might work?
@Jay L Gischer:
I can’t speak for Tempe, Arizona, but it’s incredibly common in these parts. And, frankly, it’s behavior that’s difficult for human drivers (at least this one) to predict. Bicyclists dismount all the time, for reasons I don’t understand, and walk their bikes across intersections. But when they’re going to change from standing with the bike to suddenly jumping into the road is much less obvious than with someone whose legs aren’t obscured by a bike.
@Doug Mataconis: I would expect that the vehicle’s owner would have to carry insurance, just like every other vehicle, and that the safety record of a particular auto driving technology would be priced into that insurance.
And, if there is a defect that kills many people, victims’ families will sue the manufacturers, just like the recent airbags of death.
Vehicular manslaughter will likely go way down, though, unless we discover a programmer decided to kill people, or a company ignored safety issues.
I don’t see any reason why this will require an entirely new way of thinking about liability.
I do wonder who will reimburse owners when it is discovered that the Uber Autodrive 10000 kills people in grey sweaters on rainy days, and there is no update since the company folded, and they are all taken off the road. Oh, right, no one.
@Jay L Gischer: “Weird object in my path” should have some kind of behavior other than “keep on going”. Deer, dog, person walking a bicycle, small child, yeti… none of these behave normally, and many of them happen regularly.
If a bicyclist crosses your path as you’re driving, you (a human driver) slow down, even if you estimate that you wouldn’t hit them if you both continued at your current speed. All sorts of things can happen — they fall over because they are drunk, etc. — and you want to be able to stop if you need to.
Assume the car is roughly that smart. It probably failed to see they were there at all, not failed to predict their speed.
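In code terms, the policy I’m describing is something like the toy sketch below. The object classes and speed numbers are made up for the example; the only point is that “unknown blob in my lane” must map to braking, never to “keep on going”.

```python
# Toy "slow down for anything in your path" policy; classes and speeds
# are invented for illustration, not taken from any real AV stack.

def target_speed(current_mps: float, in_path: bool, obj_class: str | None) -> float:
    """Pick a target speed given what (if anything) is in the lane ahead."""
    if not in_path:
        return current_mps
    if obj_class is None:
        # Unclassified blob: brake. "Weird object" should never mean "ignore".
        return 0.0
    if obj_class in {"pedestrian", "cyclist", "animal"}:
        # Unpredictable actors warrant slowing even if current trajectories
        # wouldn't collide: they may fall, stop, or dart.
        return min(current_mps, 2.0)  # crawl speed
    return current_mps

print(target_speed(17.9, True, None))          # 0.0: brake for the unknown
print(target_speed(17.9, True, "pedestrian"))  # 2.0: crawl past
```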
James, I agree with your original sentiment. We don’t know yet what happened exactly, but in this case we will definitely find out. There will be video from multiple cameras, and the details of how the car reacted or didn’t react will be recorded to the millisecond. Speculation, including by the cops (how do they know the vehicle did not attempt to slow down?), is unnecessary.
That said, I think there are two things that contribute to the pre-facts speculative frenzy. First, this is Uber, and Uber has shown itself to be a dishonest, reckless and basically sleazy company. Second, the governor of the state made a big deal about how he wanted the AV industry to come to town and he would take care of any regulations they felt inhibited them. It came across more than a little bit as “Screw those pedestrian takers, let the makers do what they want!” I remember just how quickly Uber decamped to this new AV wild west and thought at the time that it was a bad sign. Regardless of the outcome of this investigation, I still think it hurts everyone in the long run to try to attract industry by wink-wink-nudge-nudge don’t-you-worry-your-pretty-little-head-about-those-itty-bitty-rules-and-regulations.
@Gustopher:
There’s an old rule for a person or animal in front of you on the road. Brake hard to stop short, but if you can’t, aim right at them until they commit left or right, then ease off the brake enough to turn and go the other way. I wonder how AVs do it?
@Franklin: Can an AV distinguish between a stationary human and, say, a roadside electrical cabinet? (And do they know not to run through puddles if there are pedestrians on an adjacent sidewalk?)
@MarkedMan:
That’s why I’d like to know if Uber or the cops have possession of the car, which is evidence in a possible manslaughter case. Although I’d have the same concern with Microsoft or GM.
I highly recommend that those who are interested in this topic take time to scan this recently published article at the New York Times:
https://www.nytimes.com/2018/03/19/technology/how-driverless-cars-work.html
Listen to this episode of RadioLab:
http://www.radiolab.org/story/driverless-dilemma/
I think both will help with some of the questions coming up (and probably raise new questions).
@gVOR08:
I upvoted you for the concern about Uber or the cops having possession of the car. But I don’t have the concerns about GM and Microsoft that you do. This type of coverup would be very difficult in a normal large company, because it would involve so many people, especially ones at a low level. The crap they get away with usually has a handful of chief execs altering a policy, with underlings just following procedure and not aware of the shadiness. When a lot of low level people are actually involved in the shenanigans, the secrecy basically falls to pieces the first time investigators start poking around. Look at Wells Fargo.
If you are a low- or mid-level person at a company and some big exec comes in and starts hinting that they would be grateful if you could make a problem go away, know that the way it plays out is this: You will falsify testimony. The executive will never follow through on any implied promises of furthering your career, because if you are caught they don’t want to be associated with you. And if you are shown to have lied, the company will immediately fire you and not offer any legal help whatsoever, because if they do, they are wrecking their story: i.e., that you are a lone bad actor and they would have put a stop to it if they had only known. Worse, since they know you are definitely going down, they will heap any bad actions on you that they can, getting as much trash out the door in one go as possible.
Further investigation revealed that citizen Gowin had already entered the intersection crossing in front of two vehicles in two eastbound lanes stopped for a red light. As Gowin continued forward the light changed to green. All the driver of the SUV in the third eastbound lane approaching the intersection saw was the light change from red to green so he accelerated. The SUV driver did not see Gowin sitting in his wheelchair emerging from in front of the other two vehicles.
A day or so later I heard a woman say: “I saw that accident. The car went five feet in the air!”
As tragic as that accident was, there was no crosswalk at that intersection at the time.
Mr. Gowin was jaywalking.
The update to this story indicates that in all likelihood even a non-self-driving vehicle would not have been able to avoid hitting this pedestrian, who out of the blue stepped in front of traffic. The vehicle was only going 38 in a 35 mph zone (so not really speeding, even if you want to be pedantic and say anything over 35 is speeding), and neither the car’s programming nor the human occupant had any warning or time to apply the brakes to reduce the speed even a little bit.
A really sad story, but oddly enough, if allowed, I would be happy to step into a driverless car right this second, because I do not blame the computer in the car for this tragedy.
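For anyone who wants to sanity-check the “no warning/time” claim, here’s the back-of-the-envelope stopping math in Python. The 1.5-second reaction time and 7 m/s² braking figures are generic textbook assumptions, not measurements from this crash.

```python
# Rough stopping distance at the reported speed; reaction time and
# deceleration are textbook assumptions, not measured values.

MPH_TO_MPS = 0.44704

def stopping_distance_m(speed_mph: float,
                        reaction_s: float = 1.5,
                        decel_mps2: float = 7.0) -> float:
    v = speed_mph * MPH_TO_MPS
    thinking = v * reaction_s            # distance covered before braking starts
    braking = v * v / (2 * decel_mps2)   # distance covered while braking
    return thinking + braking

print(round(stopping_distance_m(38), 1))                  # ~46.1 m (~151 ft)
print(round(stopping_distance_m(38, reaction_s=0.2), 1))  # ~24.0 m, near-instant reaction
```

Even with a near-instant reaction, you need roughly 24 meters of clear road to stop from 38 mph. Nobody, human or computer, beats that physics.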
@MarkedMan: I would mostly agree, but the VW diesel scam argues otherwise.
For those who didn’t follow it, VW basically programmed the engine to recognize that it was being run through a government test cycle and run cleaner than usual until the test was over.
@gVOR08: Good point, and it does give me pause. Although I could argue that once the feds started investigating, the thing fell apart in almost no time. But I should always remind myself that there is no such thing as unbelievably stupid in real life.
@Mister Bluster:..If the driver who killed Gowin could not see the victim in his wheelchair in this situation, is there any hope that an autonomous vehicle would have detected the danger?
Fifty-year-old anecdote from my seventy-year-old brain.
I still trust my memory most of the time. You don’t have to.
As I recall this tragedy, I have to wonder: would an autonomous vehicle traveling on this side street sometime in the future at the legal speed limit of 30 MPH have been able to stop in time to prevent the killing of another innocent child running out from between parked cars?
@Mister Bluster: I still remember the sick feeling from 27 years ago when I was driving down a city street with crowded street parking on both sides. I was lost in thought and suddenly realized I had slammed on my brakes. Just before the car lurched to a halt, a little boy, maybe two and a half, ran from between two parked cars and in front of mine. I stopped maybe six inches from his tiny shoulder. My reaction was so automatic that I had to reprocess what happened, and realized that I had slammed on the brakes because of a peripheral awareness of a young couple walking out of their house with that kid between them. All of a sudden he laughed and ran and I couldn’t see where he had gone, and thank god my pre-conscious reaction was to hit the brakes. If I had been looking at the radio, or my rear view mirror, it would have turned out differently.
So if AVs can ever get to the point where they can see and react to something like that, and never be distracted, then I am all in. In fact, if they get to that point, I don’t think it will be long before juries look on people who drove their own car and got into an accident the same way as someone today who drives a souped-up car.
@MarkedMan:
It’s worth noting that there’s a foreseeable future where AVs not only get to the point where they can deal with that situation, but the changes they bring about make the street conditions you mention (cars parked bumper to bumper on each side of the street) less and less likely.
We’re already seeing some really interesting changes to parking due to the increase of options like Uber and Lyft. I only expect those to accelerate as AVs become increasingly available and using one continues to cost less and less.
I never rely on supernatural intervention for anything.
I don’t see any evidence that it exists.
@MarkedMan: I had the same feeling years ago. Driving I-71 to the airport, I found myself halfway onto the left shoulder going, ‘why am I doing this?’ Looked right just in time to see a Saab who had drifted into my lane bang her side mirror into mine. She looked more scared than I did as she jerked back into her lane.
Atrios has a thing about self-driving cars. He doesn’t think they’ll ever work. I comment occasionally saying they can basically be declared to work. I think he nailed it late last night: Operation Blame the Pedestrian Begins.
Bicycle riders seem unable to follow the rules and oblivious to what a nuisance they sometimes are. Hell, they enjoy it. But I deal with them cautiously and courteously anyway. I’ve only ever brushed back one bicyclist. He was in the regular traffic lane next to an empty bicycle lane.
The rich asshats who run Uber are going to expect the rest of us to put up with a lot, and pay a lot of money, to accommodate their new money-losing business model. As they evolve, AVs will get to where they work well and be wonderful for all of us. But they’ll be on our streets for years before that day.
@Bill: It’s funny you say that, because my first reaction to this article was to think of all the times I’ve had pedestrians straight up walk out in front of my car 50+ yards away from anything resembling a crosswalk while I was still going the speed limit. I swear pedestrians down here are suicidal. They will straight up just cross a road wherever, and if there’s cars using that road they don’t seem to care. That is, unless you drive past them too close; then they flip out. So naturally my first thought was “good, one less moron”… which is pretty awful sounding.
As a single-digit-aged kid I had to cross the busiest street in my town (40 mph speed limit, most went 50+) just to go to T-ball practice and school. Never once did I come anywhere close to causing a problem for any of the drivers. It’s not fcking hard, and I have no real sympathy for idiots who just cross wherever without looking because they think they are so special everyone should have to stop for them. Second on my list are bicyclists who want you to respect them as part of traffic and then ignore the rules of the road as it’s convenient (like blowing through a red light).
@gVOR08: Five seconds with Google and…
The pedestrian is at fault in this case.
@Mister Bluster:
There’s no hope, because in that situation Gowin was being a complete moron and asshole by crossing at the time he did. Anyone with half a brain would have acknowledged the lights were changing and waited. If there had been a pedestrian light it would have been a red hand well before Gowin decided to cross. I’ve seen those kinds of idiots too who believe that since they are in a motorized chair they can just cross in front of cars with impunity regardless of the lack of crosswalks etc.
@Mister Bluster: An autonomous car would have reacted much quicker and possibly saved the kid’s life. What would definitely have saved the kid’s life is if the parents had been paying attention and not let their 2-year-old run around in traffic…
EDIT: The street I crossed as a kid had NO stop signs or stop lights. There wasn’t even a pedestrian crossing marked.
@Matt:.. I’ve seen those kinds of idiots too who believe that since they are in a motorized chair they can just cross in front of cars with impunity regardless of the lack of crosswalks etc.
I’ve seen a lot more able bodied idiots up on two legs cross in front of cars with impunity regardless of the lack of crosswalks…
Haven’t hit one yet.
@Matt:
So it’ll be OK for AVs to kill them? Which isn’t as flippant as it sounds. The makers will deploy AVs when they calculate the profits exceed the likely liability. If they can hit pedestrians who are in the wrong without financial consequences, they may not be properly motivated to avoid doing so. I’m expected to avoid hitting people even if they’re in the wrong. It seems a minimal expectation for AVs.
We know where this is going: civil court. There are deep pockets there, and that means litigation. And Uber doesn’t have a way out, as they designed and own the car and the passenger was an employee.
Of course, people like Uber and Google, as well as this governor, will attempt to remove liability from AVs… in the interest of progress. The autonomous guys are always offering up the solution of just telling people tough, and that they have to adapt to the “new” technology even if it kills them. They seem clueless about how our litigious society works. Look at what they did to Toyota and fly-by-wire acceleration.
@JKB:
There was no fly-by-wire acceleration. Almost every single accident in the “spree” was caused by drivers, mostly in unfamiliar cars, with their foot on the accelerator convinced they were on the brake pedal.
@gVOR08: To be honest? Yeah I’m fine with it. It’s not hard to properly cross a street or to pay attention.
The problem is you can’t just single out AVs for punishment for doing something that human drivers do every day. This person crossed the street in the wrong place in a very unsafe manner and the sensors didn’t pick her up in time. The same thing would have occurred if a human had been behind the wheel. Proof of that is in the fact that the passenger didn’t see the woman quickly enough to override the AI.
@JKB: Electronic throttle control (what you call “fly-by-wire acceleration”) was a thing long before Toyota experienced issues.
@Mister Bluster: Congrats, you’ve been lucky, and statistically you’ll probably keep that luck till your death.
I could balance out your claim by saying that I’ve been trying to run over jaywalkers for decades now and have yet to hit one, but really, would you get the point of such a statement?
EDIT: Here’s a hint… personal anecdotes are statistically irrelevant. I drive 20-30k miles a year. You might drive 5k, in an area with vastly different laws and road designs.
Atrios again has the right take on this.
Atrios points out that Uber claims 3 MILLION (Dr. Evil) miles of testing and now one fatality. That works out to 333 fatalities per billion vehicle miles. For U.S. vehicles it’s 12.5 per billion.
One might argue that’s just Uber. But Uber is developing their own unique cars, hardware and software. An equivalent of a type certificate for Microsoft had better not cover Uber. And it better be based on way, way more than 3 million miles of testing. I’ve driven over 20,000 miles a year for 50 years. That’s a MILLION miles without killing anyone.
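The arithmetic, for anyone who wants to check it (with the obvious caveat that one event in 3m miles is far too small a sample to estimate a real rate):

```python
# Reproducing the naive rates quoted above.
uber_fatalities, uber_miles = 1, 3e6     # Uber's claimed autonomous test miles
us_rate_per_billion = 12.5               # ~1.25 deaths per 100M vehicle miles

uber_rate_per_billion = uber_fatalities / uber_miles * 1e9
print(uber_rate_per_billion)                         # ~333 per billion miles
print(uber_rate_per_billion / us_rate_per_billion)   # ~27x the U.S. average
```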
Everyone seems to be assuming the victim behaved stupidly. PEOPLE DO THAT.
@James Joyner:
My point. Toyota paid even though the technology wasn’t the problem.
But Uber has a big problem. The video is out. Yes, it’s tough for a visible-light-only driver, i.e., a human. However, the AV supposedly has 360-degree radar, LIDAR and a camera array (presumably with low-light/infrared capability). The victim was in the road, not obstructed by anything but shadow. An attentive human driver could possibly have braked or steered to lessen the impact when she came into the light. And, yes, you don’t slam on the brakes to avoid a deer, as that just increases the risk of going out of control, but who would do that cold calculation for a human?
Basically, the autonomous vehicle, with all its sensors and technology, has achieved distracted-human-driver equivalence.
@KM: Sorry, didn’t get back to this thread until now.
The different types of sensors have different capabilities. A ‘normal’ camera just picks up color at every pixel. A LiDAR camera can pick up the distance at each pixel. Some sensors may be able to directly measure the relative speed at each pixel, but it’s usually calculated indirectly from the other data.
Following a car ahead of you that is going the same speed is going to show up as some blob with a relative speed of zero. This person pushing a bicycle should have appeared as some blob with a big incoming relative speed.
By the way, any developers putting cars on the road should be at the stage where they can group a bunch of pixels and at least consider it a single blob (‘object’ or ‘obstacle’ are more formal terms, but I like ‘blob’).
Yes, they are working on the next step which is *identifying* these blobs, with varying success. That gives some context as to what they might expect the blob to do next (or if in fact that blob is a dangerous heavy rock vs. a light chunk of snow that is falling off somebody’s car). While this context is important for the AV to act as intelligently as possible, I don’t know that it is considered critical at the current stage. In any case, it would be tough to argue that it is okay to hit *anything* as big as a person & bicycle without some sort of reaction.
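If it helps, here’s a deliberately oversimplified sketch of that grouping step. The greedy clustering rule, the 2-D points and the one-meter gap threshold are toy assumptions; a production perception stack does vastly more.

```python
# Toy blob grouping: cluster nearby 2-D "lidar" returns, then estimate a
# blob's closing speed from two frames. Purely illustrative.
from math import dist

def cluster(points: list[tuple[float, float]], max_gap: float = 1.0):
    """Greedily group points that are within max_gap of an existing blob."""
    blobs: list[list[tuple[float, float]]] = []
    for p in points:
        for blob in blobs:
            if any(dist(p, q) <= max_gap for q in blob):
                blob.append(p)
                break
        else:
            blobs.append([p])
    return blobs

def centroid(blob):
    xs, ys = zip(*blob)
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Two frames 0.1 s apart; the blob ahead (y = distance) closes by 1.5 m.
frame_a = [(0.0, 30.0), (0.4, 30.2), (0.2, 29.8)]
frame_b = [(0.0, 28.5), (0.4, 28.7), (0.2, 28.3)]
ya = centroid(cluster(frame_a)[0])[1]
yb = centroid(cluster(frame_b)[0])[1]
print((ya - yb) / 0.1)   # 15.0 m/s incoming relative speed: a big red flag
```

Once you have that incoming-relative-speed number for a blob, reacting to it doesn’t require knowing whether the blob is a pedestrian, a bicycle, or a yeti.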
BTW the video taken by the car during the accident is out.
https://www.youtube.com/watch?v=Cuo8eq9C3Ec
Are you really going to tell me that you would have reacted quicker as a driver? That section of roadway is a terrible place to cross, and the lighting doesn’t help. That isn’t a crosswalk area at all.
The whole area has tons of automated cars driving around (Waymo, Uber and even Intel). It’s become a sort of game to spot the automated car and who owns it. You know, like the slug bug thing when you were a kid.
@Matt: I might have had time to let my foot off the gas, something the AV apparently didn’t do. And it has enough sensors to spot that obstacle probably a hundred meters before I do.
Again, I say this as somebody who’s partly connected to the industry and who thinks AVs will absolutely be safer than humans. But somehow Uber completely blew this one; it’s indefensible. Technically they might not be liable, since the person was doing something stupid, but their AV tech needs to be thrown out and replaced with something bought from somebody who is more serious.
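And on the sensor-range point: even if my hundred-meter figure is only a rough guess rather than a published spec, the margin it buys is enormous. A quick check:

```python
# Margin bought by a ~100 m detection range at roughly the reported speed.
# The 100 m figure is my rough estimate, not a measured specification.
MPH_TO_MPS = 0.44704

speed_mps = 40 * MPH_TO_MPS      # ~17.9 m/s
detection_range_m = 100.0
print(round(detection_range_m / speed_mps, 1))   # ~5.6 s to brake or steer
```

Five and a half seconds is an eternity for a computer; even easing off the throttle in that window changes the outcome.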
@Franklin: Where did you see the claim that the AV didn’t react at all?
Obviously Uber and the others (Waymo/Intel) who operate in the area are already looking very carefully at what happened. Next time something like that happens the AV might actually be at fault, which will result in a nice fat lawsuit.