Self-driving Uber car killed a pedestrian

Status
Not open for further replies.
Since there is a lot of money at stake in these automated cars, the lawyers are going to have a job on their hands discrediting the character of this woman to protect their investors' money. Even if they can't bring it into court, they have to put it out there in the public eye. They don't want the public thinking that these heartless machines are capable of running people down because of a glitch in the system. Sad as it may be, this woman being killed by an automated car, with a zombie behind the wheel as backup, is nothing more than an inconvenience to the "money people". After the case is settled, their thoughts will be on the increase in insurance, not on the family of the lady who died or on grounding the fleet until the problem is patched up.
 
Originally Posted By: OneEyeJack
Since there is a lot of money at stake in these automated cars the lawyers are going to have a job on their hands discrediting the character of this woman to protect their investor's money........

Trav's Daily Mail link above showed this lady's mug shots from six trips to jail for drug use in the past. It seems mean, cruel, and unfair to impugn her now. In the end, the direct evidence (witnesses, dashcams, event recordings, and other computer logs from inside the car) is likely the only permissible evidence in a courtroom, not whether the lady was prone to impulsivity caused by previous drug use.
 
It's standard Daily Mail coverage; I don't think they intended anything but a juicy story to sell more copies of their rag.
 
Originally Posted By: Alfred_B
It's a standard Daily Mail coverage, I don't think they intended anything but a juicy story to sell more copies of their rag.


Wouldn't you think the story would've been even more "juicy" if they had dug up Uber's past shenanigans instead of some unknown woman's history?

Selling more copies or clicks is clearly not the driving factor here.
 
Originally Posted By: mightymousetech
https://www.sfchronicle.com/business/art...#photo-15258163

It's certainly looking more and more like it's the lady's fault. Some homeless people in urban areas have darted out in front of me, but I've always had time to hit the brakes hard and avoid them.

About blaming the hardware and software, I do wonder what scrutiny the authorities give to memory management, compiler reliability, V&V testing, sensor redundancy, wiring integrity, on-board diagnostic reliability, self-test features, Monte Carlo simulations, etc. Same as what's been happening in the aviation world for 40 years now. This isn't really new. New to cars, yes, in some narrow ways.

Not really different from the failing airbags, ABS, stability control systems, brakes, steering, etc. that we all rely on in vehicles to avoid hitting pedestrians or having other crashes.
 
Without knowing what actually happened here: I've had a homeless guy on a bike jump at me while travelling from the opposite direction on the sidewalk. I was going slowly and slammed on the brakes; after I stopped, he adjusted course and didn't hit me.
Then there are all the moron kids travelling on their skateboards with their eyes glued to their phones.
 
These are not anomalies on the road and a self driving car should be designed to handle them. Quite strange that this Uber car wasn't designed for it.
 
Originally Posted By: Alfred_B
These are not anomalies on the road and a self driving car should be designed to handle them. Quite strange that this Uber car wasn't designed for it.


How can you say that when we have no idea what exactly happened?
 
Originally Posted By: Alfred_B
These are not anomalies on the road and a self driving car should be designed to handle them. Quite strange that this Uber car wasn't designed for it.

I think what has to happen is a government-standard autonomous driving test. A system has to pass dozens of tests to be certified for use on a public road. The system also needs an accident recorder to verify that what it did before a crash was in compliance with the testing. If it failed to work right, the manufacturer gets sued; if it failed because of poor maintenance or an improper driver override, the driver/insurance gets sued.
Right now I suspect these systems are hesitant to slow down and go into a potential avoidance mode because it makes the system seem "jumpy" or nervous. If I see someone on a bike approaching the road perpendicularly at a high rate of speed, even 100' away, I will at least get ready to do something if needed. If it's a kid dragging their feet and looking scared, then obviously I would stop or slow long before they got in front of me. I'd guess that right now an automated system does a bad job of identifying when something approaching looks wrong, especially approaching the road perpendicularly, or of recognizing when it doesn't have enough visual cues to maintain its current speed safely. Sure, lots of people drive less safely than is optimal too, but they can't be engineered to be safe the way an automated system can.
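The "get ready early for a fast perpendicular approach" idea above can be sketched as a simple time-to-collision comparison. Everything here is my own illustrative assumption (function name, the 2-second margin, the speeds), not anything from Uber's actual software:

```python
# Hypothetical sketch: flag caution when a crossing object could reach our
# lane at about the same time we reach the crossing point. All thresholds
# are illustrative assumptions, not any vendor's real logic.

def caution_needed(lateral_dist_ft, lateral_speed_fps,
                   car_speed_fps, dist_to_crossing_ft):
    """Return True if the object and the car would arrive at the
    crossing point within a safety margin of each other."""
    if lateral_speed_fps <= 0:  # stationary or moving away from our path
        return False
    t_object = lateral_dist_ft / lateral_speed_fps    # time for object to reach our lane
    t_car = dist_to_crossing_ft / car_speed_fps       # time for car to reach that spot
    margin = 2.0                                      # seconds of margin (assumed)
    return abs(t_object - t_car) < margin

# Cyclist 100 ft to the side closing at 20 ft/s; car 150 ft from the
# crossing point at 44 ft/s (~30 mph): arrival times ~5.0 s vs ~3.4 s.
print(caution_needed(100, 20, 44, 150))   # True: close enough to pre-brake
print(caution_needed(100, -5, 44, 150))   # False: moving away
```

The point of the sketch is that pre-emptive slowing only needs coarse tracking, which is why a system tuned against "jumpy" behavior would have to deliberately suppress it.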
 
Originally Posted By: mightymousetech
Originally Posted By: Alfred_B
These are not anomalies on the road and a self driving car should be designed to handle them. Quite strange that this Uber car wasn't designed for it.


How can you say that when we have no idea what exactly happened?


Pedestrians on the road are not anomalies, including the ones that appear out of nowhere.

The only time hitting one is acceptable is if the car makes a decision to run over the fewest number of people when choosing between alternatives.
 
Originally Posted By: Alfred_B
These are not anomalies on the road and a self driving car should be designed to handle them. Quite strange that this Uber car wasn't designed for it.
We humans can't handle this situation: perpendicular movement, sudden, at night. You or I probably would have hit her. The last article linked has some more clues as to what happened. I don't think we can expect a robot car to outperform us.

I do appreciate that humans have an extra ability to recognize unusual situations and adjust or stop.

What is common is when the road ahead has detour cones and cop cars (an accident scene) and we have to thread our way through the detour path. Can a self-driving robot car do that too? Not sure. I assume they have tried it.

I've wondered what happens when mud or insects get splattered on the forward and side sensors. Then what?
 
Originally Posted By: Alfred_B
Here's the Russian version of Google and their self-driving car in Moscow. It seems to acknowledge road hazards like pedestrians: https://youtu.be/Bx08yRsR9ow
Cool video. (Looks like the Russians are good at stealing Carnegie Mellon University research from the DARPA challenge.)
Any self-driving car needs a forward field of view for its obstacle detectors; 160 degrees would do it. Pedestrian detection can still be defeated if the movement is sudden enough. An investigation will reveal whether there is a weakness. Guinea pigs on the streets will test it all for us....
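On "defeated if the movement is sudden enough": no field of view helps if the pedestrian appears inside the car's minimum stopping envelope. A rough back-of-envelope check, using my own assumed numbers for latency and braking (not anything from the investigation):

```python
# Back-of-envelope: minimum detection range needed to stop in time.
# Distance covered during detection/reaction latency, plus the standard
# braking distance v^2 / (2a). Latency and deceleration are assumptions.

def min_detection_range_m(speed_mps, latency_s, decel_mps2):
    """Metres of clear sensing range needed to brake to a full stop."""
    reaction_dist = speed_mps * latency_s          # travelled before braking starts
    braking_dist = speed_mps ** 2 / (2 * decel_mps2)
    return reaction_dist + braking_dist

# 17.9 m/s (~40 mph), 0.5 s assumed system latency, 7 m/s^2 hard braking:
print(round(min_detection_range_m(17.9, 0.5, 7.0), 1))  # ~31.8 m
```

So with those assumptions, anyone stepping out inside roughly 32 m cannot be avoided by braking alone, regardless of how wide the sensor's field of view is.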
 
The truth is, any of these new technologies will have a period of testing and failure to go through. People will die during the teething problems, as in the early days of airplanes and railroads. The hope is that even with these casualties, the fatality rate will be much lower than the statistics for mature human drivers.
 
I'd like to see how these self-driving cars do in a rain downpour or a heavy snowstorm. Are the sensors still going to work well enough?

It should also be interesting to see how they react to crazy drivers around them.
 
These things are clearly not ready for prime time just yet and may not be during our lifetimes.
It'll be some time before both the states and the feds allow autonomous cars on the public roads as any more than test articles, and even that status will now be seriously questioned following the fatal collision with this pedestrian.
Autonomous systems in random environments just aren't all that good.
The dumbest driver is smarter and more capable than the best autonomous system, whatever EM may think.
 
Originally Posted By: ZeeOSix
I'd like to see how these self driving cars do in a rain down pour or a heavy snow storm. Are the sensors still going to work well enough?
Should also be interesting to see how they react to crazy drivers around them.

I asked that same question a while back on a Tesla thread here. Somebody wrote back saying it works OK.
https://electrek.co/2016/12/28/tesla-autopilot-snow/
 
Originally Posted By: fdcg27
These things are clearly not ready for prime time just yet and may not be during our lifetimes.
It'll be some time before both the states and the feds allow autonomous cars on the public roads as any more than test articles, and even that status will now be seriously questioned following the fatal collision with this pedestrian.
Autonomous systems in random environments just aren't all that good.
The dumbest driver is smarter and more capable than the best autonomous system, whatever EM may think.


According to the reports so far, there was a driver behind the wheel and the lady basically stepped out into the middle of the road. That type of accident happens quite often: about 4,500 pedestrians are killed by cars every year. That number will probably go down once autonomous cars are everywhere, but certain accidents like this one will be unavoidable unless you limit yourself to 10 mph on every road.

The dumbest driver on the road is probably the drunk one or the one high on pot.
 