Self-driving Uber car killed a pedestrian

Status
Not open for further replies.
There is a bill in Congress to prevent victims from suing when hit by an autonomous car; you have to use arbitration, which is secret.
And when you accept Uber's terms, you agree to that now.
Not good.
 
The bicyclist could be at fault. You or I could hit a biker if they darted in front of us. We don't know the facts yet.
It will be determined whether the car sensed the biker and, if it did, whether the brakes were slammed on as a human would do.
I know my reflex to hit the brakes hard is good, since it's been tested a few times. Yet we are all capable of being slow to brake, and remember that it takes a few hundred milliseconds for a human brain to actuate the brakes.
 
Originally Posted By: edwardh1
There is a bill in Congress to prevent victims from suing when hit by an autonomous car; you have to use arbitration, which is secret.


Complete bull dookey. The same liability should apply to a vehicle regardless of whether it is driven by a person or self-driven. The person on board has the responsibility.
 
Originally Posted By: edwardh1
There is a bill in Congress to prevent victims from suing when hit by an autonomous car; you have to use arbitration, which is secret.
And when you accept Uber's terms, you agree to that now.
Not good.


That's crazy! I hope that bill goes down in flames. What's next, you can't sue bridge builders?
 
Quote:
There is a bill in Congress to prevent victims from suing when hit by an autonomous car; you have to use arbitration, which is secret.
And when you accept Uber's terms, you agree to that now.
That can't be right. Maybe for the person inside the Uber, like a passenger of a self-driving car who died in a crash. But if someone outside the Uber, who is not a passenger, gets hit, they're not bound by any terms they haven't agreed to.

As for this issue, I think all that's known right now is that someone was hit and killed. We don't yet know for certain what the exact circumstances are. I read that she was crossing outside of a crosswalk. Hypothetically, if she darted out from between parked cars at the side of the street, I'm not aware of any human or AI that can deal with that, depending on the distance and speed. If she stepped out one foot from the front bumper, for example, she's getting hit; it's not physically possible to react that fast or to stop a car in that distance.

I still don't see anything that tells me this specific incident was less safe than a human driver. Now cases like the guy who drove right into the side of a truck with a Tesla? That one is less safe than someone with their hands on the wheel, paying attention. But then again in that case the Tesla wasn't as autonomous as this Uber probably was.
 
There are about 6,000 pedestrian deaths in the US each year.

Every day, 16 drivers hit and kill a pedestrian.

Using these statistics, this year's tally (so far) would be: 1,247 pedestrians killed by a human operating a car, 1 pedestrian killed by a computer operating a car.

This is just a look at the stats, not a judgment. The statistics show that humans do a terrible job when it comes to pedestrian safety (I am sure responsibility falls on both sides: pedestrians and drivers).
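The figures above hang together arithmetically. As a quick sanity check (the dates are my inference from the numbers, not something the post states):

```python
# Sanity check on the thread's figures: ~6,000 US pedestrian deaths per year
# implies ~16 per day, and a year-to-date count of 1,247 corresponds to
# roughly 76 days into the year, i.e. mid-March.
deaths_per_year = 6000
deaths_per_day = deaths_per_year / 365      # ~16.4 per day
days_elapsed = 1247 / deaths_per_day        # ~76 days

print(f"{deaths_per_day:.1f} deaths/day, ~{days_elapsed:.0f} days elapsed")
```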

On a personal note: it had to be Uber, right? The company playing catch-up. The one found guilty of stealing Waymo's secrets to bolster its own autonomous program. The company busted for doing numerous nefarious things to get ahead. The one booted from testing in San Francisco for refusing to buy the proper permits. Is it any coincidence that the company that has been caught being reckless time and time again has the first autonomous death on its hands? Is anyone surprised?
 
Originally Posted By: MrHorspwer
Using these statistics, this year's tally (so far) would be: 1,247 pedestrians killed by a human operating a car, 1 pedestrian killed by a computer operating a car.
This is just a look at the stats, not a judgement. The statistics show that humans do a terrible job when it comes to pedestrian safety (I am sure responsibility falls on both sides: Pedestrians and drivers).

Not quite "statistics". You have to factor in how many autonomous cars are out there versus how many human-driven cars are out there. See the difference?
Assuming there are 50 autonomous cars driving around this year, you'd have 1/50 = 0.02 deaths per vehicle for autonomous cars. Now say there are 100 million cars driving around the U.S. with actual human drivers every day; that's 1,247/100,000,000 = 0.00001247 deaths per vehicle.

See the massive difference so far this year?
(Actually, there are over 200,000,000 vehicles registered in the U.S., and I assume less than half are very active day to day.)

Edit: I just found a report that says we have about 50 self-driving cars in the U.S. total, not 500 as I first guessed.
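The per-vehicle comparison in the post above can be sketched as a quick back-of-the-envelope calculation. The fleet sizes and death counts are the thread's own rough guesses, not official figures:

```python
# Back-of-the-envelope per-vehicle fatality rates using the thread's
# ballpark numbers (all inputs are assumptions from the posts above).
human_deaths_ytd = 1247        # pedestrians killed by human drivers so far this year
human_fleet = 100_000_000      # human-driven cars assumed active daily
av_deaths_ytd = 1              # pedestrians killed by an autonomous car
av_fleet = 50                  # autonomous test cars assumed on the road

human_rate = human_deaths_ytd / human_fleet   # ~0.0000125 deaths per vehicle
av_rate = av_deaths_ytd / av_fleet            # 0.02 deaths per vehicle

print(f"human rate:      {human_rate:.8f} per vehicle")
print(f"autonomous rate: {av_rate:.2f} per vehicle")
print(f"ratio: {av_rate / human_rate:.0f}x")
```

On these (very crude) assumptions the autonomous fleet's per-vehicle rate comes out roughly 1,600 times higher, though a fair comparison would also need miles driven per vehicle, not just fleet size.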
 
Last edited:
Originally Posted By: edwardh1
There is a bill in Congress to prevent victims from suing when hit by an autonomous car; you have to use arbitration, which is secret.
And when you accept Uber's terms, you agree to that now.
Not good.

How about some documentation?
 
I go to ASU where these Uber Volvos are driving around all the time, I see probably 5 a day. They do a good job.

The woman walked out into the street and the car that hit her happened to be an autonomous.
 
It's unfortunate what happened to this person. Do we know the details yet? Could they have some fault here? I'm just asking.
You can't stop progress. Whatever happened here will have an effect, correcting problems and making autonomous cars safer. In the future it will cost more to insure a human driver than an autonomous car, because the autonomous car is far safer than a human.
What happened first: the stop sign, the traffic light, striped roads, divided highways, or the automobile? Once automobiles came along, those things had to be learned, and people lost their lives. This is no different. Progress is always evolution and trial and error.
TOTO.
 
Originally Posted By: Nick1994
I go to ASU where these Uber Volvos are driving around all the time, I see probably 5 a day. They do a good job. The woman walked out into the street and the car that hit her happened to be an autonomous.

We know the woman could have just darted out there, walking or jogging with her bike beside her, without looking into traffic. It could be her fault. Maybe the Uber Volvo tried to stop in time. Witnesses, I hope, can help clear it up. Should the Volvo have seen her? Maybe, maybe not. Tragic.

All I saw in the article's pictures was a dent on the corner of the hood, meaning it wasn't a square, middle hit. That could be one clue.

All that said, there should still be some accounting of whether the sensors can "blank out" at times, and whether the operating system and algorithmic software make detection and brake-actuation glitches. Think of safety-critical avionics software on airplanes, which the FAA reviews to make sure a software and/or hardware bug won't crash the airplane.
 
Last edited:
Idiots think self-driving 18-wheelers are around the corner...


Lawyers just drool at the idea of self-driving commercial trucks.
 
There's a spinning camera on the roof. I wonder if it records too? Or if there's a dashcam?
 