This doesn't look good for self driving cars.

People get killed every day by human-driven cars. Guess that's so common that it's not news.
 
Originally Posted By: Leo99
People get killed every day by human-driven cars. Guess that's so common that it's not news.


The big difference here is that Uber gets to be the target of a lawsuit rather than an individual human driver.

Sad news, my condolences to the lady's family and friends.
 
The human backup did not have time to react, so it must have happened suddenly.

The lady did not cross at a crosswalk. Possibly some zombie, face-deep in her phone.
 
Just goes to show that programmers canNOT possibly prepare a self-driving vehicle for every circumstance... I am against them.
 
Originally Posted By: gathermewool
The human backup did not have time to react, so it must have happened suddenly.


No. As I keep saying, from working in an automated industry: the further you remove the operator from the action by taking direct control away from them, the longer they take to identify that an issue is occurring, fully assess the situation, and then take corrective action... and even then it's often the wrong action.

This is well known from every automated industry.

But the car people refuse to acknowledge what's been known for a half century, and come out with motherhood statements like "the driver is still the ultimate control".
 
Never underestimate the ability of the general public to violate the laws of physics on a daily basis.

I get nervous enough being in a vehicle with someone else driving, I couldn't stand being in an automated vehicle.
 
Do an internet search for "wikipedia yearly vehicle deaths" and "wikipedia yearly vehicle deaths united states"

In 2016 there were 37,461 vehicle-related deaths here in the United States, and the toll is roughly that high every year.

Someday autonomous vehicles will do away with the tens of thousands of motor-vehicle-related deaths here in the US, and I say it can't happen too soon.
 
I'm surprised it wasn't going the wrong way on one of our freeways
 
I'm waiting for a snowstorm where all the automated 18-wheelers call it quits and park like zombies on the shoulders... or in the travel lanes.
 
Originally Posted By: Shannow
Originally Posted By: gathermewool
The human backup did not have time to react, so it must have happened suddenly.


No. As I keep saying, from working in an automated industry: the further you remove the operator from the action by taking direct control away from them, the longer they take to identify that an issue is occurring, fully assess the situation, and then take corrective action... and even then it's often the wrong action.

This is well known from every automated industry.

But the car people refuse to acknowledge what's been known for a half century, and come out with motherhood statements like "the driver is still the ultimate control".


I'd say it depends, rather than a flat "no" and that's it.

I use EyeSight adaptive cruise control, but I always keep my foot near the brake while driving, and on the brake when there are other cars nearby. If someone wants to cut me off, my foot hits the brake as fast as my brain-to-leg signal path allows.

I'm sure there are others who cross their legs, sit cross-legged, or whatever. Was the backup driver's attention elsewhere, or did the event happen too suddenly to react? Has that been reported yet?
 
BTW, if we want to compare statistics for the very small number of self-driving vehicles against the much larger number of human-driven vehicles, we should normalize by exposure - for example, deaths per vehicle-day (or per mile) of operation.
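That exposure-normalization point can be sketched with some back-of-the-envelope arithmetic. The 37,461 figure is the 2016 US number cited earlier in the thread; the total human-driven mileage is an approximate public figure, and the autonomous-fleet numbers are invented placeholders purely for illustration:

```python
# Back-of-the-envelope comparison: deaths per 100 million miles driven.
# Only the 37,461 deaths figure comes from the thread (US, 2016);
# total human mileage is approximate, and the autonomous-fleet
# numbers are hypothetical placeholders, not real statistics.

def deaths_per_100m_miles(deaths, miles):
    """Fatality rate normalized to 100 million miles of exposure."""
    return deaths / (miles / 1e8)

human_deaths = 37_461    # US vehicle-related deaths, 2016 (cited above)
human_miles = 3.2e12     # ~3.2 trillion vehicle-miles travelled (approximate)

av_deaths = 1            # hypothetical
av_miles = 5e6           # hypothetical: 5 million autonomous test miles

print(f"human-driven: {deaths_per_100m_miles(human_deaths, human_miles):.2f} per 100M miles")
print(f"autonomous:   {deaths_per_100m_miles(av_deaths, av_miles):.2f} per 100M miles")
```

The point is that raw fatality counts mean nothing until both fleets are divided by how much driving each actually did.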
 
Originally Posted By: bdcardinal
Never underestimate the ability of the general public to violate the laws of physics on a daily basis.

I get nervous enough being in a vehicle with someone else driving, I couldn't stand being in an automated vehicle.


No kidding! That would make me want to walk!

I'm sure that, as this is still very new technology, things will improve greatly once the bugs are worked out, but at the same time the tech shouldn't be going around killing people before we get there!
 
Originally Posted By: irv
Originally Posted By: bdcardinal
Never underestimate the ability of the general public to violate the laws of physics on a daily basis.

I get nervous enough being in a vehicle with someone else driving, I couldn't stand being in an automated vehicle.


No kidding! That would make me want to walk!

I'm sure that, as this is still very new technology, things will improve greatly once the bugs are worked out, but at the same time the tech shouldn't be going around killing people before we get there!


So, being killed by a human driver is an acceptable risk when you hit the road but being killed by an autonomous driver is worse somehow?

If human-driven cars kill about 100 people per day on average, why is the threshold for autonomous vehicles zero per day? Wouldn't a 50% reduction in deaths be worthwhile if we switched to autonomous vehicles? The cause of the crashes is still the same: people doing stupid, unexpected things behind the wheel.
 
We all knew this day was coming.

You need to ask what percentage of vehicles on the road are self-driving. Who wants to plug in the fatality numbers for self-driving versus human-operated vehicles? What would the ratio even be, 60 million to 1? (rhetorical estimate)
 
Originally Posted By: PeterPolyol
We all knew this day was coming.

You need to ask what percentage of vehicles on the road are self-driving. Who wants to plug in the fatality numbers for self-driving versus human-operated vehicles? What would the ratio even be, 60 million to 1? (rhetorical estimate)


Note also where these self-driving cars, and Tesla's Autopilot, are used.

Tesla: straight roads with lane markings... the driver has to do all the really tricky stuff.

Phoenix, Arizona? Sunny, not wet, foggy, or snowy... by their own admission, the best place to introduce them.
 
Originally Posted By: PeterPolyol
We all knew this day was coming.


Now comes the elephant in the room that I've been pointing at for ages.

Who gets charged, and who gets sued?

Who made the decisions that the car "made" through their programming of said car?

For example, under Oz law, if I grab the steering wheel or other controls while my wife is driving, I equally own the outcome in an accident... and if I've been drinking, it's a DUI from the passenger seat.
 
Originally Posted By: Shannow
Originally Posted By: PeterPolyol
We all knew this day was coming.


Now comes the elephant in the room that I've been pointing at for ages.

Who gets charged, and who gets sued?

Who made the decisions that the car "made" through their programming of said car?

For example, under Oz law, if I grab the steering wheel or other controls while my wife is driving, I equally own the outcome in an accident... and if I've been drinking, it's a DUI from the passenger seat.


Oh, Uber gets sued no matter what, because that's where the money is. The "driver" might possibly get named in a civil suit, but that's largely academic, as that person is probably not wealthy enough to be worth pursuing. If Microsoft had a role in the software development, they get sued; Chevy gets sued if they had a role in the self-driving hardware development; and so on.
As for criminal proceedings, that's a really good question. Figuring out who or what is at fault on the criminal side could be a tortuous process. Maybe the "driver" gets blamed for knowingly being behind the wheel of an unsafe, unproven self-driving car? Maybe Uber, because they told him/her to test the thing?
 
In regards to suing, I agree with Shannow: it will be very tricky. In this particular case it's quite easy, but once we have a larger number of autonomous vehicles owned by individual operators rather than corporations, the insurance companies will have a really hard time, IMO. The owners might end up in lots of hot water as well.
 