Self-driving Uber car killed a pedestrian

Not sure who you used to be or if you read the entire thread...

But I'm not an advocate of self-driving cars...

The people (Musk and Uber) who are putting these machines on our roadways are the ones who should be held to account for their products.

Their claim, in the media and in their owner's manuals, that the person who has been removed from every activity bar the activation of emergency protocols is the one responsible is ridiculous, when there is a massive amount of industrial and military experience showing that removing the operator from the process increases their response time and error rate when they HAVE to step in and correct the catastrophe...they (the car people) are claiming cutting-edge, error-free advances while relying on (and blaming) some of the least trained people on the planet to be their emergency back-up.


But they are getting away with it...see the body of the thread...


Again, I'm not in favour of this technology.

Go look at older threads...if I grab the steering wheel while my partner is driving, I share the legal liability for the outcome...the programmers of any of these technologies that even intervene in the operation of the vehicle are in exactly the same position...they share, in proportion to their degree of control of the vehicle, the liability for the outcome.
 
And I’m telling you that you cannot make a programmer responsible for anything, at least not after 30 years of software megacorporations brainwashing everyone into thinking that they have no liability whatsoever.

The only liability you can possibly assign is to those that approved these things on the streets.
 
Originally Posted By: nap
And I’m telling you that you cannot make a programmer responsible for anything,


Why can't they lose their house like anyone else who puts the public or the environment at risk?

Programming the behaviour of an assault vehicle in Halo is remarkably similar to removing the factory-installed systems in a Volvo and sending it onto the street...the consequences of running over a paper bag crossing the road with her bicycle are very different...why are they NOT held to the same standard?

The other bookend of the argument is that the guy who wipes the windshield, unplugs the charging cord, and opens the boom gate as the car exits the parking station is the one who deploys the technology...he has ZERO control over the technology.
 
Short answer: the software industry invests millions every year lobbying against any laws that would make them liable for anything.
 
Originally Posted By: nap
Originally Posted By: Wolf359


The video camera doesn't have the dynamic range that the eyes have. When have you been driving at night in good weather and not been able to see? That's what headlights are for. While unexpected, that's just everyday driving. People do crazy things all the time.


Not really. People will adapt their speed to the visibility conditions. It’s there in the driving manual.

They also have to take an eyesight exam periodically. If they can't see well enough, they don't get a driving license (or a renewal).

Apparently these self driving wonders are exempt from such laws.


If you look at the video, the speed wasn't too crazy, around 40 mph, which was fine for that size of road. If the driver behind the wheel had been paying attention, she would have been able to stop in time without a problem. If anything, I think self-driving cars would drive normally; that type of wide-open road at night would invite speeding by regular drivers.
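
For a rough sense of the margin involved, here's a minimal back-of-the-envelope sketch (in Python) of total stopping distance at 40 mph. The 1.5 s reaction time and 7 m/s^2 braking deceleration are generic dry-pavement assumptions for illustration, not figures from the incident investigation:

    # Rough stopping-distance estimate at ~40 mph.
    # Assumed values (illustrative, not from the incident report):
    # 1.5 s reaction time, 7 m/s^2 braking deceleration on dry pavement.

    MPH_TO_MS = 0.44704  # miles per hour -> metres per second

    def stopping_distance(speed_mph, reaction_s=1.5, decel_ms2=7.0):
        """Reaction distance plus braking distance, in metres."""
        v = speed_mph * MPH_TO_MS           # speed in m/s
        reaction = v * reaction_s           # ground covered before braking starts
        braking = v ** 2 / (2 * decel_ms2)  # v^2 / (2a), constant deceleration
        return reaction + braking

    print(f"~{stopping_distance(40):.0f} m total stopping distance at 40 mph")
    # Prints "~50 m": roughly 27 m covered while reacting plus 23 m braking.

Roughly 50 m in total, more than half of it covered before the brakes are even applied, which is why an attentive driver (or a working detection system) matters far more than the posted speed here.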
 
Originally Posted By: nap
And I’m telling you that you cannot make a programmer responsible for anything, at least not after 30 years of software megacorporations brainwashing everyone into thinking that they have no liability whatsoever.

If the system gets rid of human drivers who are insured as they currently are, the lawyers and insurance underwriters will be sure to work very hard to reverse this. Of course, as always, the consumer still pays.
 
Originally Posted By: nap
It’s all nice and cool until you realize that you may be the next paper bag and nobody would take any responsibility for that.


"Paper bags" also need to learn how to look both ways before they blindly cross the street.


In the heavily populated area I live in, I quite often hear on the news about someone walking across a dark street who got run over and killed, and the drivers never get charged because of the circumstances. Even if someone is paying pretty good attention, it's possible they will not react fast enough. I've even had some yahoos cross in front of me a few times in the dark, and it's really something you're not expecting to happen.
 
Originally Posted By: nap
Short answer: the software industry invests millions every year lobbying against any laws that would make them liable for anything.


That won't stop good lawyers and civil lawsuits.
 
Originally Posted By: ZeeOSix
Originally Posted By: nap
Short answer: the software industry invests millions every year lobbying against any laws that would make them liable for anything.


That won't stop good lawyers and civil lawsuits.


Lawsuits are based on the existence of some laws that someone (allegedly) infringed.

In the absence of any applicable laws it’s very difficult to win a lawsuit, no matter how good the lawyers are.
 
Originally Posted By: nap
Programmers don’t have a “professional engineer” program. The onus is completely on the guys who decided to deploy these things in the streets.


We have a lot of people who will get fired if they don't perform well for 6 months; many top-performing companies fire the bottom 6-10% of their staff per level per year.

A certified professional is not always as competitive after they get their certificate.

The problem with Uber is that they are trying to rush things out before the technology is ready. Like wartime technology development, you'll see a lot of plane crashes and test-pilot deaths in an arms race, automated or not.
 