This doesn't look good for self-driving cars.

I think what has to happen is a government-standard autonomous-system driving test. A system has to pass dozens of tests to be certified for use on a public road. The system also needs an accident recording system to verify that what it did before the accident was in compliance with the testing. If it failed to work right, the manufacturer gets sued. If it failed because of poor maintenance or improper driver override, the driver/insurance gets sued.
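To illustrate the kind of recorder I mean, here's a rough sketch in Python (the field names and rates are mine, not from any real system): a fixed-size ring buffer that keeps the last 30 seconds of control state and freezes on impact, so there's an untouched record of what the system saw and did.

Code:
# Rough sketch of a pre-crash event recorder (illustrative only; field
# names and rates are made up, not from any real system).
from collections import deque
import time

class EventRecorder:
    def __init__(self, seconds=30, hz=10):
        # Fixed-size buffer: the oldest samples fall off automatically.
        self.buffer = deque(maxlen=seconds * hz)
        self.frozen = False

    def log(self, speed_mps, steering_deg, brake_pct, detections):
        if not self.frozen:
            self.buffer.append({
                "t": time.time(),
                "speed_mps": speed_mps,
                "steering_deg": steering_deg,
                "brake_pct": brake_pct,
                "detections": detections,  # whatever the perception stack reported
            })

    def freeze(self):
        # Called on impact, so investigators get the final 30 s untouched.
        self.frozen = True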
Right now I suspect these systems are hesitant to slow down and go into a potential avoidance mode because it makes the system seem "jumpy" or nervous. If I see someone on a bike approaching the road perpendicular at a high rate of speed, even 100' away, I will at least get ready to do something if needed. If it's a kid dragging their feet and looking scared, then obviously I would stop or slow long before they got in front of me. I'd guess that right now an automated system does a bad job of identifying when something coming up looks wrong, especially when it's approaching the road perpendicularly. Or it can't recognize when it doesn't have enough visual cues to maintain its current speed safely. Sure, lots of people drive less safely than is optimal too, of course, but they can't be engineered to be safe where an automated system can.
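That "get ready to do something" judgment is basically a time-to-collision comparison, which is easy to sketch. Here's a back-of-envelope version in Python (the thresholds and numbers are my guesses, not from any real system):

Code:
# Back-of-envelope check for something approaching the road perpendicular
# to my path. All thresholds are guesses, not from any real system.
def crossing_risk(obj_dist_to_road_m, obj_speed_mps, my_dist_m, my_speed_mps):
    if obj_speed_mps <= 0 or my_speed_mps <= 0:
        return "watch"                             # not closing; keep an eye on it
    t_obj = obj_dist_to_road_m / obj_speed_mps     # when it enters my path
    t_me = my_dist_m / my_speed_mps                # when I reach that point
    gap = abs(t_obj - t_me)
    if gap < 2.0:
        return "brake"                             # arriving together; slow down now
    if gap < 5.0:
        return "cover_brake"                       # the human "get ready" response
    return "watch"

# Fast cyclist 100' (~30 m) from the road at 15 mph (~6.7 m/s), me 120 m out
# at 40 mph (~17.9 m/s): it reaches my path ~2 s before I do -> "cover_brake".
print(crossing_risk(30.0, 6.7, 120.0, 17.9))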
 
Originally Posted By: eljefino
I'm waiting for a snowstorm where all the automated 18-wheelers call it quits and park like zombies on the shoulders... or in the travel lanes.


Or create a huge pileup someplace.
 
Initial reports are that the woman walked out in front of the vehicle, not at a crosswalk. There was little time to react. Whether it was a human or a self-driving car probably made no difference, according to reports. I think I will wait for the final accident report before I condemn or praise self-driving cars.
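"Little time to react" is easy to put rough numbers on. A quick stopping-distance calculation in Python (generic textbook figures, not anything from the accident report):

Code:
# Rough stopping distance at ~40 mph. Reaction time and braking force are
# generic textbook figures, not from the accident report.
v = 40 * 0.447                       # 40 mph -> ~17.9 m/s
reaction_s = 1.5                     # commonly cited perception-reaction time
decel = 0.8 * 9.81                   # hard braking on dry pavement, ~0.8 g

reaction_dist = v * reaction_s       # ~26.8 m covered before braking starts
braking_dist = v ** 2 / (2 * decel)  # ~20.4 m to stop once braking
print(round(reaction_dist + braking_dist, 1))   # ~47 m (~155 ft) total

If the pedestrian appeared much closer than that, neither a human nor a computer stops in time.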
 
Originally Posted By: Shannow
Originally Posted By: PeterPolyol
We all knew this day was coming.


Now comes the elephant in the room that I've been pointing at for ages.

Who gets charged, and who gets sued ?

Who made the decisions that the car "made" through their programming of said car ?

For example, in Oz law, if I grab the steering wheel or other controls while my wife is driving, I equally own the outcome in an accident...been drinking, then it's a DUI, from the passenger seat.

Yeah man, already cringing at the litigious outcomes of this kind of scenario: having to prove fault of the hardware or programming, no recourse due to responsibilities disclaimed by prepared parties, jamming or other such disruptive noise fields, sabotage (by competitors *ahem* google, or hackers)?

The reliance on road lane guidelines and visual landmarks is a good point too (and outright scary to consider TBH): when was the last time you couldn't drive straight without lane markers? The machines are simply too stupid to know (IMO) when something is just out of the ordinary, and machines can't experience premonition and intuition like humans can. Some people can pooh-pooh that phenomenon, but I know for sure 'premonition' and 'intuition' have allowed me to avoid some real bad situations. The machines only know what they're told beforehand (including machine-learning rules) and are limited strictly to reactions after the fact. All the driver-aid tech we're seeing mandated in vehicles is really just real-time field testing and development for robot drivers (with real humans and their insurance policies assuming the risks for this beta field trial). Seems clear to me that the tech is still in its infancy and a huge risk, thus the very limited trials and vehicles OTR.
The only praise I'm seeing for robot drivers is from those that compare them (a very limited data set, mind you) to the worst human drivers making the worst decisions. Too easy and hardly comprehensive.

Another thing we're not seeing or able to test is how an entire roadful of independent radar-guided vehicles will play with each other; if one experiences a massive unexpected error, what will the others do?
Originally Posted By: zzyzzx
Originally Posted By: eljefino
I'm waiting for a snowstorm where all the automated 18-wheelers call it quits and park like zombies on the shoulders... or in the travel lanes.


Or create a huge pileup someplace.

And then there are the weather considerations and crises... No way in heck I'd be sitting in a robot car trying to evacuate a wildfire or something; heck, I wouldn't even want to be on the same road as them in that event, with them probably winding up disabled as obstructions preventing others from travelling. Earthquakes, floods, lol, forget it. We as a society should probably stop disrespecting our own human conscious capacity in favor of propping up our technological arrogance.
 
Regarding the arguments for and against self-driving cars, I think it is totally unacceptable to have more than 32,000 people killed in motor vehicle accidents here in the United States each year. And every year a similar percentage of the population is killed in other countries that have motorized vehicles.

OK, the current level of the technology is nowhere near perfect. It might even be below the level that should be required before allowing it on public roads. But once the technology is perfected, it will literally save tens of thousands of lives each year. It is just a matter of the time, money, and man-hours that must be thrown into the projects before it is perfected.

The current number of deaths and injuries from motor vehicle accidents is too easily accepted and even ignored. I remember the protests to stop the Vietnam War, and how those protests were on the TV news so often, yet fewer people were being killed each year by that war than are being killed each year in motor vehicle accidents. If people realized that self-driving vehicles could save tens of thousands of lives each year, perhaps the general public would pressure the government to fund the development of this technology at a quicker pace.
 
Originally Posted By: PeterPolyol
The machines are simply too stupid to know (IMO) when something is just out of the ordinary, and machines can't experience premonition and intuition like humans can. ...


Actually, there are some cases where machines can see things going on ahead that are out of visual range.
 
Originally Posted By: JimPghPA
Actually, there are some cases where machines can see things going on ahead that are out of visual range.


That would probably be some type of pedestrian-detection system that many carmakers have, usually some kind of infrared HUD system.
 
Originally Posted By: JimPghPA
Regarding the arguments for and against self-driving cars, I think it is totally unacceptable to have more than 32,000 people killed by motor vehicle accidents here in the United States each year. And every year there is a similar percent of the populations killed in other countries that have motorized-vehicles.


I'm with you, Jim; I think most people are not happy with road fatalities in any amount. The heart's in the right place, but robot drivers may not at all be the solution they're sold as.

Originally Posted By: JimPghPA
Actually, there are some cases where machines can see things going on ahead that are out of visual range.

I've seen the commercials where radar sensors can detect an object changing velocity beyond the vehicle directly in front. They can 'see' something, but most of the time they don't know what it is. There have already been recalls to correct regular driver-assisted vehicles for halting when they shouldn't.
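For what it's worth, the principle in those commercials is simple: radar reports range and range-rate for each return, including one bouncing under or past the lead car, so a sudden change in closing rate two vehicles ahead can be flagged. A toy version (the data layout here is made up):

Code:
# Toy illustration of watching a radar return beyond the lead car. The
# data layout is made up; radar genuinely reports range-rate per return.
def hard_brake_ahead(track, threshold_mps2=-4.0):
    # track: list of (timestamp_s, range_rate_mps) samples for one return;
    # negative range-rate means the gap is closing.
    (t0, rr0), (t1, rr1) = track[-2], track[-1]
    rel_accel = (rr1 - rr0) / (t1 - t0)   # change in closing rate
    return rel_accel < threshold_mps2     # braking harder than ~0.4 g

print(hard_brake_ahead([(0.0, -2.0), (0.5, -5.0)]))   # True: closing fast

As the post says, though, the radar knows something is decelerating; it still doesn't know what it is.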

https://www.technologyreview.com/s/603885/autonomous-cars-lidar-sensors/
Quote:
Radar sensors can’t see much detail, and cameras don’t perform well in conditions with low light or glare. A Tesla vehicle ran into a tractor-trailer last year, killing the car’s driver, after the Autopilot software couldn’t make out the trailer against a bright sky.

Tesla uses only radar and cameras, btw. The third piece of the puzzle is scanning lidar, making three basic technologies used in most self-driving programs: radar, cameras, and lidar.

Lidar is used to build a 3D map on the fly; the units are those conspicuous domes you see on top of the self-driving vehicles. Scanning lidar is bulky, expensive, in very high demand at the moment (hence the current availability issues), and not as reliable as we would like, as mentioned in that article.
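The reason for running all three anyway is that their failure modes differ: radar works through glare and fog but has little detail, cameras classify objects but hate low light, and lidar gives geometry but is the weak link above. A crude late-fusion vote shows the idea (purely illustrative; real stacks fuse tracked objects and probabilities, not booleans):

Code:
# Crude two-of-three vote across sensor types (purely illustrative; real
# systems fuse tracked objects and probabilities, not booleans).
def obstacle_confirmed(radar_hit, camera_hit, lidar_hit):
    # Any two modalities agreeing beats one alone, since each fails under
    # different conditions (glare, darkness, rain).
    return sum([radar_hit, camera_hit, lidar_hit]) >= 2

print(obstacle_confirmed(True, False, True))   # camera blinded by glare -> True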

"Waymo", another company with a stupid name, is trying their own implement of visual, radar and lidar sensors configuration.
https://medium.com/waymo/introducing-waymos-suite-of-custom-built-self-driving-hardware-c47d1714563
Small improvements, but it leaves plenty to be desired for an ultimately safe product, IMO.


Lidar can be easily jammed as well, severely compromising the safe operation of self-driving vehicles. Newer technologies to replace, or at least supplement, lidar for 3D mapping are still in the pipeline, apparently. The self-driving program has a LOT of work and development still left before I'd ever consider it safer than human drivers in a large-scale deployment.

Seems like the safest autonomous drivers to date are bus drivers
You're not going to get this kind of critical situational awareness from code, that's for sure!
(Warning: Graphic)
 
Originally Posted By: PeterPolyol

Lidar can be easily jammed as well, severely compromising the safe operation of self-driving vehicles. Newer technologies to replace, or at least supplement, lidar for 3D mapping are still in the pipeline, apparently. The self-driving program has a LOT of work and development still left before I'd ever consider it safer than human drivers in a large-scale deployment.


I'm not sure why that's a big negative. Eyes can easily be jammed too, whether by shining lasers into a driver's eyes (about as common as lidar jamming) or by the usual rain, snow, and fog.
 
Originally Posted By: bdcardinal
Never underestimate the ability of the general public to violate the laws of physics on a daily basis.

I get nervous enough being in a vehicle with someone else driving, I couldn't stand being in an automated vehicle.


Do you get nervous riding automated trams/trains or other autonomous transportation that has been around for a number of years now?
 
Almost all commercial flying is done with autopilot controlling the plane.

Seems an even safer form of transportation statistically.

UD
 
Originally Posted By: bdcardinal
Never underestimate the ability of the general public to violate the laws of physics on a daily basis.

I get nervous enough being in a vehicle with someone else driving, I couldn't stand being in an automated vehicle.


Just exactly which laws of physics is the general public breaking? Speed of light?

Even Scotty couldn't change the laws of physics.
 
Originally Posted By: UncleDave
Almost all commercial flying is done with autopilot controlling the plane.

Seems an even safer form of transportation statistically.

UD


The sky/atmosphere has an exponentially larger area to operate in safely, without encountering nearly as much traffic, than the entire surface of the earth does. Then consider the percentage of the earth's surface that is motorway, and the type of traffic seen in that limited road space relative to aircraft. Then again, consider the level of precision required to keep road-travelling vehicles from contacting each other vs. the level of precision (or rather the spatial margin of error) that an aircraft needs to maintain to avoid contact with other airborne aircraft.

GPS precision is plenty for aircraft, not so much for vehicles
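The margins really are orders of magnitude apart. Rough numbers (typical figures, not exact specs):

Code:
# Back-of-envelope margins; typical figures, not exact specs.
gps_error_m = 5.0         # common consumer GPS accuracy under open sky
lane_width_m = 3.7        # standard US freeway lane
aircraft_sep_m = 9260.0   # ~5 nautical miles of en-route lateral separation

print(gps_error_m / lane_width_m)     # ~1.35: more than a full lane of uncertainty
print(gps_error_m / aircraft_sep_m)   # ~0.0005: a rounding error for aircraft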
 
Originally Posted By: PimTac
Originally Posted By: bdcardinal
Never underestimate the ability of the general public to violate the laws of physics on a daily basis.

I get nervous enough being in a vehicle with someone else driving, I couldn't stand being in an automated vehicle.


Do you get nervous riding automated trams/trains or other autonomous transportation that has been around for a number of years now?


They operate on tracks, can't steer, and other vehicle types can't, and aren't allowed to, operate on those tracks without authorization (like a hi-rail). Nevertheless, just check Amtrak's history, and certainly someone out there will be concerned for their safety
 
Originally Posted By: UncleDave
Almost all commercial flying is done with autopilot controlling the plane.


Not many people step out into the sky in front of an airliner. Even the other airliners are typically kept tens of miles away when they're not near an airport.

And, despite the easy environment, autopilots regularly fail, and drop control back into the hands of the pilots. At which point the pilots may fly a perfectly good airliner into the sea.
 
Originally Posted By: Wolf359
Originally Posted By: bdcardinal
Never underestimate the ability of the general public to violate the laws of physics on a daily basis.

I get nervous enough being in a vehicle with someone else driving, I couldn't stand being in an automated vehicle.


Just exactly which laws of physics is the general public breaking? Speed of light?

Even Scotty couldn't change the laws of physics.


But Scotty did help invent transparent aluminum.
 
Originally Posted By: IndyIan
I think what has to happen is a government standard autonomous system driving test.


And, as anyone in IT can tell you, when there's a mandatory benchmark, everyone programs to the benchmark (remember VW's diesels?). You'll get systems that work perfectly on the test and behave very differently elsewhere.

This is potentially a huge problem with self-driving cars, which use neural networks that will work perfectly in some conditions and fail horribly after a slight change to those conditions. I recently read of one example where researchers added a couple of lines to a stop sign, and the computer that had previously recognized it as a stop sign then decided it was a cat.
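Those stop-sign results come from what's called adversarial-example research. The textbook trick (the "fast gradient sign method") takes only a few lines; this sketch assumes a PyTorch image classifier and is purely illustrative:

Code:
# Minimal fast-gradient-sign-method (FGSM) sketch; assumes PyTorch and a
# trained classifier. Each pixel gets nudged slightly in whichever
# direction most increases the loss -- often enough to flip the predicted
# class while the image looks unchanged to a human.
import torch
import torch.nn.functional as F

def fgsm(model, image, true_label, epsilon=0.01):
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()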
 
Originally Posted By: PimTac
But Scotty did help invent transparent aluminum.



Yeah, I think he could bend the rules a little too. Too bad he never went over what he was doing in that Dyson sphere.

Reminds me of 186,000 miles per second, it's not just a good idea, it's the law!

Anarchy, it's not the law, it's just a good idea.
 
Viewing the just-released dash-cam video from the self-driving vehicle, IMO I don't think a human-driven/controlled vehicle could have avoided hitting the stupid woman. Also, it appears that the vehicle DID apply the brakes just before hitting her, just like a quick-acting human driver would have, and the vehicle came to a complete stop immediately after hitting her. Based on the circumstances and the video, I don't think a lawsuit could be won, not if I were a juror.
 
Originally Posted By: PeterPolyol
The sky/atmosphere has an exponentially larger area to operate in safely... GPS precision is plenty for aircraft, not so much for vehicles


Yet the complexities of commercial traffic around an airport are extreme; add to that the z dimension, asymmetrical thrust, and the operator issues that come into play (think the Air France Airbus crash).

The safety improvements I've seen measured to date demonstrate that each piece of automation adds to overall safety and crash avoidance.

The IIHS/HLDI has some pretty good data here.

Anyone have any data?

UD
 