Model 3 - Full Self Driving - Soon?

Model 3 around Portola Valley

This drive starts near the Stanford campus, goes north on Interstate 280, and returns.
The area is known as Portola Valley; come see for yourself. You're gonna love it...

A couple of years ago I was in the middle lane and spotted some really tiny, brightly colored cars in the slow lane, coming up fast...
Three Lamborghini Miuras, followed by a Pantera, which looked tall in comparison.
Have mercy!
 
What I want to know, since the government seems to be behind this: when you're not driving and this new idea really takes off, who's going to be responsible for the accidents? The government? The car manufacturer? The autopilot company? Why would we even need insurance anymore? If the car alerts you that it's stolen, will it drive the occupants directly to the closest police department?
 
Originally Posted by JohnnyJohnson
What I want to know, since the government seems to be behind this: when you're not driving and this new idea really takes off, who's going to be responsible for the accidents? The government? The car manufacturer? The autopilot company? Why would we even need insurance anymore? If the car alerts you that it's stolen, will it drive the occupants directly to the closest police department?

I can tell you, the driver (probably the owner) is responsible.
I like the stolen-car idea. It's kinda like Sentry Mode. Model 3s were being broken into at an alarming rate, with thieves looking for laptops and whatever. The programmers authored "Sentry Mode," which turned on the cameras and put a huge warning message on the touchscreen, and started filming if anyone got too close or touched the car without the key card or phone present. It was delivered in an overnight software release.

By the way, it is really hard to steal one because there is no key, and the car does alert you as to where it is.
You have to have the card or the app on your cell phone.
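Tesla hasn't published Sentry Mode's internals, but the behavior described above maps naturally onto a small state machine. Here is a minimal sketch; the states, distance thresholds, and the sentry_state helper are hypothetical illustrations, not Tesla's actual code:

```python
from enum import Enum, auto

class SentryState(Enum):
    IDLE = auto()       # standby: cameras watching, nothing saved
    ALERT = auto()      # someone lingering nearby: warn on the touchscreen
    RECORDING = auto()  # contact or very close approach: keep the footage

def sentry_state(distance_m, touched, authenticated):
    """Pick a sentry state from simple sensor inputs.

    distance_m:    closest detected person in meters (None if nobody nearby)
    touched:       True if the car body registered contact
    authenticated: True if a paired key card or phone is present
    """
    if authenticated:
        return SentryState.IDLE          # owner nearby: stand down
    if touched or (distance_m is not None and distance_m < 0.5):
        return SentryState.RECORDING     # escalate and save camera footage
    if distance_m is not None and distance_m < 3.0:
        return SentryState.ALERT         # show the on-screen warning
    return SentryState.IDLE

# Example: a stranger walks up and leans on the car.
for dist, touch, auth in [(10.0, False, False), (2.0, False, False), (0.3, True, False)]:
    print(sentry_state(dist, touch, auth))
# SentryState.IDLE -> SentryState.ALERT -> SentryState.RECORDING
```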
 
No way should the "driver" be responsible in a "self-driving" car... why on earth should they?

They didn't programme or deploy the technology... the people who DID should bear the burden of responsibility, and probably the majority of it.

Under Oz law, if I grab the steering wheel while my wife is driving, ostensibly to avoid something that she failed to see, then I'm considered an equal contributor, or worse, in the event of a collision.

If an autonomous vehicle makes a bad lane change, or sees a string of trash as lane markings and drives into oncoming traffic... that's on the OEM and the programming team, not the passenger.

It has been well known in industry since the advent of automation that when you remove the operator from the process of operating the equipment, they can't be expected to respond to an incident: to become fully situationally aware of the current state and how it got there, and then respond appropriately. It's unrealistic, and completely unfair to the operator.

To put a statement in the manual that "You are fully responsible" is utterly irresponsible, and it won't survive lawsuits, where responsibility will ultimately be placed where it belongs... on the people who equipped the self-driving vehicle with a full (or lesser) complement of sensors, and who programmed it to see a woman and a bicycle as a trash bag blowing across the road.
 
Originally Posted by Shannow
No way should the "driver" be responsible in a "self-driving" car... why on earth should they?

They didn't programme or deploy the technology... the people who DID should bear the burden of responsibility, and probably the majority of it.

Under Oz law, if I grab the steering wheel while my wife is driving, ostensibly to avoid something that she failed to see, then I'm considered an equal contributor, or worse, in the event of a collision.

If an autonomous vehicle makes a bad lane change, or sees a string of trash as lane markings and drives into oncoming traffic... that's on the OEM and the programming team, not the passenger.

It has been well known in industry since the advent of automation that when you remove the operator from the process of operating the equipment, they can't be expected to respond to an incident: to become fully situationally aware of the current state and how it got there, and then respond appropriately. It's unrealistic, and completely unfair to the operator.

To put a statement in the manual that "You are fully responsible" is utterly irresponsible, and it won't survive lawsuits, where responsibility will ultimately be placed where it belongs... on the people who equipped the self-driving vehicle with a full (or lesser) complement of sensors, and who programmed it to see a woman and a bicycle as a trash bag blowing across the road.


This! But you see, it's just like the LSPI problem the industry has pushed onto its customers. No; in fact, all insurance should be moved to the manufacturers and the autopilot industry, along with the mandatory insurance currently carried by owners.

I'm personally not paying someone for a product that curtails my freedom.
 
Of course, the full answer is complicated. The vast majority of accidents today are caused by human error.
Self-driving cars are expected to reduce accidents and save lives.

If an accident is caused by an AP malfunction, then the automaker should be held responsible,
just as with a faulty brake design, etc.
The technology is new and evolving. Laws and lawsuits will evolve as well.

I can tell you, the first time you switch on AP it will blow your mind.
It is extremely scary and awesome at the same time.

A precedent has been set...
In May 2016 the first death in an autonomous car occurred, in Florida.
The driver set cruise to 70-plus mph.
A tractor-trailer cut across the Tesla's path, and the car did not slow because the cameras could not pick out the white trailer against the bright sky.
The NHTSA's investigation found no fault in the technology.
They found the driver responsible because he was not paying attention.
Apparently he was watching a Harry Potter flick. He was killed in the accident.
Good Lord... You can't fix stupid.

I have always said these cars are not for everyone. This is another example.

Where will this go? Who knows.
 
Originally Posted by Shannow


They didn't programme or deploy the technology... the people who DID should bear the burden of responsibility, and probably the majority of it.

In our car, you have to enable AP by pulling down the right stalk twice.
 
Not sure about the legal aspects, but if you mandate that the driver pay attention, why use AP at all? I would rather drive myself if I can't do something else while in AP mode. It's sad that people have paid with their lives as early adopters. To me, AP is worthless in its current form.
 
I didn't know the government was involved in autopiloting; that sounds like incorrect news. It also sounds like they weren't thorough if they didn't think of sensor blinding in that accident. I don't even drive with the "radio" on. I don't want a big screen in the middle that I have to look at sideways; I don't even like the small one in my car now. I want to see displays in front of me, not to the side.
I looked up Tesla Roadsters; yes, the ones I saw were the originals. In person they look even smaller than in the pictures.
I know you're a Tesla owner; these are just my opinions.
 
Originally Posted by mbacfp
Not sure about the legal aspects, but if you mandate that the driver pay attention, why use AP at all?


Indeed. We've seen in the aviation world that the "OK, the autopilot doesn't have to be perfect because we can just drop the problem in the pilot's lap when we don't know what to do" approach has killed hundreds of people when pilots failed to react properly, or couldn't do so for some reason. And those are trained crews who typically have several minutes to react and save the aircraft.

In a car, the driver is probably watching pr0n and may have seconds or less to react.
 
Originally Posted by JeffKeryk
A precedent has been set...
In May 2016 the first death in an autonomous car occurred, in Florida.
The driver set cruise to 70-plus mph.
A tractor-trailer cut across the Tesla's path, and the car did not slow because the cameras could not pick out the white trailer against the bright sky.
The NHTSA's investigation found no fault in the technology.
They found the driver responsible because he was not paying attention.
Apparently he was watching a Harry Potter flick. He was killed in the accident.
Good Lord... You can't fix stupid.

I have always said these cars are not for everyone. This is another example.

Where will this go? Who knows.


Legal precedents are set by non-technically-savvy people in the majority of cases... it will take time for the behavioural psychologists to get an understanding of the actual human mechanisms through the legal system.

You simply CANNOT take the driver out of the action, expect them to remain vigilant and aware, then hand them the twice-a-year crisis and expect them to figure it out in a few milliseconds.
 
Originally Posted by Shannow
Originally Posted by JeffKeryk
A precedent has been set...
In May 2016 the first death in an autonomous car occurred, in Florida.
The driver set cruise to 70-plus mph.
A tractor-trailer cut across the Tesla's path, and the car did not slow because the cameras could not pick out the white trailer against the bright sky.
The NHTSA's investigation found no fault in the technology.
They found the driver responsible because he was not paying attention.
Apparently he was watching a Harry Potter flick. He was killed in the accident.
Good Lord... You can't fix stupid.

I have always said these cars are not for everyone. This is another example.

Where will this go? Who knows.


Legal precedents are set by non-technically-savvy people in the majority of cases... it will take time for the behavioural psychologists to get an understanding of the actual human mechanisms through the legal system.

You simply CANNOT take the driver out of the action, expect them to remain vigilant and aware, then hand them the twice-a-year crisis and expect them to figure it out in a few milliseconds.



Let's get even more real about this: most drivers are going to be 100% asleep when the car notifies them that something is going wrong, and they are also going to be 100% unequipped to do anything about it.

When they wake up, most aren't going to know where they are or what's going on.
 
My fear is that AP will be mandated on all vehicles, just like every other electronic nanny. You won't have a choice whether you want it or not. Mark my words! What happens when cars get older, especially in the Rust Belt, and the AP screws up? And it will! I guess AP would be OK for the injured, the elderly, or even alcoholics who need to get home safely. I for one will NOT own a car with AP, not now, not EVER! Luddites unite!
 
I have a feeling that personal autonomous vehicles will not be scrutinized as heavily as automation in various industries. An industry faces lost revenue, customers, and investors when its automation fails, so it will do everything it can to make that automation work. Consumer-grade autonomous vehicles will be the next big thing for extracting money from the people buying them, as well as from those traveling in them, so that revenue cannot be jeopardized. I think lots of EULAs will be put in place to blame the occupants as much as possible.
And so far, the consumer is not worried at all about automation failing them, despite some serious accidents. So it's not as if these accidents will scare them away; they have full trust in it. And it seems the worse the driver, the more they trust the automation to save the day.
 
Anyone who trusts current AP technology 100% is an idiot.
Currently, Tesla warns owners that Autopilot is a driver-assistance tool, not a replacement, and that they retain responsibility for driving safely.

Many petabytes of additional data need to be stored and analyzed.

That's what Musk is doing. Tesla has something like 400K cars on the road.
It has captured billions of miles of data and continues to do so.

The vast majority of accidents are caused by driver error.
What number of accidents is acceptable? Tough question.
For me, AP has to be far better than human drivers; a rough way to compare is sketched below.
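One way to make "far better than drivers" concrete is to normalize crash counts to a common exposure, say crashes per million miles. A toy calculation; every number below is a made-up placeholder, not real fleet data:

```python
def crashes_per_million_miles(crashes, miles):
    """Normalize a raw crash count to a per-million-mile rate."""
    return crashes / (miles / 1_000_000)

# Placeholder inputs for illustration only, NOT real fleet data.
human_rate = crashes_per_million_miles(crashes=6_000_000, miles=3_200_000_000_000)
ap_rate = crashes_per_million_miles(crashes=500, miles=2_000_000_000)

print(f"human drivers: {human_rate:.2f} crashes per million miles")
print(f"AP engaged:    {ap_rate:.2f} crashes per million miles")
print(f"ratio: {human_rate / ap_rate:.1f}x")
```

Any honest comparison would also have to control for where AP is engaged (mostly divided highways, which are safer per mile than average driving), so a raw ratio like this overstates the difference.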

For now, anyone not paying full attention while using AP is an idiot.
Personally, I pay far more attention while on AP.
 
Every year here in the US, around 30,000 people (plus or minus a couple of thousand) are killed in accidents involving vehicles driven by humans.

Someday, vehicles with AP will reduce that yearly death toll to much less than one percent of that totally unacceptable number.
 
Originally Posted by JimPghPA
Every year here in the US, around 30,000 people (plus or minus a couple of thousand) are killed in accidents involving vehicles driven by humans.

Someday, vehicles with AP will reduce that yearly death toll to much less than one percent of that totally unacceptable number.


That remains to be seen. Tesla can get away with claiming its AP system is near 100% safe because it is not fully autonomous and it can always blame driver error, which it has done every single time.
 
Originally Posted by KrisZ
Originally Posted by JimPghPA
Every year here in the US, around 30,000 people (plus or minus a couple of thousand) are killed in accidents involving vehicles driven by humans.

Someday, vehicles with AP will reduce that yearly death toll to much less than one percent of that totally unacceptable number.


That remains to be seen. Tesla can get away with claiming its AP system is near 100% safe because it is not fully autonomous and it can always blame driver error, which it has done every single time.

Actually, the NHTSA determines who is at fault.
Tesla consistently works with and shares information with the NHTSA on Tesla-involved accidents.
Can you cite a case where Tesla disagrees with the NHTSA?
I would be interested, as we have a Model 3 with AP.

The vast majority of accidents are caused by human error.
Tesla warns drivers to keep control while using AP.

Here are results from a recent study:
Tesla AP vs Tesla vs National Average for accidents
 
Originally Posted by JeffKeryk
Originally Posted by KrisZ
Originally Posted by JimPghPA
Every year here in the US, around 30,000 people (plus or minus a couple of thousand) are killed in accidents involving vehicles driven by humans.

Someday, vehicles with AP will reduce that yearly death toll to much less than one percent of that totally unacceptable number.


That remains to be seen. Tesla can get away with claiming its AP system is near 100% safe because it is not fully autonomous and it can always blame driver error, which it has done every single time.

Actually, the NHTSA determines who is at fault.
Tesla consistently works with and shares information with the NHTSA on Tesla-involved accidents.
Can you cite a case where Tesla disagrees with the NHTSA?
I would be interested, as we have a Model 3 with AP.

The vast majority of accidents are caused by human error.
Tesla warns drivers to keep control while using AP.

Here are results from a recent study:
Tesla AP vs Tesla vs National Average for accidents



The NHTSA can't see past its own nose when it comes to automation. It won't consult other industries that have extensively studied automation implementation and human controls. So when Tesla presents it with data showing someone was not paying attention or was not holding the steering wheel, it will blame the driver automatically.

But what is not being addressed is the fact that poorly implemented automation leads operators to trust the system too much and eventually let their guard down. Why isn't the NHTSA addressing this?

Edit:
Don't think I'm criticizing these systems just because I feel like it or because I think they're unsafe. Far from it. I think they make a great supplement for scenarios that are boring to humans, such as stop-and-go traffic or poor conditions. But they have to be implemented in such a way that the driver still remains near 100% attentive; one way to enforce that is sketched below.
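A common way to enforce that attentiveness is an escalating nag timer, loosely modeled on the steering-wheel-torque checks Tesla uses. A minimal sketch; the interval, escalation steps, and the run_attention_monitor function are assumptions, not any OEM's actual logic:

```python
# Escalating attention checks, loosely modeled on hands-on-wheel nags.
# The interval and escalation steps are assumptions, not any OEM's real logic.
NAG_INTERVAL_S = 30        # demand hands-on input this often
ESCALATION = ["visual warning", "audible chime", "slow down and disengage"]

def run_attention_monitor(hands_on_events, horizon_s=120):
    """Simulate the nag loop over a fixed horizon (1 Hz ticks).

    hands_on_events: timestamps in seconds when the driver torqued the wheel.
    """
    last_hands_on = 0
    level = 0
    for t in range(horizon_s):
        if t in hands_on_events:
            last_hands_on, level = t, 0      # driver responded: reset the nag
        elif t - last_hands_on >= NAG_INTERVAL_S * (level + 1):
            action = ESCALATION[min(level, len(ESCALATION) - 1)]
            print(f"t={t:3d}s: {action}")
            level += 1

# Driver torques the wheel once at t=20s, then goes hands-off.
run_attention_monitor({20})   # warns at 50s, chimes at 80s, disengages at 110s
```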
 