Originally Posted by JeffKeryk
Originally Posted by KrisZ
Originally Posted by JimPghPA
Every year here in the US there are 30,000 people killed ( +/- a couple of thousand or so) in vehicle accidents with vehicles that are driven by humans.
Someday vehicles with AP will reduce that yearly death rate to much less than one percent of that totally unacceptable number of deaths.
That remains to be seen. Tesla can get away with claiming their AP system is near 100% safe because it is not fully autonomous and they can always blame driver error, which they have done every single time.
Actually, the NHTSA determines who is at fault.
Tesla consistently works with and shares information with NHTSA on Tesla involved accidents.
Can you cite a case where Tesla disagrees with the NHTSA?
I would be interested, as we have a Model 3 with AP.
The vast majority of accidents are caused by human error.
Tesla warns drivers to keep control while using AP.
Here are results from a recent study:
Tesla AP vs Tesla vs National Average for accidents
NHTSA can't see past its own nose when it comes to automation. It won't consult other industries that have extensively studied automation implementation and human controls. So when Tesla presents it with data showing someone did not pay attention or was not holding the steering wheel, it will automatically blame the driver.
But what is not being addressed is that poorly implemented automation leads operators to trust the system too much and eventually let their guard down. Why isn't NHTSA addressing this?
Edit:
Don't think I'm criticizing these systems just because I feel like it, or that I think they're unsafe. Far from it. I think they make a great supplement for scenarios that are boring to humans, such as stop-and-go traffic, or poor conditions. But they have to be implemented in such a way that the driver still remains near 100% attentive.