Tesla AutoPilot runs over and kills motorcycle rider

Only the driver will be to blame, though. I wish people who praise this technology realized that, as the driver, they are still responsible. It's a driver's aid, and I don't understand why it's worth others' lives for them not to be alert and actively drive their vehicle.

A novelty like this should not be street legal.

I really don't get what's so hard about using any of the available driving aids (from any brand) responsibly, yet plenty of people seem to have trouble with them.

I'm a motorcycle rider and ride as though everything were out to kill me.
 
The specs seem to suggest it's quite a bit safer per mile driven.

Didn't this system just hit a billion miles?

I think only Mercedes and Tesla publish their stats; curious what the data really says.

 
Is there a point to you quoting this back to me? The article also says the car was in Autopilot mode, lurched forward, and hit the biker. In other words, he set it on Autopilot so he could screw around with his phone while driving. The tools that make this possible should be taken away from the general public, because they're obviously not capable of using them responsibly.
Possible solution: instead of Autopilot, how about driving through one's cell phone? Just throwing it out there.
 
> I really don't get what's so hard about using any of the available driving aids (from any brand) responsibly, yet plenty of people seem to have trouble with them.
>
> I'm a motorcycle rider and ride as though everything were out to kill me.
It’s human nature. It’s not possible to get around the fact that we’re slower to react when not actively engaged with the task at hand. Some are worse than others at this.
 
> It's human nature. It's not possible to get around the fact that we're slower to react when not actively engaged with the task at hand. Some are worse than others at this.
Curious... how did FSD get the driver to use his cell phone? I wonder what he was doing, because you can easily have a conversation over the Tesla system, right?

You can't fix stupid. Heck, do we even know if AP was on? There have been plenty of crashes blamed on it that turned out to be false. Regardless, you need to use a tool correctly and for its intended use.

And, AI is the future.
 
> Curious... how did FSD get the driver to use his cell phone? I wonder what he was doing, because you can easily have a conversation over the Tesla system, right?
>
> You can't fix stupid. Heck, do we even know if AP was on? There have been plenty of crashes blamed on it that turned out to be false. Regardless, you need to use a tool correctly and for its intended use.
>
> And, AI is the future.
Yes, that does make it worse, but even without other distractions it is harder to take over actively. Who knows if he was actually using the system; I'm still looking at it from the best-case scenario, assuming FSD was being used. My only point is that even while using the tool correctly, human nature ends up with a slower reaction when intervening than when actively driving.

This is a conversation that comes up regularly in my career. Some locomotives have a system like this called Trip Optimizer. The alerter goes off at set intervals when no inputs have been given to the system, much like the car prompting you to grab the wheel. In a locomotive this is a yellow button that must be hit to show that the engineer is alert and not sleeping. The problem is that because we're not actively operating the train, it's much easier to fall asleep when tired. We're not allowed distractions like cell phones. The system has no way to comply with signals and directives and is only programmed to the prescribed set speeds.

There have been a lot of studies on this. When you're not actively in control but intervention is required, the resulting intervention is usually delayed, and sometimes it's the wrong reaction, just because you're not actively operating and reacting as normal. It's easy to be lulled into a state of complacency when you're really not doing anything; it's just how the brain works. We can't really beat this with technology. It's especially disturbing because so many people do trust FSD, and you can't tell them otherwise.
 
This has been a big issue with the American Motorcycle Association.

I have loosely followed it since I get the magazine, but they are proactive in Washington on motorcycle-related issues, and they have a big problem with the ability of autonomous and self-driving vehicles to sense motorcycle riders.

In this case, they are charging the owner of the vehicle with homicide.


Here is another source, I’m not sure which is the latest news. FOXBusiness said he has been charged.

The only way autonomous driving cars work is by making the carmaker responsible for all at-fault damages when the car is driving autonomously. Not the owner, who is a passenger at that time, and not his insurance.

But as it is today, I'm sure the owner/passenger is responsible. I wonder what insurance companies say about autonomous driving when stuff like this happens.
 
> The specs seem to suggest it's quite a bit safer per mile driven.
>
> Didn't this system just hit a billion miles?
>
> I think only Mercedes and Tesla publish their stats; curious what the data really says.

I wouldn't trust a lot of "reports" from Tesla. From my understanding and digging, the accident rate per mile went up after removing the radar sensors, as Tesla's Autopilot software was designed for the radar to coexist with the cameras. Engineers told Musk that removing the radar was a stupid idea.

"It found that Tesla drivers are involved in more accidents than drivers of any other brand. Tesla drivers had 23.54 accidents per 1,000 drivers. Ram (22.76) and Subaru (20.90) were the only other brands with more than 20 accidents per 1,000 drivers." That was an extensive investigation by Forbes, published in December 2023, so fairly recent. This is the same company that is now being investigated for overstating the mileage on the Model Y and Model 3.
 
> I wouldn't trust a lot of "reports" from Tesla. From my understanding and digging, the accident rate per mile went up after removing the radar sensors, as Tesla's Autopilot software was designed for the radar to coexist with the cameras. Engineers told Musk that removing the radar was a stupid idea.
>
> "It found that Tesla drivers are involved in more accidents than drivers of any other brand. Tesla drivers had 23.54 accidents per 1,000 drivers. Ram (22.76) and Subaru (20.90) were the only other brands with more than 20 accidents per 1,000 drivers." That was an extensive investigation by Forbes, published in December 2023, so fairly recent. This is the same company that is now being investigated for overstating the mileage on the Model Y and Model 3.

I'm unsure. The NHTSA has/demands access to all the data Tesla publishes.

Forbes and you are quoting data from the LendingTree report, which wasn't based on actual in-car accidents but on people looking to buy insurance/loans using a "quote wizard."
 
> I'm unsure. The NHTSA has/demands access to all the data Tesla publishes.
>
> Forbes and you are quoting data from the LendingTree report, which wasn't based on actual in-car accidents but on people looking to buy insurance/loans using a "quote wizard."
This may help, if it is what you are referring to. I haven't followed all the pages in this thread. In this detailed NHTSA report, the first paragraph mentions 936 accidents. I still have to read it, but it says those 936 accidents happened while on Autopilot and that they believe they would have been preventable with an "attentive driver." Either way, over 20 people died in less than two years on Autopilot alone.
 
> This may help, if it is what you are referring to. I haven't followed all the pages in this thread. In this detailed NHTSA report, the first paragraph mentions 936 accidents. I still have to read it, but it says those 936 accidents happened while on Autopilot and that they believe they would have been preventable with an "attentive driver." Either way, over 20 people died in less than two years on Autopilot alone.
I'll check this out, thanks.

Being able to instantly supply telematics is really helpful but there are situations where that doesn't help determine causality.

Most fatal accidents are probably preventable with an attentive driver or are caused by an inattentive or distracted driver.

With a 5X improvement in crash reduction over a billion miles of use, the data would suggest more than 20 people were saved by the system.
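The arithmetic behind that "lives saved" claim can be sketched quickly. This is a rough back-of-the-envelope check under two assumptions that are mine, not established facts: that the roughly 20 Autopilot-linked deaths cited upthread are taken at face value, and that the claimed 5x crash reduction applies to fatalities as well.

```python
# Back-of-the-envelope check of the "more than 20 people were saved" claim.
# Assumptions (not verified): ~20 deaths with the system active, and the
# claimed 5x crash-rate reduction holding for fatalities too.

observed_deaths = 20      # deaths reported while the system was active
claimed_improvement = 5   # "5X improvement in crash reduction" (assumed)

# If the system really is 5x safer, the same miles driven unassisted
# would imply 5x as many deaths; the difference is the implied savings.
implied_baseline_deaths = observed_deaths * claimed_improvement
lives_saved = implied_baseline_deaths - observed_deaths

print(lives_saved)  # prints 80
```

Of course, the whole calculation stands or falls on whether the 5x figure is real and measured against comparable roads and drivers, which is exactly what the thread is arguing about.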

Should we eliminate all driver aids even if the data says they improve safety because they aren't 100% ?

Do we blame the driver or the car for fatalities? As long as there is a steering wheel and controls, it's up to the driver to use them correctly. Otherwise every story becomes a Ford crash, a GM crash, a Honda crash, a Tesla crash, instead of being about the person who should have been in control.
 