Tesla Crash (Autopilot, Maybe) -- Automatic Braking?

Does anybody know if the Tesla Model S has autonomous braking that's active during both Autopilot and manual driving, the way many other cars have it?
The driver *claims* the Autopilot was active, but still, the car should have braked on its own, right?
One just hit a parked fire truck at 65 mph. The car held up pretty well for a 65 mph hit, BTW!
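For what it's worth, automatic emergency braking generally works off a time-to-collision (TTC) estimate from the radar/camera range and closing speed, staging warnings before hard braking. Here's a minimal sketch of the idea in Python; the function names and thresholds are illustrative guesses, not Tesla's actual logic:

Code:
def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact, assuming a constant closing speed."""
    if closing_speed_mps <= 0:  # not closing on the target
        return float("inf")
    return range_m / closing_speed_mps


def aeb_response(range_m: float, closing_speed_mps: float) -> str:
    """Staged response typical of AEB systems; thresholds vary by maker."""
    ttc = time_to_collision(range_m, closing_speed_mps)
    if ttc < 0.8:
        return "full braking"
    if ttc < 1.6:
        return "partial braking"
    if ttc < 2.6:
        return "forward collision warning"
    return "no action"


# 65 mph toward a parked truck is roughly a 29 m/s closing speed.
print(aeb_response(range_m=40.0, closing_speed_mps=29.0))  # -> partial braking

Even partial braking before impact should scrub off a lot of that 65 mph, which makes a full-speed hit look even worse for the sensors.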
 
Saw that article this morning. Impressive of the car to take the hit.

Not much of an autopilot if it can't detect a large red object with flashing lights on it, though...

Maybe the car fixated on the truck and steered into what it was staring at?
 
They do have this technology, but it's either the same as what other automakers provide or a somewhat enhanced version. My thinking is that Tesla should not call it an autopilot; it sends the wrong message to drivers. In the end, the driver is responsible for this accident and should be charged with inattentive or distracted driving if that is the case.
 
At this stage these autonomous driving features are just toys. But enhanced braking is not. Anyhow, there's no point speculating until the investigation is finished. It'd be interesting to know if the DOT or NHTSA are involved.
 
Autopilot, LOL. Have you ever seen a car manufacturer that never had recall issues?

Originally Posted By: Kestas
At this stage these autonomous driving features are just toys. But enhanced braking is not. Anyhow, there's no point speculating until the investigation is finished. It'd be interesting to know if the DOT or NHTSA are involved.

This junk shouldn't be on the road without the kind of validation aircraft have to go through.
 
Originally Posted By: JohnnyJohnson
This junk shouldn't be on the road without the kind of validation aircraft have to go through.

I agree 10000%

That's why self-driving trucks won't be seen on the road for the next 20 years.
 
Originally Posted By: JohnnyJohnson
This junk shouldn't be on the road without the kind of validation aircraft have to go through.

Exactly. Failing that, the NHTSA or whoever regulates this stuff should have told Tesla not to call it Autopilot. That name is deceptive.

My Mazda has that stuff too: LDW with intervention, autonomous braking, radar cruise control, etc., but the manual is explicit (as it should be) that the driver should not depend on these systems because there are too many variables.
 
Instead of rigorous testing of new products, the automakers are relying on field testing to gather data. All too often I see automakers rushing products to market without proper testing. Every manufacturer wants to be first and one-up the competition in this rat race. If something goes bad... oh well!

I wouldn't be surprised if they change the name from "autopilot." Remember that auto manufacturers came up with the name "supplemental restraint system" instead of "air bag" because they didn't want the public to relax on seat belt use.
 
So glad there isn't a firefighter missing a pair of legs because of this jerk.

This is what happens when morons with more money than common sense discover technology, and they're too stupid to handle it.
 
If these sensors are not behind the windshield like the Subaru's, what happens when the car gets dirty? Do the sensors display a warning message when they are obstructed? (See the sketch at the end of this post.)

Could this driver be acting as a field tester for still-developing technology?

Should cars using this immature technology be somehow marked so that others can stay away if possible?
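On the obstruction question: as I understand it, most of these systems do monitor their own signal quality and throw a "sensor blocked" warning when it degrades. A rough sketch of how that kind of gating might look; the names and the quality metric are hypothetical, not any manufacturer's actual code:

Code:
from dataclasses import dataclass

@dataclass
class SensorStatus:
    name: str
    signal_quality: float  # 0.0 = fully blocked .. 1.0 = clear (hypothetical metric)

def blocked_sensor_warnings(sensors: list[SensorStatus],
                            min_quality: float = 0.5) -> list[str]:
    """Warn about any sensor too degraded to trust; a real system would
    also disable the features that depend on it, not just warn."""
    return [f"{s.name} obstructed -- driver assist disabled"
            for s in sensors if s.signal_quality < min_quality]

print(blocked_sensor_warnings([SensorStatus("front radar", 0.2),
                               SensorStatus("windshield camera", 0.9)]))
# -> ['front radar obstructed -- driver assist disabled']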
 
In the semi incident, they blamed the miss on the Autopilot seeing under the trailer and therefore not detecting it.

This crash clearly refutes that excuse.
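One plausible explanation (an assumption on my part, not something Tesla has confirmed for this crash) is that radar-based cruise/braking systems deliberately de-emphasize stationary returns at highway speed, because overhead signs, bridges, and roadside clutter all look "stationary" and would otherwise cause constant phantom braking. The ugly side effect is that a parked fire truck can get filtered out the same way. Roughly, in illustrative Python:

Code:
def is_actionable_target(target_ground_speed_mps: float,
                         ego_speed_mps: float,
                         min_fraction: float = 0.1) -> bool:
    """Crude stationary-clutter filter: ignore returns whose ground speed
    is near zero, since overpasses and signs give the same signature."""
    return target_ground_speed_mps > min_fraction * ego_speed_mps

# A fire truck parked in the lane has ground speed 0, so it gets
# filtered out exactly like an overhead sign would:
print(is_actionable_target(target_ground_speed_mps=0.0, ego_speed_mps=29.0))  # False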

Tesla claims that it's safer than a human, but that's based only on the conditions where they allow it, which don't include twisty mountain roads (where, according to local signs, 9 out of 10 rural accidents occur on curves).

Tesla claims that the driver is still ultimately in control, but as can be seen in industrial automation, the more removed the driver is from regular operation, the slower he is to parse the alarm screens, identify the issue, and take the correct corrective action.

I second (or third, or fourth) the statement that this junk shouldn't be tested on public roads (Google cars or otherwise).

And it should be either full autopilot, or off.

And Tesla, or the engineer who wrote the algorithms, should be charged as appropriate for every accident they have played a part in. (Oz law is that if I see my wife heading for an accident and grab the wheel, I own 50% of the outcome)... it should be exactly the same here.
 
Person behind the wheel at fault.
The driver is always supposed to be observant and attending to the controls. How does one not see a fire truck and slow down?
Not a big fan of this technology, but once we start blaming the car for something a human is supposed to be doing, this type of occurrence will just turn into lotteries for the sue-happy.
Remember how the Audi 5000 (100 outside the U.S.) unintended-acceleration lawsuits got started? A mom ran over her kid and sued Audi; years later she admitted she stepped on the gas pedal. Suddenly various manufacturers were facing lawsuits over unintended acceleration. Manufacturers became the scapegoats for people's mistakes.
 
All new tech like this goes through a teething period. It's unfortunate that some people will die because we don't know what we're building until then, but in the long run it will be safer for the rest of us survivors.
 
Originally Posted By: 555
Person behind the wheel at fault.
The driver is always supposed to be observant and attending to the controls. How does one not see a fire truck and slow down?


There's a hundred years of experience in industrial automation that says your attitude towards the driver's responsibility is unfounded as soon as you start removing the operator from the process.

Re-read my previous post... when something else is doing most of the thinking, once alerted the operator takes MUCH longer to become aware of the problem, analyse it, and take effective action.

https://www.amazon.com/Managing-Risks-Organizational-Accidents-Reason/dp/1840141050

Good book for anyone to read before blaming the operator.

Remember the semi: Tesla blamed the open area under the trailer for their car not "seeing" it... there's no excuse for them in this case (until they come up with one).

Are Tesla using everything they could (e.g. LIDAR) before blaming the operator, who THEY say is fully in control?
 
Originally Posted By: Mr Nice
Originally Posted By: JohnnyJohnson
This junk shouldn't be on the road without the kind of validation aircraft have to go through.

I agree 10000%

That's why self-driving trucks won't be seen on the road for the next 20 years.


Agreed.... and Harrison Ford shouldn't be flying an airplane.
 
Originally Posted By: Mr Nice
Originally Posted By: JohnnyJohnson
This junk shouldn't be on the road without the kind of validation aircraft have to go through.

I agree 10000%

That's why self-driving trucks won't be seen on the road for the next 20 years.


I don't want to see the picture when an 80,000 lb semi rear-ends the fire truck. Or a school bus. Or anything, for that matter.
 
Imagine autonomous trucks in snowy conditions with a greedy owner who has had the safety measures hacked (algorithms tweaked) so his trucks will keep on keeping on when others have pulled off and gone to idle ...

Yeah, he'll get sued over an accident if someone is wise enough to grab the on-board CPU and stash it before the wreckers get there ... But the "owner" may declare bankruptcy, especially if he owns nothing and all is leased ...

We do not know where this is really going ... There is hardly a more cut-throat industry than general over-the-road trucking ...
 
Where was the driver during this? Hands are supposed to be on the wheel during autonomous driving; maybe one hand was holding a book or a newspaper? (There's a sketch of how the hands-on check works at the end of this post.)

Or he purposely let it crash to try for a lawsuit against Tesla.
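As I understand it, the hands-on check works off torque on the steering wheel, with a nag and eventually a disengagement if none is detected. A rough sketch of that kind of logic; the thresholds and names are my guesses, not Tesla's actual values:

Code:
def wheel_nag_state(torque_nm: float, seconds_since_torque: float) -> str:
    """Hands-on detection via steering torque; thresholds are guesses."""
    TORQUE_THRESHOLD_NM = 0.3   # a light grip shows up as a small torque
    WARN_AFTER_S = 30.0
    DISENGAGE_AFTER_S = 60.0

    if torque_nm >= TORQUE_THRESHOLD_NM:
        return "hands detected"
    if seconds_since_torque > DISENGAGE_AFTER_S:
        return "disengage with alarm"
    if seconds_since_torque > WARN_AFTER_S:
        return "visual/audible nag"
    return "monitoring"

# A hand holding a newspaper instead of the wheel puts no torque on it:
print(wheel_nag_state(torque_nm=0.0, seconds_since_torque=45.0))  # -> visual/audible nag

So a driver can ride the nags for a good while with hands nowhere near the wheel, which may be exactly what happened here.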
 