Feds investigate why a Tesla crashed into a truck Friday, killing driver

Would you trust a $40k auto-piloting car to drive you around?

The National Transportation Safety Board is dispatching a team of investigators to Delray Beach, Fla., to investigate a Friday morning crash that killed the driver of a Tesla Model 3. The National Highway Traffic Safety Administration is also investigating the crash.

According to the preliminary report of the Palm Beach County Sheriff’s office obtained by Electrek, a semi truck was making a left turn onto SR 7 when the Model 3 crashed into it from the side. The Model 3 passed under the trailer, shearing off the top of the vehicle. The driver—who was identified by NBC News as 50-year-old Jeremy Beren Banner—died at the scene of the crash.


The sheriff’s report indicates that the truck driver stopped at a stop sign before initiating his left-hand turn. The Tesla vehicle traveled for an additional 0.3 miles (around 500 meters) before coming to a stop.

It’s not known whether Autopilot was active at the time of the crash. A Tesla spokesperson declined to comment.

Feds have other open Tesla-related investigations

The NTSB is looking into a January 2018 crash in which a Tesla Model S crashed into a parked fire truck while Autopilot was engaged. Another NTSB investigation is probing an August 2017 crash where a Tesla vehicle burst into flames after running into a garage in Lake Forest, Calif. NHTSA, meanwhile, is investigating a May 2018 crash involving a Tesla Model S in Salt Lake City, Utah. No one was killed in these crashes.

Another NTSB investigation is looking into the March 2018 death of Tesla owner Walter Huang. Autopilot was engaged when his vehicle steered into a concrete freeway lane divider.

Yesterday’s crash is similar to the 2016 crash that killed Florida Tesla owner Joshua Brown—the first Autopilot-related fatality.

In that incident, a truck made a left-hand turn in front of Brown’s vehicle. Brown had Autopilot engaged, but the software failed to recognize the side of the white truck trailer against the bright daytime sky. The Tesla Model S crashed into the side of the trailer at full speed. As in yesterday’s incident, the 2016 crash sheared off the top of Brown’s car and killed the driver almost instantly.

The NTSB ultimately concluded that Tesla bore some of the blame for that crash. NHTSA’s findings in the crash were more favorable to Tesla, but a key statistic from that report has since been discredited.

Of course they are investigating. Semis are regulated by the feds, as are experimental/prototype cars, under the guise of “Interstate Commerce.”

Is this the part where we try to hold advanced tech cars to a perfect standard and ignore the fact that human drivers get into accidents every day?


This is the part where Tesla’s governmental subsidy should be cut, because they are getting fat and lazy on free money without a real sense of competition. That is about to change, fast.

It shouldn’t be called “autopilot,” or better yet, it shouldn’t be in the car at all. It’s nothing but untested alpha tech marketed as a “cool” feature for hipsters.

Good lord people, we don’t even know if the auto-pilot was engaged.

Let them investigate and publish the findings.

I’m no fan of Tesla or handing control of an automobile over to a computer taking the human out of the equation but damn, let it play out.

Meh, they probably shouldn’t be calling it Autopilot, but I have no problems with adaptive cruise control and lane centering. And looking at the map, it looks to me like the truck was at fault here. Last time I checked, pulling out in front of someone usually puts you at fault; unless the Tesla ran a stop light or stop sign, he had the right of way.

It doesn’t take the human out of the equation; there is a human behind the wheel, and he’s supposed to be paying attention.


The whole premise is that the computer is in control, so yes, the human is taken out of the equation. What they are “supposed to be doing” is irrelevant; the car is driving itself.

Baloney. Is your car driving itself when you use cruise control? Mine isn’t; if yours is, you don’t know how to drive.


The human shouldn’t be snoozing. Autopilot can be disengaged with a turn of the wheel, a twist of the side stalk, or manual braking (I’ve been in a friend’s Tesla while it’s auto-piloting, yes). The reason that Autopilot on the freeway is acceptable, while it is not when driving on local streets, is that Autopilot can’t recognize traffic lights yet.

Comparing cruise control to self-driving technology is ridiculous. All cruise control does is maintain speed; all other functions, including maintaining directional control and obstacle avoidance, are on the driver. A self-driving car is supposed to do everything a human driver normally does, except better.

The Model 3, the car mentioned in the article, is not a self-driving car; it requires a human behind the wheel. It basically just has adaptive cruise control and lane centering, neither of which constitutes self-driving. I am confident the manual states in no uncertain terms that a driver is required to be behind the wheel, with his hands on the wheel and alert. Oh, here is the manual.

from Traffic-Aware Cruise Control - Tesla Model 3 Owner's Manual [Page 58] | ManualsLib

Traffic-Aware Cruise Control does not eliminate the need to watch the road in front of you and to manually apply the brakes when needed.

Warning: Traffic-Aware Cruise Control is designed for your driving comfort and convenience and is not a collision warning or avoidance system. It is your responsibility to stay alert, drive safely, and be in control of the vehicle at all times. Never depend on Traffic-Aware Cruise Control to adequately slow down Model 3. Always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death.

Warning: Although Traffic-Aware Cruise Control is capable of detecting pedestrians and cyclists, never depend on Traffic-Aware Cruise Control to adequately slow Model 3 down for them. Always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death.

In short they are driver assist features, not driver replacement features. No different than how people have been using cruise control for decades really.


Thank you. It would have been helpful if the OP had made that distinction.

“Always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death.” seems pretty clear to me. Case closed.

“Because we can, we will.”

Such thinking has its merits among the developers of such cars. I’m all for it.

Using such cars on public roads however? Different story. Lots of pitfalls … and deaths.

Before doing that, one would have to do the math: the number of fatal crashes involving Autopilot divided by the number of cars USING (not just having, but actually USING) Autopilot (which might be a hard number to get), compared to all other fatal crashes divided by all other cars.

Not sure how that comparison would go, but the denominator in the first term is relatively small and the denominator in the second is huge.
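
For what it’s worth, here is a minimal sketch of that comparison in Python. Every number in it is a made-up placeholder (the real usage and crash counts would have to come from NHTSA/NTSB data and Tesla), so it only shows the shape of the calculation, not an actual result.

```python
# Rough sketch of the per-car fatal-crash rate comparison described above.
# ALL values are hypothetical placeholders, NOT real crash or fleet data.

autopilot_fatal_crashes = 5               # hypothetical: fatal crashes with Autopilot engaged
cars_actively_using_autopilot = 200_000   # hypothetical: the "hard number to get"

other_fatal_crashes = 37_000              # hypothetical: all other fatal crashes
all_other_cars = 270_000_000              # hypothetical: all other vehicles on the road

autopilot_rate = autopilot_fatal_crashes / cars_actively_using_autopilot
baseline_rate = other_fatal_crashes / all_other_cars

print(f"Autopilot fatal-crash rate:     {autopilot_rate:.2e} per car")
print(f"Baseline fatal-crash rate:      {baseline_rate:.2e} per car")
print(f"Autopilot rate / baseline rate: {autopilot_rate / baseline_rate:.2f}")
```

With such a small denominator in the first term, the ratio swings wildly with each additional crash, which is exactly why the comparison is hard to make fairly.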

I don’t see the difference between crashing because you are not paying attention while driving your Tesla 3 with the traffic-aware cruise control system on vs. crashing because you were not paying attention while texting on your phone.

Like I said before, though, they probably shouldn’t be permitted to call their driver assist features “autopilot,” out of deference to people too dumb to realize you still need to control the vehicle you are going to be held criminally and civilly responsible for in the event you get into a crash. When you can legally sit in the back seat and the car company is legally responsible for the crash, then you can call it autopilot.

Yes it is. To err is human, not robotic.

There are plenty of processes that can be automated before the highway system.

So, if statistically computer controlled cars could halve traffic deaths in the US, we shouldn’t adopt them because they aren’t flawless, just twice as safe as human drivers? That doesn’t seem at all logical to me.