Tesla is partly to blame for last year’s fatal crash, says safety agency

More than a year after the first fatal crash involving a Tesla in Autopilot mode, a safety agency has said the electric car company must shoulder some of the blame for the incident.

The crash occurred on 7 May 2016 in Williston, Florida, and led to the death of 40-year-old Joshua Brown. Tesla says the Model S’s sensors failed to pick up “the white side of the tractor trailer against a brightly lit sky,” so the car did not apply the brakes when the trailer drove across its path.

While it has come to light that the driver was misusing the self-driving system and not following Tesla’s guidelines, the independent safety agency has suggested the company should not have made the features so easy to misuse.

“The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened,” said Robert Sumwalt, chairman of the National Transportation Safety Board (NTSB), the federal agency that investigates vehicle crashes. He made the comments on Tuesday at a board meeting held to review the incident.

“Tesla allowed the driver to use the system outside of the environment for which it was designed, and the system gave far too much leeway to the driver to divert his attention to something other than driving,” Sumwalt said in his closing statement.

“The result was a collision that should not have happened. System safeguards were lacking.”

A report released in January, following an investigation by the National Highway Traffic Safety Administration (NHTSA), appeared to clear Tesla of blame.

“The driver took no braking, steering, or other actions to avoid the collision,” the report noted, pointing out that the driver interacted with the car just seven times in 37 minutes, his last recorded action being to set the cruise control to 74mph.

“NHTSA’s crash reconstruction indicates that the tractor trailer should have been visible to the Tesla driver for at least seven seconds prior to impact.” It said the company’s Autopilot-enabled vehicles did not need to be recalled.

But that report examined only the Automatic Emergency Braking (AEB) and Autopilot technology for potential flaws, not the company’s responsibility for the incident.

Tesla has always made it clear that Autopilot mode is not fully autonomous. Drivers are told they must maintain control of, and responsibility for, their vehicle while using it.

“We are inherently imperfect beings, and automated systems can help compensate for that,” says David Friedman, who once ran NHTSA. “In this case, though, there was a glaring human error and the system made no attempt to compensate for that other than to warn the driver.”

Since the incident, Tesla’s Autopilot system has been updated so that drivers can no longer ignore warnings to hold the steering wheel: if a driver is given three warnings and still does not respond, the car will slow to a stop.
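That description amounts to a simple escalation rule. A minimal sketch of such logic, purely illustrative and not Tesla’s actual implementation (the class names, method names, and the three-warning threshold are assumptions drawn from the description above), might look like this:

```python
from enum import Enum, auto

MAX_IGNORED_WARNINGS = 3  # assumed threshold, per the description above

class AutopilotState(Enum):
    ACTIVE = auto()
    SLOWING_TO_STOP = auto()

class HandsOnWheelMonitor:
    """Hypothetical escalation logic: three ignored hold-the-wheel
    warnings cause the car to slow to a stop."""

    def __init__(self):
        self.warnings_ignored = 0
        self.state = AutopilotState.ACTIVE

    def on_hands_detected(self):
        # Driver responded to a warning: reset the escalation counter.
        self.warnings_ignored = 0

    def on_warning_ignored(self):
        # Driver failed to respond to a hold-the-wheel warning.
        self.warnings_ignored += 1
        if self.warnings_ignored >= MAX_IGNORED_WARNINGS:
            self.state = AutopilotState.SLOWING_TO_STOP
```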

Since last October, every new Tesla has been built with new autonomous driving hardware as part of a major upgrade to the company’s Autopilot technology.