The National Transportation Safety Board, which is currently investigating last month’s fatal crash involving Tesla’s Autopilot system, has removed the electric automaker from the case after the company improperly disclosed details of the investigation.
Since nothing can ever be simple, Tesla Motors claims it left the investigation voluntarily. It also accused the NTSB of violating its own rules and of prioritizing headlines over promoting safety and allowing the brand to provide information to the public. Tesla said it plans to make an official complaint to Congress on the matter.
The fallout came after the automaker disclosed what the NTSB considered investigative information before it had been vetted and confirmed by the investigative team. On March 30th, Tesla issued a release stating the driver had received several visual and one audible hands-on warning before the accident. It also outlined factors it believed contributed to the severity of the crash and appeared to place blame on the vehicle’s operator. The NTSB claims any release of incomplete information risks promoting speculation and incorrect assumptions about the probable cause of a crash, doing a “disservice to the investigative process and the traveling public.”
While it’s understandable that an automaker would want to divert negative attention away from itself, the decision to disclose details about the crash led directly to the National Transportation Safety Board cutting ties with Tesla.
“It is unfortunate that Tesla, by its actions, did not abide by the party agreement,” said NTSB Chairman Robert Sumwalt on Thursday. “We decided to revoke Tesla’s party status and informed Mr. Musk in a phone call last evening and via letter today. While we understand the demand for information that parties face during an NTSB investigation, uncoordinated releases of incomplete information do not further transportation safety or serve the public interest.”
“There is nothing in the party agreement that prevents a company from enacting swift and effective measures to counter a threat to public safety,” Sumwalt continued. “We continue to encourage Tesla to take actions on the safety recommendations issued as a result of our investigation of the 2016 Williston, Florida, crash.”
The organization also released a letter addressed to Tesla CEO Elon Musk, explaining why it decided to revoke the automaker’s party status, and noted a prior phone call of a similar nature.
Tesla fired back, saying the NTSB doesn’t adhere to its own rules and telling CNBC that it withdrew from the NTSB investigation of its own accord:
“Last week, in a conversation with the NTSB, we were told that if we made additional statements before their 12-24 month investigative process is complete, we would no longer be a party to the investigation agreement. On Tuesday, we chose to withdraw from the agreement and issued a statement to correct misleading claims that had been made about Autopilot, claims which made it seem as though Autopilot creates safety problems when the opposite is true. In the US, there is one automotive fatality every 86 million miles across all vehicles. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident and this continues to improve.”
“It’s been clear in our conversations with the NTSB that they’re more concerned with press headlines than actually promoting safety. Among other things, they repeatedly released partial bits of incomplete information to the media in violation of their own rules, at the same time that they were trying to prevent us from telling all the facts. We don’t believe this is right and we will be making an official complaint to Congress. We will also be issuing a Freedom Of Information Act request to understand the reasoning behind their focus on the safest cars in America while they ignore the cars that are the least safe. Perhaps there is a sound rationale for this, but we cannot imagine what that could possibly be.”
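For what it’s worth, Tesla’s 3.7-times figure appears to come straight from the two rates quoted in its own statement: roughly 320 million miles per fatality for Autopilot-equipped vehicles divided by 86 million miles per fatality across all vehicles works out to about 3.7.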
The statement goes on to suggest the National Highway Traffic Safety Administration found that an earlier version of Tesla’s Autopilot resulted in 40 percent fewer crashes (which we could not confirm) and notes that the NHTSA, not the NTSB, is the presiding regulatory body for automobiles. This is technically true; the National Transportation Safety Board is an independent U.S. government investigative agency. However, it does make recommendations based on its findings and has served as an advisor to other regulatory groups in the past.
Regardless, with its agreement with the NTSB now broken, Tesla can say whatever it wants about the accident while the investigation continues in relative silence. The automaker clearly doesn’t want to come across as irresponsible in the aftermath of this tragic incident and has already mounted a soft defense of itself. We imagine the NTSB will reach a conclusion similar to what it found in the Florida Autopilot crash: that the vehicle’s operational limits played a major role, while emphasizing driver responsibility.
There might also be some discussion of how semi-autonomous features are being marketed to consumers. Right now, numerous manufacturers are mobilizing legal teams to address the building pressure to safely employ autonomous features and electronic driving aids. We’ve repeatedly mentioned how these features allow motorists to tune out and put undeserved trust in systems that simply aren’t ready to do all the driving. But the one-two punch of fatal crashes involving Uber and Tesla vehicles has brought the issue under the microscope for the rest of the nation.
It’s worth noting that Tesla is by no means the only manufacturer at risk here. Any company deploying advanced driving aids that allow the vehicle to maneuver itself can fall into the trap. Deciding who to blame in the event of a crash is another story. Ultimately, a vehicle’s operator is responsible for safety, but the existence of cars that can steer and stop themselves really complicates things. If a carmaker bills its technology as “able to drive itself” or even hints at it, it could be liable when things play out poorly.
On Friday, attorneys for the domestic arms of Volkswagen, Toyota, Hyundai, and Continental came forward to emphasize the importance of helping the public understand the limits of advanced driving aids.
“The OEMs right now are trying really hard to accurately describe what this equipment can do and can’t do,” said Tom Vanderford, associate general counsel at Hyundai, at an American Bar Association conference in Phoenix, Arizona.
We’re elated to hear automakers addressing these concerns, but we also wonder how much good it will do. Bolstering the public’s understanding of these systems and adding safety protocols that force human involvement are great steps. But they don’t address the growing concern that regular use of these aids probably degrades a person’s driving skills, and the problems that entails.
Back in 2016, longtime automotive analyst Maryann Keller suggested the automation of the automotive industry could mimic what happened in aviation. Fatal airplane crashes have declined dramatically since the 1970s, but pilots’ growing reliance on automated systems created entirely new problems. As a result, the Federal Aviation Administration released a 2013 Safety Alert for Operators warning that “continuous use of [autoflight] systems does not reinforce a pilot’s knowledge and skills in manual flight operations.” The alert went on to say that regular use of such systems could “lead to degradation of the pilot’s ability to quickly recover the aircraft from an undesired state.”
Keller claimed motorists would face similar troubles when driving aids fail or are incapable of mitigating certain situations (bad weather, poorly marked roads, system failures, etc.). She also said potential distractions are likely to increase as these systems become more popular. “Incorporating electronic interfaces within the car for phone calls, texting, or entertainment will begin to occupy more attention from drivers in all levels of automated vehicles, and already the dangers of such distraction are well known,” Keller said.
Things certainly seem to be heading in that direction. Numerous manufacturers are dabbling with in-car marketing, and center touchscreens now resemble smartphones in both form and function. For drivers who lack the self-control to use them responsibly, automotive multimedia poses a serious potential for distracted driving. Semi-autonomous driving systems only exacerbate the problem.
In the long run, our biggest gripes with vehicular autonomy will probably revolve around the lost art of driving and how automakers have warped cars into mobile computers. Safety will continue to improve as the technology is refined and Level 5 autonomy becomes a reality. Still, we could be in for a bumpy ride as everyone attempts to figure out how to drive, and how automakers should market, vehicles built around technology that erodes a driver’s skills but cannot yet surpass them.