
The U.S. National Transportation Safety Board has found that Tesla’s Autopilot driver-assistance system was one of the probable causes of a fatal 2018 crash into a concrete barrier. The NTSB shared the findings after a nearly two-year investigation.

NTSB investigators also named a number of contributing factors, including that the crash attenuator in front of the barrier was damaged and had not been repaired in a timely manner by California’s transportation department, Caltrans. NTSB investigators asserted that, had the crash attenuator been replaced, the driver, Walter Huang, would in all likelihood have survived.

In addition, the safety board’s investigation found that Huang, a 38-year-old Apple software engineer and game developer, was playing a mobile game while using Autopilot and made no attempt to stop his vehicle as it sped toward the crash barrier. The investigators also determined he was overly confident in Autopilot’s capabilities.

Huang’s Tesla Model X was in “Autopilot” mode and traveling at about 70 miles per hour when it crashed into a safety barrier and was struck by two other vehicles. He succumbed to his injuries a few days later.

“If you own a car with partial automation, you do not own a self-driving car. So don’t pretend that you do,” said the NTSB chairman, Robert Sumwalt. “This means that when driving in the supposed self-driving mode you can’t sleep. You can’t read a book. You can’t watch a movie or TV show. You can’t text. And you can’t play video games. Yet that’s precisely what we found that this driver was doing.”

“In this crash, we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures, which, when combined, led to this tragic loss,” NTSB chairman Robert Sumwalt said at the end of the hearing on Tuesday. “We urge Tesla to continue to work on improving their Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken where necessary. It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars.”

The NTSB shared its findings at the end of a three-hour-long hearing on Tuesday. During the hearing, board members blasted Tesla’s approach to mitigating the misuse of Autopilot, the National Highway Traffic Safety Administration’s hands-off approach to regulating partial automation technology, and Apple — Huang’s employer — for not having a robust driving policy. (Huang was playing the mobile game on a company-issued iPhone.)

Investigators recapped the crash during Tuesday’s hearing, presenting evidence that the NTSB made public last week. At the end of the hearing, and after a few hours of questions from the five members of the safety board, the team of investigators presented 23 findings, and made nine new safety recommendations, in addition to naming the probable causes.

The board also took aim at government regulators, who Sumwalt said have given “scant oversight” of the industry as it develops self-driving technologies. He noted that Tesla had yet to respond to safety recommendations sent to the company 881 days earlier, “and we have heard nothing.”

The NTSB has previously criticized Tesla’s Autopilot after a 2016 fatal crash in Florida and called on the company to make its driver-assist systems more resilient to misuse by inattentive drivers.

Tesla, in its defense, has claimed its internal data shows that drivers crash less frequently when using Autopilot than when driving manually.