The U.S. National Transportation Safety Board has found that Tesla's Autopilot driver assistance system was one of the probable causes of a fatal 2018 crash into a concrete barrier. The NTSB shared the findings after a nearly two-year investigation.
NTSB investigators also named a number of contributing factors, including that the crash attenuator in front of the barrier was damaged and had not been repaired by California's transportation department, Caltrans, in a timely manner. Had the crash attenuator been replaced, NTSB investigators asserted that the driver, Walter Huang, would have, in all likelihood, survived.
In addition, the safety board investigation found that Walter Huang, a 38-year-old Apple software engineer and game developer, was playing a mobile game while using Autopilot and made no attempt to stop his vehicle as it sped towards the crash barrier. The investigators also determined he was overly confident in Autopilot's capabilities.
Huang's Tesla Model X was in "Autopilot" mode and traveling at about 70 miles per hour when it crashed into a safety barrier and was struck by two other vehicles. He succumbed to his injuries a few days later.
"If you own a car with partial automation, you do not own a self-driving car. So don't pretend that you do," said the NTSB chairman, Robert Sumwalt. "This means that when driving in the supposed self-driving mode you can't sleep. You can't read a book. You can't watch a movie or TV show. You can't text. And you can't play video games. Yet that's precisely what we found that this driver was doing."
"In this crash, we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures, which, when combined, led to this tragic loss," Sumwalt said at the end of the hearing on Tuesday. "We urge Tesla to continue to work on improving their Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken where necessary. It's time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars."
The NTSB shared its findings at the end of a three-hour-long hearing on Tuesday. During the hearing, board members blasted Tesla's approach to preventing the misuse of Autopilot, the National Highway Traffic Safety Administration's hands-off approach to regulating partial automation technology, and Apple, Huang's employer, for not having a robust distracted-driving policy. (Huang was playing the mobile game on a company-issued iPhone.)
Investigators recapped the crash during Tuesday's hearing, presenting evidence that the NTSB made public last week. At the end of the hearing, and after a few hours of questions from the five members of the safety board, the team of investigators presented 23 findings and made nine new safety recommendations, in addition to naming the probable causes.
Sumwalt also took aim at government regulators, who he said have given "scant oversight" of the industry as it develops self-driving technologies, and noted that Tesla had yet to respond to the board's safety recommendations, which he said had been sent to the company 881 days ago "and we have heard nothing."
The NTSB has previously criticized Teslaâs Autopilot after a 2016 fatal crash in Florida and called on the company to make its driver-assist systems more resilient to misuse by inattentive drivers.
Tesla, in its defense, has claimed its internal data shows that drivers crash less frequently while using Autopilot than while driving manually.