Tesla Says Autopilot Was Engaged in Fatal Model X Crash
After a fiery crash in Mountain View, Calif., last week killed the driver of a Tesla Model X, the company provided an update on the incident in a blog post. Tesla did not name the driver, identified by ABC 7 News as Apple engineer and former EA programmer Walter Huang, but confirmed that its Autopilot feature was engaged at the time of the crash.
"In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver's hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken."
The post reiterated a statement from Tesla's first update, noting that a crash attenuator, a safety barrier intended to mitigate the effect of a collision in that spot, was missing due to an earlier crash. While Tesla noted that the driver had time and an unobstructed view of the divider before the crash, it didn't say what action the automated systems took at the time. The driver's brother told a reporter that Huang had previously complained the car would veer toward that exact barrier and had reported the issue to the Tesla dealership, but that the dealership could not replicate it.
Tesla went on to cite government statistics showing that Autopilot reduced crash rates by 40 percent and that its vehicles suffered far fewer fatalities per mile than other cars.
"In the U.S., there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.
"Tesla Autopilot does not prevent all accidents -- such a standard would be impossible -- but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.
"No one knows about the accidents that didn't happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars."
The National Transportation Safety Board has already said it's investigating the crash, focusing on the fire and the steps needed to make the car safe for removal from the scene. Bloomberg reported that the agency is also looking into another crash earlier this year, in which a Model S ran into a fire truck parked on the highway and the driver claimed Autopilot was engaged at the time. In 2016 a driver blamed Autopilot for a crash on the Pennsylvania Turnpike, but the company said data showed the feature was not in use at the time. And last year the NTSB found fault with both driver and technology in the case of a fatal Florida crash.
Scrutiny of autonomous and driver-assist technology has intensified not only because of those incidents, but also because of the recent self-driving Uber crash in Arizona. Expect more questions about how this technology is tested and how well it hands control back to a human driver when a situation arises that the sensors aren't ready for.