US carmaker Tesla has been slammed by the US road safety board after it confirmed details of a fatal crash involving one of its vehicles last week.
Walter (Wei) Huang, the driver of a Tesla Model X SUV, was killed on March 23 when his car hit a concrete barrier on Highway 101, which connects San Francisco with Silicon Valley. He was reportedly on his way to work at Apple.
The company has announced that the car was in autonomous mode, using Tesla’s Autopilot technology, when it hit the barrier, and that Huang’s hands were not on the wheel – as they should have been – when the accident happened.
In a blog post on Tesla’s website, the company said:
“In the moments before the collision, which occurred at 9.27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum.
“The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 metres of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.
“The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.”
Tesla has been slammed by the US National Transportation Safety Board (NTSB) for releasing this information without alerting the agency beforehand, as it is required to do under a signed agreement with the NTSB.
The NTSB, which is still investigating the accident, said, “We take each unauthorised release seriously. However, this release will not hinder our investigation.”
Tesla has been swift to defend the safety record of its autonomous technologies after the incident, which saw the value of its shares plunge in a sell-off.
“Over a year ago, our first iteration of Autopilot was found by the US government to reduce crash rates by as much as 40 percent. Internal data confirms that recent updates to Autopilot have improved system reliability.
“In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.
“Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians, and cyclists.”
The fatality occurred just one week after a pedestrian was killed by an autonomous Uber vehicle in Tempe, Arizona. However, it is not the first death involving a Tesla vehicle running on Autopilot. Two years ago, a driver was killed when an autonomous Tesla Model S drove into the side of a truck. It was reported that the driver may have been watching a Harry Potter movie in the vehicle at the time of the accident.
In the past Tesla has been criticised for talking about the safety of its technologies after serious accidents or fatalities. It addressed this point in its blog post, saying: “In the past, when we have brought up statistical safety points, we have been criticised for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth.
“We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.”
Internet of Business says
This latest fatality puts US regulators in a difficult position. While Arizona authorities took Uber’s self-driving cars off the road after a pedestrian was killed by one during an autonomous test, this latest fatality involves a technology, Autopilot, that is already built into production models.
The accident reveals the core problem with driverless technologies at present: in the two most recent fatalities, the general thrust of the companies’ arguments has been to imply that the human drivers were at fault for not watching the road or not keeping their hands on the wheel – a paradoxical demand to make of technologies presented as autonomous.
Waymo CEO John Krafcik explained the distinction. “Tesla has driver-assist technology and that’s very different from our approach. If there’s an accident in a Tesla, the human in the driver’s seat is ultimately responsible for paying attention.”
Nevertheless, in both recent fatalities the technologies were driving the cars, regardless of whether the human drivers should have been paying more attention. This cannot be ignored.
Huang’s brother Will told ABC7 News that Walter had complained “seven to 10 times that the car would swivel toward that same exact barrier during Autopilot. Walter took it into [the] dealership addressing the issue, but they couldn’t duplicate it there.”
The core question, then, is simple: should the developers of a technology that is still in its infancy seek to blame human drivers for every death? Questions like this will become increasingly commonplace as AI and autonomous systems become more dominant in our lives, calling into question longstanding legal concepts, such as liability, and ethical concepts, such as responsibility.
The subtext, therefore, is all about trust: human drivers need to trust autonomous technologies, but doing so makes them focus on things other than the road. To suggest that human drivers should concentrate on the road and the wheel while their vehicles are in autonomous mode is tantamount to suggesting that they shouldn’t trust the technology.
As we move towards completely autonomous systems, including driverless trucks and road vehicles that are designed purely for passengers, the law urgently needs to catch up.