Tesla slammed by safety board after latest autonomous fatality


US carmaker Tesla has been slammed by the US transport safety board after confirming details of a fatal crash involving one of its vehicles last week.

Walter (Wei) Huang, the driver of a Tesla Model X SUV, was killed on March 23 when his car hit a concrete barrier on Highway 101, which connects San Francisco with Silicon Valley. He was reportedly on his way to work at Apple.

The company has announced that the car was in autonomous mode, using Tesla’s Autopilot technology, when it hit the barrier, and that Huang’s hands were not on the wheel – as they should have been – when the accident happened.

In a blog post on Tesla’s website, the company said:

“In the moments before the collision, which occurred at 9.27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum.

“The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 metres of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

“The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.”
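A quick back-of-the-envelope check of the figures in that statement: covering roughly 150 metres in about five seconds implies the Model X was travelling at around 30 metres per second, or just over 100 km/h. A minimal sketch of the arithmetic, assuming constant speed over that interval:

```python
# Rough check of the speed implied by Tesla's figures: ~150 metres of
# unobstructed view of the divider, covered in ~5 seconds before impact.
# Assumes constant speed over the interval.

distance_m = 150.0  # metres, per Tesla's statement
time_s = 5.0        # seconds, per Tesla's statement

speed_ms = distance_m / time_s    # metres per second
speed_kmh = speed_ms * 3.6        # kilometres per hour
speed_mph = speed_kmh / 1.609344  # miles per hour

print(f"{speed_ms:.0f} m/s = {speed_kmh:.0f} km/h = {speed_mph:.0f} mph")
# Output: 30 m/s = 108 km/h = 67 mph
```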

Tesla has been slammed by the US National Transportation Safety Board (NTSB) for releasing this information without alerting the agency beforehand, as it is required to do under a signed agreement.

The NTSB, which is still investigating the accident, said, “We take each unauthorised release seriously. However, this release will not hinder our investigation.”

Increased safety

Tesla has been swift to defend the safety record of its autonomous technologies after the incident, which saw the value of its shares plunge in a sell-off.

“Over a year ago, our first iteration of Autopilot was found by the US government to reduce crash rates by as much as 40 percent. Internal data confirms that recent updates to Autopilot have improved system reliability.

“In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

“Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians, and cyclists.”
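The “3.7 times” figure follows directly from the two mileage statistics Tesla quotes: one fatality per 320 million miles in vehicles equipped with Autopilot hardware, against one per 86 million miles across all US vehicles. A minimal sketch of that calculation, using only the numbers quoted above:

```python
# Reproducing the ratio behind Tesla's "3.7 times less likely" claim,
# using only the two per-fatality mileage figures quoted above.

miles_per_fatality_all_us = 86e6      # one fatality per 86 million miles, all vehicles
miles_per_fatality_autopilot = 320e6  # one per 320 million miles, Autopilot hardware

ratio = miles_per_fatality_autopilot / miles_per_fatality_all_us
print(f"{ratio:.1f}x")  # Output: 3.7x
```

Note that the comparison is between vehicles merely equipped with Autopilot hardware and the entire US fleet, not between Autopilot-engaged miles and comparable vehicles, so the ratio reflects Tesla’s framing rather than a controlled comparison.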

The fatality occurred just one week after a pedestrian was killed by an autonomous Uber vehicle in Tempe, Arizona. However, it is not the first death involving a Tesla vehicle running on Autopilot. Two years ago, a driver was killed when an autonomous Tesla Model S drove into the side of a truck. It was reported that the driver may have been watching a Harry Potter movie in the vehicle at the time of the accident.

In the past Tesla has been criticised for talking about the safety of its technologies after serious accidents or fatalities. It addressed this point in its blog post, saying: “In the past, when we have brought up statistical safety points, we have been criticised for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth.

“We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.”

Read more: Uber: Self-driving cars ordered off road by US, sells to Grab

Read more: Toyota halts autonomous car tests after Uber accident

Internet of Business says

This latest fatality puts US regulators in a difficult position. While Arizona authorities took Uber’s self-driving cars off the road after a pedestrian was killed by one during an autonomous test, this latest fatality involves a technology, Autopilot, that is already built into production models.

The accident reveals the core problem with driverless technologies at present: in the two most recent fatalities, the general thrust of the argument has been that the human drivers were at fault for not looking at the road or not having their hands on the wheel – a logical absurdity for technologies described as autonomous.

Waymo CEO John Krafcik explained the distinction. “Tesla has driver-assist technology and that’s very different from our approach. If there’s an accident in a Tesla, the human in the driver’s seat is ultimately responsible for paying attention.”

Nevertheless, in both recent fatalities the technologies were driving the cars, regardless of whether the human drivers should have been paying more attention. This cannot be ignored.

Huang’s brother Will told ABC7 News that Walter had complained “seven to 10 times that the car would swivel toward that same exact barrier during Autopilot. Walter took it into [the] dealership addressing the issue, but they couldn’t duplicate it there.”

The core question, then, is simple: should the developers of a technology that is still in its infancy seek to blame human drivers for every death? Questions like this will become increasingly commonplace as AI and autonomous systems become more dominant in our lives, calling into question longstanding legal concepts, such as liability, and ethical concepts, such as responsibility.

The subtext, therefore, is all about trust: human drivers need to trust autonomous technologies, but doing so makes them focus on things other than the road. To suggest that human drivers should concentrate on the road and the wheel while their vehicles are in autonomous mode is tantamount to suggesting that they shouldn’t trust the technology.

As we move towards completely autonomous systems, including driverless trucks and road vehicles that are designed purely for passengers, the law urgently needs to catch up.

Read more: New Baidu, Jaguar Land Rover driverless cars take to the road

Read more: Waymo turns the ignition on self-driving trucks

Read more: Fetch launches world’s first autonomous AI smart ledger

Read more: Pure Storage, NVIDIA launch enterprise AI supercomputer in a box

Read more: AI regulation & ethics: How to build more human-focused AI

Read more: Cambridge Analytica vs Facebook: Why AI laws are inadequate

 


Tesla Model X in Autopilot Killed a Driver. Officials Aren’t Pleased With How Tesla Handled It.


Updated 3PM ET

Tesla is taking PR very seriously after one of its vehicles in autonomous mode recently killed its driver.

The crash occurred at 9:27 AM on Highway 101 near Mountain View, California. Walter Huang was in the driver’s seat of the Model X, which was in autonomous mode. The car hit a concrete highway divider, marked with black and yellow chevrons, at full force. Huang didn’t take any action. The SUV crumpled like a tin can, and Huang didn’t make it.

Other information has been hard to come by, due to the severity of the damage. So far we don’t know whether his death was the result of negligence, a fatal nap, or simply being distracted by the fireworks of warning lights and sounds. But one thing is clear: the crash shows that audio and visual cues on the dashboard can be insufficient to prevent a collision.

Huang wasn’t the first to die in a Tesla with Autopilot active. In 2016, Joshua Brown crashed his Model S into a truck, marking the first fatal collision while Autopilot was engaged.

The timing for this particular crash isn’t exactly ideal (from Tesla’s perspective). Uber is already doing damage control after its self-driving car killed a pedestrian in Arizona on March 19, four days before Huang’s fatal collision.

Interestingly, officials aren’t too pleased with Tesla’s PR offensive. On Sunday, a spokesperson for the U.S. National Transportation Safety Board (NTSB) told the Washington Post:

At this time the NTSB needs the assistance of Tesla to decode the data the vehicle recorded. In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data. However, the NTSB is unhappy with the release of investigative information by Tesla.

Presumably, investigators aren’t happy because they’d like to get as much information as they can, then release a report.

But Tesla might have jumped the gun. Failing to comply with the NTSB’s investigation processes and deadlines could bring its technological advancements (and safety improvements) screeching to a halt.

After the Uber car’s crash, the company was banned from further testing in Arizona (though other companies were allowed to continue). Many people feared that the crash would fray the public’s trust in autonomous vehicles, but that largely has not come to pass, at least not yet.

But if the crashes continue, that could change. The market for autonomous cars could dry up before the technology becomes reliable enough to make them widespread.

Tesla’s Autopilot is a Level 2 system, while Uber’s self-driving car operates at Level 4, so the technology isn’t really the same. Still, a turn in the tide of public opinion could sweep both up with it.
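For context, those levels refer to the SAE J3016 driving-automation scale. A rough summary, paraphrased from SAE’s definitions rather than quoted:

```python
# Abbreviated paraphrase of the SAE J3016 driving-automation levels
# referenced above; descriptions are summaries, not SAE's exact wording.

SAE_LEVELS = {
    0: "No automation: the human driver does everything.",
    1: "Driver assistance: the system assists with steering OR speed.",
    2: "Partial automation: steering AND speed are automated, but the human "
       "must supervise at all times (where Tesla's Autopilot sits).",
    3: "Conditional automation: the system drives; the human must take over "
       "on request.",
    4: "High automation: no human attention needed within a defined operating "
       "domain (the level Uber's test vehicles target).",
    5: "Full automation: capable of driving anywhere, with no human driver.",
}

for level, description in SAE_LEVELS.items():
    print(f"Level {level}: {description}")
```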

Autonomous vehicles aren’t the best at sharing the unpredictable road with imprecise humans. Yes, once fully autonomous vehicles roll out across the country and make up 100 percent of the vehicles on the road, American roads should become safer.

But we’re not there yet. If crashes like these keep happening, and the public loses trust, we might never be.

Update: Tesla CEO Elon Musk took to Twitter to respond to the NTSB’s comments and to reiterate Tesla’s priorities.


Tesla Driver Nearly Crashes Recreating Fatal Autopilot Accident


Last month, an Apple engineer died when his Tesla Model X crashed into a safety barrier while driving on Autopilot in Mountain View, California. Local authorities, the National Transportation Safety Board (NTSB), and Tesla are all investigating the fatal accident. But another Tesla owner in Chicago decided to try his own investigation of the Autopilot […]

Federal investigators ‘unhappy’ Tesla revealed crash details


The US National Transportation Safety Board (NTSB) said it is "unhappy" that Tesla released information about the fiery March 23rd crash that killed a driver. In a blog post last Friday, Elon Musk said that the Autopilot was active when the Model str…

Tesla puts Model 3 Autopilot controls on the steering wheel


Tesla has rectified one of the biggest Model 3 issues that cropped up during early reviews from Engadget and others. Until now, operating key vehicle functions like the Autopilot required tapping on the center display, effectively pulling the driver'…

Tesla: Autopilot was engaged in fatal Model X crash


After a fiery crash in Mountain View, CA last week killed the driver of a Tesla Model X, the company provided an update on the incident with a blog post. It did not name the driver, identified by ABC 7 News as Apple engineer and former EA programmer…

Tesla says Autopilot was engaged during fatal Model X crash


Tesla says Autopilot was engaged at the time of a deadly Model X crash that occurred March 23rd in Mountain View, California. The company posted a statement online late Friday, after local news reported that the victim had made several complaints to Tesla about the vehicle’s Autopilot technology prior to the crash in which he died.

After recovering the logs from the crash site, Tesla acknowledged that Autopilot was on, with the adaptive cruise control follow distance set to a minimum. The company also said that the driver, identified as Apple engineer Wei “Walter” Huang, had his hands off the steering wheel and was not responding to warnings to re-take control.

The driver had received several visual and one audible hands-on warning…


Tesla recall covers 123,000 pre-April 2016 Model S EVs


While Tesla's current-day Model 3 production tries to keep up with demand, the company is dealing with an issue affecting earlier cars. Today it announced a recall affecting every Model S built before April 2016. That adds up to about 123,000 vehicle…

Apple Engineer Who Died in Tesla Crash Complained About Its Software


An Apple engineer who died last Friday, when the new Tesla Model X he was driving crashed into a barrier, may have brought up concerns about the vehicle’s Autopilot functionality previously. Walter Huang, 38, died in the accident in Mountain View, California when he was on his way to work at Apple. The Tesla Model […]

Tesla issues its largest recall ever over faulty Model S steering


Tesla said Thursday it was recalling a huge number of its Model S sedans around the world over a power steering issue. It told customers in an email that it was a proactive move and none of the company’s other vehicles were affected.

The automaker said 123,000 Model S vehicles built before April 2016 were affected. No injuries or crashes have been reported in connection with the problem. Before today, its largest Model S recall came in 2015, when 90,000 vehicles were affected by a faulty seat belt. And last year, it recalled 53,000 Model S and Model X vehicles over a parking brake fault.

In the email, Tesla said it had “observed excessive corrosion in the power steering bolts,” but that the problem was most prevalent in colder climates…
