Tesla driver's family doesn't blame vehicle for his death

The "operational limitations" of Tesla's Autopilot system played "major role" in 2016 crash, said the National Transportation Safety Board on Tuesday. Although that system works like most other adaptive cruise control and lane keeping "Level 2" semi-autonomous driving systems offered by other OEMs, Tesla's Autopilot differs in that it allowed the driver to go much, much longer without interacting with the auto.

Those limitations include the system's inability to ensure driver attention at high speeds, to restrict Autopilot to the roads it was designed for, and to adequately monitor driver engagement, the NTSB said.

Tesla said it was reviewing the agency's recommendations, according to Reuters. Investigators found that the driver, Joshua Brown, had his hands on the steering wheel for only 25 seconds of the 37 minutes the vehicle was in Autopilot, and that the car's operating system warned him seven times to place his hands back on the wheel before it hit the truck.

The agency said the Autopilot system operated as designed but did not do enough to ensure drivers paid adequate attention, and it incorporated no protections against use on types of roads it wasn't designed for. The NTSB recommended that makers of semi-autonomous vehicles prevent use of the technology on roads where the vehicles aren't suited to travel without human control.

"We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times", Tesla said.

The NTSB also found that Tesla could have taken additional steps to prevent the system's misuse, and it faulted the driver of the Tesla as well.

The system could not reliably detect cross traffic and "did little to constrain the use of autopilot to roadways for which it was designed", the board said. Other recommendations centered on data collection and on designs for determining whether drivers are actually paying attention behind the wheel.

Monitoring driver attention by detecting the driver's hands on the steering wheel "was a poor surrogate for monitored driving engagement", the board said.

"The probability of having an accident is 50% lower if you have Autopilot on", said Musk at a 2016 energy conference in Oslo, Norway.

Brown's family defended his actions and Tesla in a statement released Monday.

"We heard numerous times that the vehicle killed our son", the family said. "That is simply not the case", said the statement from the family, breaking its silence on the crash.

As NHTSA found, the Automatic Emergency Braking feature on the Tesla, in common with the AEB systems fitted to nearly every other make of car, was not designed in a way that could have saved Brown's life. "People die every day in auto accidents", the family's statement added.

Brown, a 40-year-old Ohio man, was killed near Williston, Florida, when his Model S collided with a truck while operating in Autopilot mode.

Even though Brown's Model S warned him seven times during the 37 minutes before the crash that his hands weren't on the steering wheel, he was able to briefly touch the wheel and the system continued driving itself, according to the NTSB. NHTSA also found that crash rates for Tesla vehicles dropped by 40% after Autopilot was first installed.

While the NTSB praised Tesla for making improvements in its technology since the crash, it said that the system still gave drivers too much leeway to activate the automation in conditions where it might be unsafe.
