2017-01-20




Federal regulators spent the past six months investigating the role of Tesla Motors’ Autopilot feature in a fatal car crash. Their findings reinforce what millions of drivers already know: Despite much hype about “self-driving cars,” human beings remain responsible for understanding the capabilities and limitations of the vehicles they drive and accountable for their safe operation.


Officials from the National Highway Traffic Safety Administration (NHTSA) closed their probe of Tesla on January 19 without ordering a recall or taking any enforcement action. The examination found no faults within the Autopilot system and determined that it worked as intended during a May 7, 2016, crash that claimed the life of Joshua Brown, the first person killed in a crash attributable to a semi-autonomous feature.


“Not all systems can do all things,” NHTSA spokesman Bryan Thomas said Thursday.


Brown had engaged the Autopilot feature in his 2015 Tesla Model S as he traveled eastward along U.S. Highway 27 in Williston, Florida. Neither he nor the semi-autonomous system reacted when a tractor-trailer made a left turn across the car’s path. According to NHTSA’s summary of the investigation, the truck should have been visible to the driver for at least seven seconds before the fatal collision, enough time to notice that the car was not reacting to the hazard and to take evasive action.


In September, Tesla made changes to its Autopilot feature that emphasize the role of radar and cameras in object detection, improvements that CEO Elon Musk said he believed could have prevented Brown’s death by enabling the car to detect the obstacle ahead of it in time to avoid or mitigate the collision. Tesla’s changes also shortened the amount of time drivers may keep their hands off the steering wheel. Currently, drivers who ignore three prompts to return their hands to the wheel lose Autopilot functionality for the remainder of the journey.


But Thomas made clear that—even if Tesla had not upgraded its Autopilot system—no recall would have been ordered for the 43,781 Model S and Model X vehicles that contain Autopilot, because no defects had been found during the investigation.


Employees of NHTSA’s Office of Defects Investigation reviewed “dozens” of crashes involving Model S and Model X vehicles in which Autopilot was either engaged at the time of the collision or had been engaged within the preceding 15 seconds. Only two of those crashes resulted in a death or serious injury: the fatal Florida crash and a July 1, 2016, rollover on the Pennsylvania Turnpike in which two people were seriously hurt.


Even as NHTSA scrutinized the role of the technology in these crashes, the agency was quick to note the promise of improved safety offered by semi-autonomous systems. In reviewing data furnished by Tesla as part of the investigation, the agency found that vehicles equipped with Autosteer, a component of the Autopilot system, saw crash rates fall by 40 percent from their pre-installation levels: from 1.3 crashes per million miles of travel to 0.8 crashes per million miles.


“We appreciate the thoroughness of NHTSA’s report and its conclusion,” a Tesla spokesperson said in a written statement.


The Florida crash and its circumstances encapsulate many thorny issues that industry engineers and government regulators are grappling with both in the near term, as driver-assistance features spread across the nation’s fleet, and further down the road, during a transition toward more highly automated driving. Among those challenges: figuring out how motorists and machines exchange control while avoiding “mode confusion,” and ensuring that motorists understand how these systems work while safeguarding against misuse.


John Simpson, privacy director at Consumer Watchdog, a California-based nonprofit that advocates for consumer rights, says NHTSA’s findings place too much blame on the driver.


“NHTSA has wrongly accepted Tesla’s line and blamed the human, rather than the ‘Autopilot’ technology and Tesla’s aggressive marketing,” he said. “The very name ‘Autopilot’ creates the impression that a Tesla can drive itself. It can’t. Some people who apparently believed Tesla’s hype got killed.”


While Thursday’s findings lay the brunt of responsibility on the person behind the wheel, that does not mean automakers are off the hook.


NHTSA, which issued a Federal Automated Vehicles Policy in September, criticized Tesla and other companies for marketing slogans and brand names of features, such as Autopilot, that may misrepresent their systems’ capabilities. Under the agency’s definitions of autonomy, Autopilot is classified as a Level 2 technology, one in which an automated system can conduct some parts of the driving task while humans monitor the broader driving environment and perform remaining driving tasks.


It’s not enough, Thomas said, for automakers to describe a system’s operation in an owner’s manual; NHTSA’s investigation found Tesla’s manual was “not as specific as it could be.” Nor is it enough for automakers to assume drivers will use features as they’re intended. NHTSA says they must account for the ways customers could misuse semi-autonomous technology.


Broadly, NHTSA has taken a keen interest in the exchange of control between motorists and semi-autonomous systems.


In October, the agency sent a letter to Comma.ai stating that its aftermarket product would put “the safety of your customers and other road users at risk.” The company asserted that its Comma One, a device intended to make autonomous-driving features available on cars not originally equipped with such technology, did not remove any of the human driver’s responsibilities, but NHTSA called that warning “insufficient.” The company opted to stop offering the device. In November, the agency cautioned General Motors that its plan to allow the forthcoming Super Cruise feature to stop a vehicle in the middle of the roadway might present a danger to motorists and thus could be considered a safety defect.


For now, Tesla, the third company whose semi-autonomous system has attracted regulatory scrutiny, has passed muster. But NHTSA will continue to monitor the technology in general, paying particular attention to issues with handoffs of control at Level 2 and Level 3 autonomy.


“The department has been leaning forward on automated technologies because we believe they have great potential to save lives,” Thomas said. “At the same time, the department will aggressively oversee new technologies being put on the road.”


Later, Thomas said, “These are assistance systems that require the driver in the loop at all times . . . These are complicated issues, and we have ongoing research. We are interested in working with the industry in a collaborative way.”
