An instrument panel inside a Tesla Model S P90D in September 2016 displays a warning that the driver-assistance system will be ‘unavailable for the rest of this drive.’
Long before the fatal crash of a Tesla car in March, some developers of the vehicle’s Autopilot system expressed concern there weren’t enough safeguards to ensure drivers remained attentive, people familiar with the discussions said.
Tesla Inc.’s engineers repeatedly discussed adding sensors that would ensure drivers look at the road or keep their hands on the wheel both before and after the driver-assistance system was introduced in 2015, these people said.
Tesla executives including Chief Executive Elon Musk rejected the ideas because of costs and concerns that the technology was ineffective or would annoy drivers with overly sensitive sensors that would beep too often, the people said.
“Everyone at Tesla is not only encouraged, but expected, to provide criticism and feedback to ensure that we’re creating the best, safest cars on the road,” a Tesla spokesman said in a statement. “This is especially true on the Autopilot team, where we make decisions based on what will improve safety and provide the best customer experience, not for any other reason.”
After this article was published Monday, Mr. Musk wrote on Twitter that the company rejected technology that would track drivers’ eyes because it was ineffective, not because of the cost.
Generations of auto makers have balanced adding new safety features, such as antilock brakes and backup cameras, with associated costs. Automated driving technology is developing rapidly, and auto makers are rolling it out at varying paces.
Automotive specialists and federal safety investigators have previously questioned whether Tesla has done enough to ensure safety with Autopilot, a hallmark of Tesla’s electric vehicles. Autopilot uses cameras, sensors and radar to control vehicle speed and steering under certain conditions, but doesn’t take over full control of driving. Tesla’s position is that cars with the Autopilot system are safer than cars without it.
The March 23 crash of a Tesla Model X sport-utility vehicle occurred south of San Francisco while Autopilot was activated, killing the driver. The crash is under investigation by U.S. transportation-safety agencies. Tesla said the driver received several system warnings to put his hands on the wheel and had at least five seconds to do so before the car slammed into a highway divider.
The company’s owner’s manuals emphasize that Autopilot has limitations, such as an inability to reliably detect stationary objects. Drivers must acknowledge on the car’s touch screen that it is their responsibility to stay alert and maintain control. Visual and audio warnings remind them to stay engaged.
Some self-driving car experts say partly autonomous driving systems like Autopilot give drivers a false sense of confidence that they can turn their attention elsewhere.
Mr. Musk alluded to the complacency issue in a May 2 call with analysts, while repeating Tesla’s view that Autopilot makes its cars safer than conventional automobiles. “When there is a serious accident, it is almost always, in fact, maybe always the case, that it is an experienced user,” Mr. Musk said. “And the issue is...more one of complacency, like we get too used to it.”
Such concerns have persisted within Tesla, people familiar with the Autopilot effort said, and intensified in 2016, after 40-year-old Joshua Brown died in Florida when his Model S sedan, with Autopilot activated, struck a semitrailer truck that was crossing a divided highway.
Tesla, worried that Autopilot gave drivers a false sense of security, brought in suppliers to discuss ways to ensure drivers pay attention, some of the people said.
One idea was sensors to track drivers’ eyes to ensure they watch the road. Tesla executives questioned the costs of such a system, which typically includes a camera and infrared sensor, and whether it would be ready for deployment, these people said. Another concern was whether the sensors could reliably detect drivers of varying heights.
Another measure the Autopilot team considered was incorporating sensors into the steering wheel to monitor whether drivers’ hands were touching it at all times, these people said.
Autopilot already has a sensor to capture small movements of the steering wheel to gauge whether drivers are holding it. But drivers can quickly touch the wheel to stop the dashboard warnings for a few minutes.
“It came down to cost, and Elon was confident we wouldn’t need it,” one of those people said. Executives conveyed there was pressure for each vehicle to reach a certain profit margin, according to the people familiar with the matter.
Tesla in 2016 was gearing up to launch the Model 3 with a starting price of $35,000, much lower than previous Tesla vehicles. Tesla targeted a 25% gross margin for the vehicle once it began production, and aims to improve that figure over time.
“We’ve explored many technologies and opted for the combination of a hands-on-wheel torsion sensor with visual and audio alerts, and we will of course continue to evaluate new technologies as we evolve the Tesla fleet over time,” the Tesla spokesman said Thursday.
Mr. Musk has argued Tesla shouldn’t delay deployment of Autopilot given its potential to save lives. In July 2016, about two months after Mr. Brown’s crash, Mr. Musk wrote it would be “morally reprehensible” to move slowly because of media scrutiny or legal liability.
In September 2016, Tesla wirelessly updated the Autopilot software in its cars. Among other improvements, it implemented a protocol that disables the system after warning a driver three times over several minutes without result.
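Based only on the behavior described above, the escalation works like a simple counter: each ignored warning increments it, a driver response resets it, and three ignored warnings in a row lock the system out for the rest of the drive. A minimal sketch of such a protocol, with all names and thresholds illustrative rather than drawn from Tesla's actual software:

```python
# Hypothetical sketch of the "three ignored warnings" lockout described
# in the article. Class name, method names, and the reset behavior are
# illustrative assumptions, not Tesla's implementation.

class AssistLockout:
    MAX_IGNORED_WARNINGS = 3  # threshold reported in the article

    def __init__(self):
        self.ignored = 0        # consecutive warnings with no driver response
        self.locked_out = False # once True, assist is off for the rest of the drive

    def warning_issued(self, driver_responded: bool) -> None:
        """Record the outcome of one hands-on-wheel warning."""
        if driver_responded:
            self.ignored = 0    # assumption: any response resets the count
        else:
            self.ignored += 1
            if self.ignored >= self.MAX_IGNORED_WARNINGS:
                self.locked_out = True  # "unavailable for the rest of this drive"

    def assist_available(self) -> bool:
        return not self.locked_out
```

In this sketch the lockout is never cleared within a drive, matching the dashboard warning quoted at the top of the article; whether Tesla's system resets the counter on a driver response is an assumption.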
Federal safety investigators in 2017 found that Mr. Brown put his hands on the wheel for a total of 25 seconds during the 37 minutes Autopilot was on, and received 13 warnings to keep his hands on the wheel. Autopilot never deactivated.
The National Transportation Safety Board said Autopilot lacked “an effective method of ensuring driver engagement” by allowing drivers to ignore warnings and keep their hands off the wheel for up to five minutes at a time. It also said Tesla’s steering-sensor system doesn’t ensure a driver is watching the road.
“Autopilot as implemented now allows too much leeway,” enabling drivers to keep their hands off the wheel for as long as two minutes before being warned, said Phil Magney, founder of automotive consultancy VSI Labs.
General Motors Co., concerned that drivers would misuse its Super Cruise semiautonomous system, delayed rolling it out for about a year, until its debut in the 2018 Cadillac CT6 sedan, people familiar with the effort said. That system includes eye-tracking technology.
Volkswagen AG’s Audi has said it delayed bringing out a hands-free system that relies on eye-tracking technology while it awaits regulations in the U.S. and Germany that would reduce the legal risk associated with its use.