Elon Musk built his electric car company, Tesla, around the promise that it represented the future of driving — a phrase emblazoned on the automaker’s website.
Much of that promise was centered on Autopilot, a system of features that could steer, brake and accelerate the company’s sleek electric vehicles on highways. Over and over, Mr. Musk declared that truly autonomous driving was nearly at hand — the day when a Tesla could drive itself — and that the capability would be whisked to drivers over the air in software updates.
Unlike technologists at almost every other company working on self-driving vehicles, Mr. Musk insisted that autonomy could be achieved solely with cameras tracking a car's surroundings. But many Tesla engineers questioned whether it was safe enough to rely on cameras without the benefit of other sensing devices — and whether Mr. Musk was promising drivers too much about Autopilot's capabilities.
Now those questions are at the heart of an investigation by the National Highway Traffic Safety Administration after at least 12 accidents in which Teslas using Autopilot drove into parked fire trucks, police cars and other emergency vehicles, killing one person and injuring 17 others.
Families are suing Tesla over fatal crashes, and Tesla customers are suing the company for misrepresenting Autopilot and a set of sister services called Full Self Driving, or F.S.D.
As the guiding force behind Autopilot, Mr. Musk pushed it in directions other automakers were unwilling to take with this kind of technology, interviews with 19 people who worked on the project over the last decade show. Mr. Musk repeatedly misled buyers about the services' abilities, many of those people say. All spoke on the condition of anonymity, fearing retaliation from Mr. Musk and Tesla.
Mr. Musk and a top Tesla lawyer did not respond to multiple email requests for comment for this article over several weeks, including a detailed list of questions. But the company has consistently said that the onus is on drivers to stay alert and take control of their cars should Autopilot malfunction.
Since the start of Tesla’s work on Autopilot, there has been a tension between safety and Mr. Musk’s desire to market Tesla cars as technological marvels.
For years, Mr. Musk has said Tesla cars were on the verge of complete autonomy. “The basic news is that all Tesla vehicles leaving the factory have all the hardware necessary for Level 5 autonomy,” he declared in 2016. The statement surprised and concerned some working on the project, since the Society of Automotive Engineers defines Level 5 as full driving automation.
More recently, he has said that new software — currently part of a beta test by a limited number of Tesla owners who have bought the F.S.D. package — will allow cars to drive themselves on city streets as well as highways. But as with Autopilot, Tesla documentation says drivers must keep their hands on the wheel, ready to take control of the car at any time.
Regulators have warned that Tesla and Mr. Musk have exaggerated the sophistication of Autopilot, encouraging some people to misuse it.
“Where I get concerned is the language that’s used to describe the capabilities of the vehicle,” said Jennifer Homendy, chairwoman of the National Transportation Safety Board, which has investigated accidents involving Autopilot and criticized the system’s design. “It can be very dangerous.”
In addition, some people who have long worked on autonomous vehicles for other companies, as well as seven former members of the Autopilot team, have questioned Tesla's practice of constantly modifying Autopilot and F.S.D. through software updates pushed out to drivers, saying it can be hazardous because buyers are never quite sure what the system can and cannot do.
Hardware choices have also raised safety questions. Within Tesla, some argued for pairing cameras with radar and other sensors that worked better in heavy rain and snow, bright sunshine and other difficult conditions. For several years, Autopilot incorporated radar, and for a time Tesla worked on developing its own radar technology. But three people who worked on the project said Mr. Musk had repeatedly told members of the Autopilot team that humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone.
They said he saw this as “returning to first principles” — a term Mr. Musk and others in the technology industry have long used to refer to sweeping aside standard practices and rethinking problems from scratch. In May of this year, Mr. Musk said on Twitter that Tesla was no longer putting radar on new cars. He said the company had tested the safety implications of not using radar but provided no details.
Some people have applauded Mr. Musk, saying that a certain amount of compromise and risk was justified as he strove to reach mass production and ultimately change the automobile industry.
But recently, even Mr. Musk has expressed some doubts about Tesla’s technology. After repeatedly describing Full Self Driving in speeches, in interviews and on social media as a system on the verge of full autonomy, Mr. Musk in August called it “not great.” The team working on it, he said on Twitter, “is rallying to improve as fast as possible.”
Read More: https://www.nytimes.com/2021/12/06/technology/tesla-autopilot-elon-musk.html