Did Elon Musk Sacrifice Safety in His Push for Self-Driving Cars?
Tesla, Elon Musk’s electric car company, was founded on the idea that it represented the future of driving.
Autopilot is a set of features that can steer, brake, and accelerate the company’s sleek electric vehicles on highways. Mr. Musk frequently stated that true self-driving cars were on the horizon, that the day would come when a Tesla could drive itself, and that the technology would be delivered to drivers via software upgrades.
While working on self-driving cars, Mr. Musk claimed that autonomy could be attained merely via the use of cameras that tracked their surroundings. Many Tesla engineers, though, questioned whether relying only on cameras without the assistance of additional sensors was safe and whether Mr. Musk was promising drivers too much about Autopilot’s capabilities.
Tesla customers are accusing the company of misrepresenting Autopilot and a suite of sister services known as Full Self-Driving, or F.S.D., and families are suing Tesla over fatal collisions.
According to interviews with 19 people who worked on the project over the past decade, Mr. Musk, as its driving force, pushed it in directions other automakers were unwilling to go. Many of these workers say Mr. Musk routinely misled buyers about the services’ capabilities. Fearing reprisal from Mr. Musk and Tesla, all spoke on the condition of anonymity.
Over the course of several weeks, Mr. Musk and a top Tesla lawyer did not respond to multiple emailed requests for comment, including a detailed list of questions. Tesla has long maintained, however, that it is drivers’ responsibility to remain attentive and take control of their vehicles if Autopilot fails.
Since the outset of Tesla’s Autopilot development, there has been a conflict between safety and Mr. Musk’s ambition to advertise Tesla cars as technical wonders.
Mr. Musk has claimed for years that Tesla cars are on the threshold of total autonomy. In 2016, he stated, “The main news is that all Tesla vehicles leaving the factory have all of the hardware necessary for Level 5 autonomy.” Because the Society of Automotive Engineers defines Level 5 as “full driving automation,” the comment astonished and alarmed some members of the project team.
He has lately said that software will allow the cars to drive themselves on both city streets and highways. Tesla’s own documentation for Autopilot, however, states that drivers must keep their hands on the wheel and be ready to take control of the vehicle at any moment.
According to regulators, Tesla and Mr. Musk have prompted some customers to misuse the technology.
“Where I become worried is the terminology that’s used to characterize the vehicle’s capabilities,” said Jennifer Homendy, head of the National Transportation Safety Board, which has studied Autopilot-related crashes and criticized the system’s design. “It has the potential to be really dangerous.”
Hardware choices have also raised safety concerns. Some Tesla employees believed that cameras should be paired with radar and other sensors that would operate better in heavy rain, snow, intense sunlight, and other challenging conditions. For several years, Autopilot included radar, and Tesla even focused on building its own radar technology for a time. According to three people familiar with the project, Mr. Musk frequently told members of the Autopilot team that humans can drive with only two eyes, implying that cars should be able to drive with only cameras.
They claimed he regarded this as “returning to first principles,” a term Mr. Musk and others in the tech sector have long used to describe throwing out customary methods and starting again.
Mr. Musk said on Twitter in May that Tesla would no longer include radar in new vehicles. He said that the company had looked into the safety implications of not employing radar, but he did not elaborate.
Some have praised Mr. Musk, saying that a certain degree of sacrifice and risk was warranted in his quest to achieve mass manufacturing and, in turn, transform the car industry.
However, even Mr. Musk has recently expressed reservations about Tesla’s technology. After frequently touting Full Self-Driving as a system on the verge of full autonomy in speeches, interviews, and on social media, Mr. Musk called it “not fantastic” in August. “The crew working on it is rallying to improve as quickly as possible,” he added on Twitter.
According to three people acquainted with the project’s origins, Tesla began developing Autopilot more than seven years ago in order to fulfill new European safety rules that demanded technologies like automated braking.
The feature was initially dubbed “enhanced driving assistance,” but the company was soon looking for a new name. According to these three sources, executives led by Mr. Musk settled on “Autopilot” over the objections of some Tesla engineers, who called the term misleading and preferred “Copilot” and other options.
The term was inspired by aviation technologies that let planes fly autonomously under perfect conditions with minimal human intervention.
When Tesla introduced Autopilot in October 2014, it said the technology would automatically brake and keep the car in its lane, but that “the driver is still responsible for, and ultimately in control of, the car.” The announcement added that self-driving cars were “years away.”
Other businesses studying autonomous driving, such as Google, Toyota, and Nissan, were less upbeat in their public statements.
Joshua Brown, a Model S owner, was killed in Florida in May 2016, roughly six months after Mr. Musk’s comments appeared in Fortune, when Autopilot failed to detect a tractor-trailer crossing in front of him. His car was equipped with radar and a camera.
Mr. Musk met briefly with the Autopilot team to address the accident. According to two people who attended the meeting, he did not go into detail about what went wrong, but he told the staff that the company needed to work to ensure that its cars would not collide with anything.
Tesla later said that during the incident, Autopilot’s camera could not distinguish the white truck from the bright sky. Tesla has never publicly explained why the radar failed to prevent the crash.
Engineers who worked on the technology agree that these services remain far from the full autonomy Mr. Musk has promised in public statements. During an earnings call in January 2021, he said, “I’m quite optimistic the car will drive itself with the reliability of a human this year. This is an important milestone.”