NHTSA orders Tesla to provide data on ‘Elon mode’ for Autopilot

Tesla has received a special order from federal auto safety regulators requiring the company to provide detailed data on its driver assistance and driver monitoring systems, including a once-secret configuration for them known as “Elon mode.”
Generally, when a Tesla driver uses the company’s driver assistance systems – which are marketed as Autopilot, Full Self-Driving or FSD Beta options – a visual symbol flashes on the car’s touchscreen to prompt the driver to apply torque to the steering wheel. If the driver leaves the steering wheel unattended for too long, the “nag” escalates into a beeping sound. If the driver still does not take the wheel, the vehicle may disable the use of its advanced driver assistance features for the remainder of the trip or longer.
As CNBC previously reported, with the “Elon mode” setup enabled, Tesla can allow a driver to use the company’s Autopilot, FSD, or FSD Beta systems without the so-called “nag.”
The National Highway Traffic Safety Administration sent a letter and special order to Tesla on July 26, seeking details on the use of what apparently includes this special configuration, including how many cars and drivers Tesla has authorized to use it. The filing was added to the agency’s website on Tuesday, and Bloomberg first reported on it.
In the letter and special order, the agency’s acting chief counsel, John Donaldson, wrote:
“NHTSA is concerned about the safety impacts of recent changes to Tesla’s driver monitoring system. This concern is based on available information suggesting that it may be possible for vehicle owners to change Autopilot’s driver monitoring configurations to allow the driver to operate the vehicle in Autopilot for extended periods without Autopilot prompting the driver to apply torque to the steering wheel.”
Tesla was given until August 25 to provide all the information requested by the agency. The company responded on time, but it asked for, and NHTSA granted, confidential treatment of its response. Tesla did not immediately respond to CNBC’s request for comment.
Philip Koopman, an automotive safety researcher and associate professor of computer engineering at Carnegie Mellon University, told CNBC after the order was made public: “It seems NHTSA takes a dim view of cheat codes that deactivate safety features such as driver monitoring. I agree. Hidden features that degrade safety have no place in production software.”
Koopman also noted that NHTSA has yet to complete a series of investigations into crashes in which Tesla Autopilot systems were a possible contributing factor, including a series of fatal crashes involving trucks and collisions in which Tesla vehicles struck stationary first responder vehicles. Acting NHTSA Administrator Ann Carlson has suggested in recent press interviews that a conclusion is near.
For years, Tesla has told regulators, including NHTSA and California’s DMV, that its driver assistance systems, including FSD Beta, are only “Level 2” and do not make its cars self-driving, although they are marketed under brand names that can muddy the waters. Tesla CEO Elon Musk, who also owns and runs the social network X, formerly Twitter, often implies that Tesla vehicles are self-driving.
Over the weekend, Musk live-streamed a test drive on the social platform in a Tesla equipped with a still-in-development version of the company’s FSD software (v12). During the demonstration, Musk streamed from a mobile device he was holding while driving and chatting with his passenger, Ashok Elluswamy, Tesla’s head of Autopilot software engineering.
In the blurry video feed, Musk never showed the full detail of his touchscreen or demonstrated that he had his hands on the wheel, ready to take over the driving task at a moment’s notice. At times it was obvious that he had no hands on the yoke.
His use of Tesla’s systems would likely violate the company’s terms of use for Autopilot, FSD and FSD Beta, according to Greg Lindsay, an urban technology researcher at Cornell. He told CNBC the whole ride was like “waving a red flag at NHTSA.”
Tesla’s website warns drivers, in a section titled “Using Autopilot, Enhanced Autopilot and Full Self-Driving Capability,” that “it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car.”
Bruno Bowden, managing partner of Grep VC, a machine learning expert and investor in self-driving vehicle startup Wayve, said the demo showed Tesla is making some improvements to its technology but still has a long way to go before it can offer a safe, autonomous driving system.
During the ride, he observed, the Tesla system nearly ran a red light, requiring an intervention by Musk, who managed to brake in time to avoid danger.