The National Highway Traffic Safety Administration (NHTSA) is nearing the end of its multiyear investigation into the safety of Tesla’s driver assistance systems. Reuters’ David Shepardson first reported the news on Thursday, citing remarks by NHTSA acting administrator Ann Carlson. CNBC subsequently confirmed the report with the federal vehicle safety regulator.
An NHTSA spokesperson declined to share further details but told CNBC by email, “We confirm the comments to Reuters,” adding that “NHTSA’s Tesla investigations remain open, and the agency generally does not comment on open investigations.” The agency opened its probe into the safety of Tesla’s driver assistance systems, now marketed in the U.S. as Autopilot, Full Self-Driving (FSD) and FSD Beta options, in 2021. The probe followed a string of crashes in which Tesla drivers, believed to be using the company’s driver assistance systems, struck stationary first responders’ vehicles.
Despite their names, Tesla’s driver assistance features do not make its cars autonomous. Unlike the self-driving vehicles operated by General Motors-owned Cruise or Alphabet’s Waymo, Teslas cannot function as robotaxis. They require a human driver behind the wheel at all times, ready to steer or brake when needed. Tesla’s standard Autopilot and premium Full Self-Driving systems control braking, steering and acceleration only in limited circumstances.
Elon Musk, Tesla’s CEO and owner of the social network X (formerly Twitter), often implies that Tesla vehicles are capable of driving themselves. For example, on July 23 a former Tesla employee who had led the company’s artificial intelligence software engineering posted on the platform about ChatGPT, describing how amazed his parents were when he first showed them the generative AI tool. Musk replied on X: “Similar reactions occur with Tesla FSD. I tend to overlook the fact that most people on Earth are unaware that cars can navigate on their own.”
In its owners’ manuals, Tesla gives explicit guidance to drivers using Autopilot or FSD: “Maintain your hands on the steering wheel at all times and remain attentive to road conditions, the presence of other vehicles, and fellow road users like pedestrians and cyclists. Continuously stay ready to promptly intervene. Disregarding these directives might result in harm, grave injury, or loss of life.”
Tesla vehicles include a driver-monitoring system that uses in-cabin cameras and steering wheel sensors to gauge whether the driver is paying sufficient attention to the road and the task of driving. When it judges the driver inattentive, the system chimes and displays a message on the car’s touchscreen urging the driver to pay attention and keep their hands on the wheel. But it remains unclear whether the system is robust enough to ensure Tesla’s driver assistance features are used safely.
Tesla has previously conducted voluntary recalls over other Autopilot and FSD Beta issues, promising to fix them with over-the-air software updates. In July, however, the regulator ordered Musk’s automaker to hand over more extensive data on how its driver assistance systems perform, for evaluation as part of the ongoing Autopilot safety investigations.
NHTSA regularly publishes data on U.S. car crashes involving advanced driver assistance systems such as Tesla’s Autopilot, Full Self-Driving or FSD Beta, which are classified as “level 2” under industry standards set by SAE International. The latest data from that crash report shows at least 26 incidents between August 1, 2019, and mid-July of this year in which Tesla vehicles equipped with level 2 systems were involved in fatal collisions. In 23 of those incidents, according to the agency’s report, Tesla’s driver assistance features were in use within 30 seconds of the crash; in the remaining three, it is unknown whether the features were engaged.