What Is Tesla’s Full Self-Driving System? Autopilot System Under Investigation In US

The U.S. government’s road safety agency is examining Tesla’s Full Self-Driving system following reports of accidents occurring in low-visibility conditions, including one incident that resulted in a pedestrian’s death.

The National Highway Traffic Safety Administration (NHTSA) stated in documents that it initiated the investigation on Thursday after Tesla disclosed four crashes linked to conditions like sun glare, fog, and airborne dust.

In addition to the fatality, the agency noted that another incident resulted in an injury.

2.4 million Tesla vehicles under investigation

Investigators will assess whether the Full Self-Driving system can adequately detect and respond to reduced visibility on roadways, as well as the factors that contributed to these crashes.

The investigation encompasses approximately 2.4 million Tesla vehicles manufactured between 2016 and 2024.

Tesla has consistently maintained that its system does not drive autonomously and that human drivers must always be prepared to intervene.

Recently, Tesla showcased a fully autonomous robotaxi without a steering wheel or pedals at an event at a Hollywood studio. Tesla CEO Elon Musk indicated that the company aims to have autonomous Models Y and 3 operating without human drivers by next year, with robotaxis lacking steering wheels set to launch in California and Texas in 2026.

Full Self-Driving under investigation

It remains uncertain how this investigation will affect Tesla’s self-driving goals. NHTSA approval would be necessary for any robotaxi that lacks pedals or a steering wheel, and it is unlikely that such approval would be granted during the ongoing investigation. If Tesla attempts to deploy autonomous features in its current models, that matter will likely fall under state regulations, as there are no specific federal regulations governing autonomous vehicles beyond general safety standards.

NHTSA also plans to investigate whether other similar incidents involving Full Self-Driving have occurred in low-visibility situations. The agency will request information from Tesla regarding whether any updates impacted the system’s performance in such conditions.

The documents indicated that the review would particularly focus on the timing, intent, and capabilities of any updates, along with Tesla’s evaluation of their safety implications.

Tesla has recalled its Full Self-Driving system twice

Tesla reported the four crashes to NHTSA in compliance with an order applicable to all automakers. According to an agency database, a pedestrian died in Rimrock, Arizona, in November 2023, after being struck by a 2021 Tesla Model Y. Rimrock is located about 100 miles north of Phoenix. Efforts were made to gather information about the incident from local and state agencies.

Under pressure from NHTSA, Tesla has recalled its Full Self-Driving system twice. In July, the agency sought information from law enforcement and Tesla following an incident where a vehicle using the system collided with and killed a motorcyclist near Seattle.

The recalls were prompted by programming that allowed the system to roll through stop signs at low speeds and violate other traffic laws. Both issues were scheduled to be addressed through online software updates.

Autopilot crash investigation

These recalls came after a three-year investigation into Tesla’s less advanced Autopilot system, during which Teslas using the system crashed into stationary emergency vehicles and other vehicles on highways, many of which had their warning lights flashing.

That investigation concluded in April, after NHTSA pressured Tesla to implement recalls to enhance a deficient system designed to ensure drivers were attentive. Shortly after the recall, NHTSA began investigating whether it effectively addressed the issue.

The Autopilot crash investigation was initiated in 2021, following reports that 11 Teslas using Autopilot had struck parked emergency vehicles. In its report on the investigation’s closure, NHTSA revealed that it identified 467 crashes linked to Autopilot, resulting in 54 injuries and 14 fatalities. While Autopilot is essentially a sophisticated version of cruise control, Musk has promoted “Full Self-Driving” as capable of operating without human intervention.

Evaluating the capabilities of Full Self-Driving

The investigation launched on Thursday represents a new approach for NHTSA, which had previously regarded Tesla’s systems as driver assistance rather than fully autonomous driving. This new inquiry focuses on evaluating the capabilities of “Full Self-Driving,” moving beyond simply ensuring driver attentiveness.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, commented that the previous Autopilot investigation did not address why Teslas failed to recognize and stop for emergency vehicles.

He noted that earlier assessments primarily placed responsibility on the driver rather than the vehicle itself. In this investigation, the agency is indicating that these systems may not be adequately equipped to detect safety hazards, regardless of driver attentiveness.
