U.S. Regulators Open Federal Investigation Into Tesla’s Full Self-Driving System After Dozens of Reported Safety Incidents

The U.S. National Highway Traffic Safety Administration (NHTSA) has opened a sweeping federal investigation into potential safety defects in Tesla’s Full Self-Driving (FSD) system, also known as Full Self-Driving (Supervised), after a growing number of crashes and traffic violations allegedly linked to the technology.

According to the NHTSA’s Office of Defects Investigation (ODI), the probe follows 44 reported incidents in which Tesla vehicles using FSD were said to have run red lights, veered into oncoming traffic, or committed other unsafe maneuvers that led to collisions, some resulting in injuries. The agency’s preliminary evaluation, made public Thursday, will cover an estimated 2.88 million Tesla vehicles equipped with either FSD (Supervised) or FSD (Beta).

The NHTSA said it aims to determine whether Tesla’s system provided “adequate warning or sufficient time for the driver to respond to unexpected behavior” and whether its cameras and sensors are capable of correctly detecting lane markings, wrong-way signs, and traffic lights. The review will also assess how the FSD system communicates warnings to drivers and whether those alerts are sufficient for safe supervision.

Even with FSD engaged, Tesla requires a human driver to remain alert and ready to take control at any time — a stipulation that federal regulators say is essential to avoid fatal errors. However, drivers have repeatedly reported instances in which FSD appeared to misread intersections, misjudge other vehicles’ movements, or fail to recognize road hazards.

Tesla has not commented on the new probe, though the company released an updated FSD version 14.1 this week as part of its ongoing refinements to the semi-automated system, which remains central to CEO Elon Musk’s vision for the future of driving. Musk has long promised that Tesla’s vehicles would eventually operate as fully autonomous “robotaxis,” capable of generating revenue for owners while they sleep or travel.

That vision, however, remains elusive. Tesla has since informed customers that full autonomy will require both software and hardware upgrades, undercutting earlier claims that existing cars could achieve full self-driving through software updates alone.

The new investigation comes at a time when the industry remains divided over the best path to achieving safe autonomous driving — the use of cameras versus LiDAR (Light Detection and Ranging).

Tesla’s approach relies entirely on camera-based vision and neural networks, with Musk frequently dismissing LiDAR as “a fool’s errand” and “a crutch” for developers who, in his view, do not trust computer vision enough. He has argued that human drivers rely solely on visual perception, not laser sensors, so a camera-based AI should eventually be capable of outperforming people on the road.

However, many autonomous driving firms, including Waymo (Alphabet’s self-driving unit) and Cruise, continue to use LiDAR — a laser-based system that builds detailed 3D maps of the vehicle’s surroundings — in combination with radar and cameras. These companies argue that LiDAR provides superior depth perception, distance accuracy, and reliability in poor lighting or weather conditions, areas where camera-only systems have struggled.

Notably, while both technologies have faced challenges, LiDAR-based autonomous vehicles have been involved in far fewer reported safety incidents than Tesla’s camera-only models. Analysts say this reinforces the case for using multi-sensor systems until camera-based AI can consistently match human perception under all driving conditions.

Tesla’s latest regulatory trouble also coincides with continued budget cuts at NHTSA, part of a broader downsizing ordered by President Donald Trump in February to reduce the federal workforce. The cuts reportedly affected several key divisions, including the autonomous vehicle oversight unit, limiting the agency’s resources for comprehensive field investigations.

Despite that, the launch of this new FSD probe signals that regulators remain vigilant as Tesla continues to push the boundaries of driver-assistance technology. The findings could have significant implications for Musk’s long-promised robotaxi rollout and for the broader debate over whether full self-driving cars should rely on cameras, LiDAR, or a hybrid of both.

If the agency determines that FSD poses systemic safety risks, Tesla could face mandatory recalls, new operational restrictions, or further scrutiny of its marketing claims — especially its use of the term “Full Self-Driving,” which many regulators and safety advocates argue is misleading.

For now, the probe adds to growing questions over whether Tesla’s vision-only strategy can safely deliver on its promise of autonomy — or whether, in the race to eliminate human error, it has introduced new dangers of its own.
