The U.S. National Highway Traffic Safety Administration (NHTSA) has opened a preliminary investigation into about 2,000 self-driving vehicles operated by Waymo, Alphabet’s autonomous driving division, following reports that one of the company’s robotaxis failed to obey traffic laws near a stopped school bus in Georgia.
The probe, announced Monday, is the latest in a growing series of U.S. government reviews of the safety of driverless vehicles and how they interact with pedestrians, cyclists, and other road users. Regulators are increasingly concerned about how these systems respond in real-world scenarios involving vulnerable groups, particularly children.
According to NHTSA, the investigation stems from a recent media report featuring video footage of a Waymo vehicle maneuvering around a school bus with its red lights flashing and stop arm deployed as students were disembarking. The report said the autonomous vehicle initially came to a stop but then pulled forward, passing the bus’s extended stop arm before continuing down the road.
“Based on NHTSA’s engagement with Waymo on this incident and the accumulation of operational miles, the likelihood of other prior similar incidents is high,” the agency said in its filing.
Waymo’s automated driving system has logged more than 100 million miles in real-world testing as of July, and the company says its fleet is now adding roughly two million miles every week.
The vehicle involved in the Georgia incident was operating without a human safety driver and was equipped with Waymo’s fifth-generation Automated Driving System, according to NHTSA.
A Waymo spokesperson said in a statement that the company has “already developed and implemented improvements related to stopping for school buses and will land additional software updates in our next software release.” The spokesperson added that “driving safely around children has always been one of Waymo’s highest priorities,” and explained that in this particular case, the vehicle “approached the school bus from an angle where the flashing lights and stop sign were not visible and drove slowly around the front of the bus before driving past it, keeping a safe distance from children.”
Waymo’s autonomous fleet currently includes more than 1,500 vehicles operating in several major U.S. cities, including Phoenix, Los Angeles, San Francisco, and Austin. The company’s robotaxis are deployed for commercial ride-hailing services in some of those markets, where they operate without safety drivers.
This is not the first time the Mountain View–based company has faced scrutiny from U.S. regulators. In July, NHTSA closed a 14-month investigation into 22 reports involving Waymo vehicles that had exhibited “unexpected behavior” or collided with “clearly visible objects that a competent driver would be expected to avoid.” That earlier probe ended after Waymo conducted two recalls of its software to address issues identified by the agency.
The new investigation signals that federal regulators remain unconvinced that existing safeguards are sufficient to prevent potential hazards involving autonomous vehicles, especially around sensitive scenarios like school zones. NHTSA officials have said they are increasingly focused on how automated systems interpret traffic signals, recognize pedestrians, and respond to emergency vehicles.
Self-driving technology companies, including Waymo, Cruise, and Tesla, have all faced regulatory challenges as the industry races to commercialize fully autonomous operations. In October 2023, Cruise — General Motors’ driverless car division — suspended operations nationwide after one of its robotaxis struck and dragged a pedestrian in San Francisco. The incident led to California regulators revoking its operating permit, underscoring how fragile public trust in autonomous vehicles remains.
Waymo, however, has continued to expand its operations, touting a strong safety record and extensive real-world testing data. Company executives have repeatedly said that their vehicles are designed to outperform human drivers in terms of accident rates and response times. The company has also emphasized that it works closely with regulators to ensure compliance with safety standards.
The outcome of NHTSA’s new investigation could determine whether Waymo faces another round of recalls or operational restrictions. For now, the agency’s preliminary probe will assess whether the company’s vehicles consistently comply with traffic safety laws, particularly in situations involving school buses and pedestrian crossings.
While the Georgia incident did not result in injuries, safety analysts say the case raises serious concerns about how autonomous vehicles interpret nuanced road situations that rely on human judgment.