
Every driver knows the rule: when red lights flash and the stop arm extends, you stop. For Waymo’s robotaxis, that universal signal became a blind spot. Once hailed as “safer than human drivers,” Waymo’s fleet violated this fundamental law 26 times across two states, putting students at risk — including one terrifying instance “only moments after a student crossed.”
Now a nationwide software recall is underway, and a federal investigation has been launched. If AI can’t follow the most important rule of the road, is it truly ready to drive?
The Recall That Shocked an Industry

Waymo announced a nationwide software recall in early December after its autonomous vehicles illegally passed school buses 26 times. The incidents occurred in Austin (20) and Atlanta (6) during the 2025-2026 school year.
The update is over-the-air, keeping vehicles operational. Yet it raises a troubling question: how could such fundamental traffic laws be repeatedly ignored?
One Child, One Close Call

“As of December 1, 2025, Waymo received its 20th citation since the beginning of the school year. This is after the company said it had fixed the issue through software updates that were implemented on November 17,” said JJ Maldonado, Austin ISD Communications Specialist, on December 2.
One incident saw a Waymo car pass a bus just moments after a student crossed, leaving the child inches from disaster. How could safety systems allow such close calls?
The Scale of Waymo’s Operation

Waymo operates over 2,500 autonomous vehicles across San Francisco, Los Angeles, Austin, Phoenix, and Atlanta, logging approximately 2 million miles per week. NPR noted that cumulative miles surpassed 100 million as of July 2025.
With hundreds of thousands of rides weekly, the company is the world’s largest commercial AV operator. However, large-scale operations expose rare edge cases, including children crossing streets.
The “Safer Than Humans” Claim

Waymo reports 91% fewer crashes with serious injuries and 92% fewer with pedestrian injuries compared to humans, CBS News confirmed on December 5, 2025. These statistics underpinned claims that AVs are ready for mass deployment.
Yet, repeated school bus violations highlight a gap: millions of miles of safety do not guarantee compliance with core traffic laws that protect children.
A Pattern of Recurring Software Problems

The school bus recall is Waymo’s fourth software recall since February 2024, CNBC reported on May 14, 2025. Previous recalls involved 444 vehicles, 672 vehicles, and 1,212 vehicles for issues ranging from obstacle detection to gate failures.
The growing frequency and scale of recalls, from 444 vehicles to the entire 2,500-vehicle fleet, indicate systemic quality-assurance challenges. Could this pattern predict further road risks?
The Software Fix That Wasn’t

Waymo deployed a software update on November 17, 2025, claiming it resolved school bus detection issues, according to TechCrunch on December 4, 2025. The company promised performance “better than human drivers” without sharing supporting data.
Austin ISD documented five more violations within weeks. The problem persisted, suggesting a deeper root cause than the November software fix addressed.
How the Software Failed

Reuters reported on December 5 that the vehicles initially slowed or stopped for buses but then proceeded incorrectly. Sensors detected the buses and their flashing lights, yet the AI determined it was safe to continue.
This wasn’t blindness to school buses—it was a failure in decision-making. The vehicles recognized the hazard but misjudged when stopping requirements ended, a critical flaw.
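Waymo has not published its decision logic, but the reported failure mode can be sketched in purely illustrative pseudocode. Everything below is hypothetical: the observation fields, function names, and the timeout are assumptions used only to show how logic that stops correctly can still release the stop too early.

```python
from dataclasses import dataclass

@dataclass
class BusObservation:
    """Hypothetical perception output for a nearby school bus."""
    lights_flashing: bool
    stop_arm_extended: bool
    bus_visible: bool

def may_proceed_flawed(obs: BusObservation, stopped_duration_s: float) -> bool:
    """Illustrative flawed policy: stops for the bus at first, but a
    timeout overrides the still-active stop signal."""
    if not obs.bus_visible:
        return True  # losing sight of the bus is wrongly treated as clearance
    if obs.lights_flashing and stopped_duration_s < 10.0:
        return False  # initially behaves correctly
    return True  # flaw: proceeds while lights still flash and arm is out

def may_proceed_correct(obs: BusObservation, stopped_duration_s: float) -> bool:
    """Correct rule: remain stopped while lights flash or the arm is out."""
    return not (obs.lights_flashing or obs.stop_arm_extended)
```

Under this sketch, the flawed policy stops at first, exactly as Reuters described, then proceeds once its internal condition expires, even though the legal stopping requirement has not ended.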
The Student Population at Risk

Students in Waymo’s operating cities number roughly 1.5 to 3 million, with Austin ISD running 1,000 buses for 73,000+ children and Atlanta running 800 buses for 50,000+. Peak bus hours coincide with Waymo’s heaviest operations.
Daily interactions with thousands of school buses place countless children at potential risk. How many near-misses go unreported?
A Universal Law Violated

All 50 U.S. states require vehicles to stop for buses displaying flashing red lights and deployed stop arms, CBS News confirmed on December 5. These laws are a universal safety standard, essential for protecting children.
Waymo’s AI failed repeatedly, highlighting a gap between statistical safety claims and adherence to basic traffic rules. Near-tragedies were narrowly avoided.
Austin ISD’s Formal Demand

On November 20, 2025, Austin ISD formally requested that Waymo cease operations during student transport hours: 5:20-9:30 a.m. and 3:00-7:00 p.m., as reported by TechCrunch on December 4. Fifteen violations had already occurred by late November.
Waymo refused, relying on software fixes while continuing full operations. The school district’s safety request clashed with Waymo’s operational priorities.
The Business Case for Refusing Restrictions

Morning and evening rush hours coincide with school bus operations and Waymo’s highest revenue periods. Halting service during these hours would result in a 50-60% reduction in daily Austin revenue.
Applying such restrictions elsewhere could set a precedent. Waymo faced a business choice: operational continuity or immediate child safety controls.
Federal Investigators Step In

The NHTSA opened a Preliminary Evaluation (PE25013) in October 2025 after media reports, Reuters confirmed on October 20. A December 3 Information Request Letter demanded Waymo’s detailed technical analysis by January 20, 2026.
Given over 100 million operational miles, the NHTSA warned that the 26 violations may be just a fraction of the actual incidents, raising scrutiny of systemic compliance issues.
The Expansion Paradox

Waymo announced its expansion to Philadelphia and plans for 24 cities, as reported by NPR on December 6, despite unresolved violations. Philadelphia operates 500 buses, serving over 120,000 students.
Expansion amid federal investigation raises regulatory questions. Leadership bets that the first-to-market advantage outweighs reputational risk while the safety crisis remains unresolved.
The Legal Gray Zone

Traffic citations traditionally target drivers. Austin ISD issued 20 citations to Waymo as a corporate entity. Texas fines range from $500 to $1,250 per offense, totaling $10,000 to $25,000 for Austin violations.
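The cited totals follow directly from the per-offense fine range, which a quick calculation confirms:

```python
citations = 20          # Austin citations issued to Waymo
fine_min, fine_max = 500, 1250  # Texas fine range per offense, in dollars

total_min = citations * fine_min
total_max = citations * fine_max
print(f"${total_min:,} to ${total_max:,}")  # $10,000 to $25,000
```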
For Alphabet-backed Waymo, fines are minor. The legal framework struggles to deter AV violations, creating a gray zone for accountability in autonomous transportation.
What Makes School Buses Different

School bus laws exist to protect children, with flashing lights, stop arms, and crossing guards serving as layered safety measures. Human drivers internalize children’s vulnerability over years of experience behind the wheel.
Waymo’s AI treated buses as obstacles, lacking human judgment or protective instinct. Repeated violations demonstrate why automation cannot fully replicate contextual understanding, particularly in situations involving vulnerable pedestrians.
Fleet Growth During the Crisis

Waymo’s fleet grew from 1,500 vehicles in May 2025 to 2,500 by November, a 67% increase, according to The Driverless Digest and November estimates. The expansion coincided with the accumulation of school bus violations.
Adding 1,000 vehicles while citations rose suggests that leadership treated the problem as manageable with software updates, rather than as an immediate operational hazard.
What January 20, 2026, Really Means

By January 20, 2026, Waymo must provide technical analysis, validation testing, and plans to prevent similar failures. NHTSA’s review could result in fines, consent orders, or broader regulatory actions.
The outcome will set a precedent for AV operators nationwide. Other companies, such as Tesla, Cruise, and Aurora, are closely monitoring developments to gauge acceptable safety standards.
The Broader Industry at Stake

Waymo’s failures influence competitor timelines. Tesla’s camera-only robotaxi faces heightened scrutiny. Cruise, resuming operations post-2023 pedestrian incident, sees Waymo’s problem as proof that even top AVs struggle with school zones.
Cities and insurers must now consider bus interactions in their permitting and liability processes. One company’s errors ripple across the autonomous vehicle ecosystem.
The Reckoning Ahead

Waymo’s value proposition relies on being safer than humans. Safety statistics across 100 million miles do not guarantee compliance with foundational child-protection laws.
Federal investigations, school district demands, and industry scrutiny converge. By January 2026, Waymo must prove that AVs not only crash less but also follow the rules designed to keep children safe.
SOURCES:
National Highway Traffic Safety Administration (NHTSA) Preliminary Evaluation PE25013
Austin Independent School District official correspondence and citations, November-December 2025
Reuters reporting on Waymo school bus violations and NHTSA investigation
NPR Waymo school bus recall coverage, December 6, 2025
CBS News NHTSA expanded investigation into Waymo, December 5, 2025
TechCrunch federal inquiry and school bus incident details, December 4, 2025
The Driverless Digest Waymo fleet size and growth statistics, August 28, 2025