Waymo’s driverless taxis are facing another rules-of-the-road test with zero margin for error. According to Reuters, Alphabet’s Waymo is issuing a voluntary software recall after its cars in Texas illegally drove past school buses displaying stop signals at least 19 times since the start of the academic year. In the United States, this is among the most tightly regulated road scenarios: a driver must stop and wait until the children have crossed. For autonomous systems, rare but high-stakes moments like this are the real benchmark of maturity.

NHTSA opened its review in October and has now ordered Waymo to provide detailed responses about the Texas incidents by January 20. The company concedes that software may have been at fault: in some cases a vehicle slowed or came to a stop, then resumed moving while the bus was still stationary. Waymo says updates already rolled out have significantly improved behavior, and the recall will essentially push the required software version across the fleet and lock it in.

The most unsettling piece comes from the Austin school district. In a letter published by NHTSA, Austin ISD said the violations continued even after the updates and asked Waymo to pause operations near schools during pickup and drop-off hours. The district also described an instance in which a robotaxi passed a bus shortly after a child had crossed the street. Waymo, for its part, declined to suspend service and maintained that its system in that area already performs better than a human driver. Progress matters, but around school buses the standard is perfection, and that is where confidence is either earned or lost.