Waymo’s Expansion Hits Another Controversy
Autonomous taxis have been one of the most polarizing developments in the automotive world, and Waymo sits right at the center of the debate. The Alphabet-owned company has been rapidly expanding its robotaxi service across major U.S. cities, with plans to bring the technology to even more markets as it pushes the idea that your second car might eventually be replaced by a ride summoned through an app.
But just as the technology continues its rollout, a troubling new safety concern has surfaced, prompting regulators to pay close attention.
According to a new report from the National Transportation Safety Board (NTSB), investigators are examining an incident in Austin, Texas, where a Waymo driverless taxi allegedly passed a stopped school bus with its lights activated, an action that violates Texas traffic laws. Reuters reports that the January 12 incident occurred while the bus was actively loading students. Officials say another similar event may have happened just two days later involving a special-needs school bus.
The issue follows Waymo's December recall of its autonomous vehicles, prompted by reports that its robotaxis had illegally passed stopped school buses at least 19 times since the school year began. In the most recent case, the vehicle initially stopped but then moved forward after a remote support operator reportedly indicated the bus did not have active signals.
Previous Incidents Raise Questions About the System
This isn’t the first time Waymo has faced scrutiny involving school zones. Earlier this year, federal regulators began investigating an incident where a Waymo autonomous taxi struck a nine-year-old girl in a Santa Monica school zone. According to the company, the vehicle detected the child, who ran out from behind a double-parked SUV, and aggressively applied the brakes, slowing from around 17 mph to under 6 mph before making contact.
Another detail that has raised eyebrows among critics is how Waymo’s vehicles sometimes rely on remote human assistance when they encounter unusual traffic scenarios. While the cars drive themselves most of the time, support operators can step in to help the vehicle interpret tricky situations. Reports indicate that many of these remote support workers are based overseas, including in the Philippines, where they assist with vehicles in real time. In the January school bus incident, the NTSB said the car contacted a remote operator to confirm whether the bus had active warning signals before proceeding.
The Lowdown on Autonomous Driving
Waymo’s robotaxi program has long divided public opinion. Some see it as the future of urban mobility, promising fewer crashes and more efficient transportation. Others remain skeptical about whether software can reliably handle the unpredictable chaos of real-world driving, especially in situations involving children, school buses, and busy urban streets.
These incidents highlight a familiar truth about automotive technology: perfection on paper rarely survives contact with reality. Autonomous systems may be packed with sensors, cameras, and machine learning algorithms, but the road is full of edge cases that are difficult to anticipate.
Between pranksters exploiting robotaxi quirks and regulators probing safety incidents, Waymo’s journey toward widespread adoption still faces plenty of bumps. Until autonomous driving proves it can handle every scenario flawlessly, many enthusiasts will likely keep trusting their own right foot and steering wheel instead of a line of code.