🤖 AI Summary
U.S. regulators have opened a preliminary probe into roughly 2,000 Waymo robotaxis after a video released to the media showed one of the company’s vehicles failing to remain stopped for a school bus with red lights flashing and its stop arm deployed. NHTSA said the Waymo vehicle initially stopped, then maneuvered around the bus while students were disembarking; it was running Waymo’s fifth‑generation automated driving system (ADS) and had no human safety driver aboard. The agency noted that Waymo’s fleet has logged more than 100 million miles (roughly 2 million per week), raising the likelihood that similar incidents have occurred before. Waymo says it has already implemented improvements to school-bus behavior and will push further software updates.
The case matters because school-bus interactions are high-stakes edge cases that test perception, occlusion handling, rule-following logic and conservative fail-safe behavior in autonomous systems. NHTSA’s action underscores growing regulatory scrutiny of how self-driving systems interact with pedestrians, cyclists and vulnerable road users; it follows a recently closed 14-month probe that led to two recalls after reports of unexpected maneuvers and minor collisions. Technically, the incident highlights challenges in detecting stop-arms and flashing lights from oblique approach angles, decision-making when signage is partially occluded, and the need for verified software patches or operational constraints to ensure legal and safe behavior around children.
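The conservative fail-safe behavior described above can be illustrated with a minimal, hypothetical sketch (none of these names or thresholds come from Waymo's actual system): when either the flashing lights or the stop arm cannot be confidently observed, say because of an oblique approach angle or partial occlusion, the safe default is to remain stopped.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of conservative school-bus decision logic.
# None models an occluded or low-confidence observation; the safe
# default when any signal is uncertain is to stay stopped.

@dataclass
class BusObservation:
    lights_flashing: Optional[bool]    # None = occluded / low confidence
    stop_arm_deployed: Optional[bool]  # None = occluded / low confidence

def may_proceed(obs: BusObservation) -> bool:
    """Proceed only if both signals are confidently observed as inactive."""
    if obs.lights_flashing is not False:
        return False  # flashing, or unknown -> remain stopped
    if obs.stop_arm_deployed is not False:
        return False  # deployed, or unknown -> remain stopped
    return True

# An oblique approach that occludes the stop arm still forces a stop:
print(may_proceed(BusObservation(lights_flashing=False, stop_arm_deployed=None)))
print(may_proceed(BusObservation(lights_flashing=False, stop_arm_deployed=False)))
```

The key design choice is treating "unknown" differently from "inactive": a detector that maps low-confidence observations to `False` would let the vehicle proceed past a bus it simply failed to perceive.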