Tesla FSD gets worse at driving (arstechnica.com)

🤖 AI Summary
The National Highway Traffic Safety Administration's Office of Defects Investigation has opened a preliminary probe into Tesla's Full Self-Driving (FSD) after dozens of reports that vehicles using the partially automated driving assist ran red lights or otherwise broke traffic laws. The investigation, NHTSA's third into Tesla this year after probes into remote-parking crashes and failing retractable door handles, cites at least 18 complaints of FSD ignoring red signals, including one observed in a Business Insider test. Reported behaviors include failing to stop, starting through intersections before the light changed, and giving drivers no warning.

This matters because FSD is marketed as an advanced driver-assist and is delivered via over-the-air software updates across a fleet, so a systemic fault can quickly affect many cars and raises safety, liability, and regulatory concerns. Technically, the complaints point to failures either in the perception-to-planning handoff (detecting a traffic light and acting on its state) or in the human-machine interface alerts that should prompt driver intervention; a hypothetical sketch of that handoff follows below. The probe could lead to mandated fixes, fines, or restrictions on software rollouts, and it will intensify scrutiny of Tesla's product naming, validation processes, and driver-monitoring practices, issues that matter to the broader autonomous-driving community and to how regulators oversee fleet-learned driving systems.
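To make the "perception-to-planning handoff" concrete, here is a minimal, entirely hypothetical sketch in Python. Tesla's actual stack is not public, so every name, type, and threshold below is an assumption for illustration only; the point is that a planner acting on a stale or low-confidence light state, without raising a driver alert, would produce exactly the reported behavior of proceeding through a red with no warning.

```python
from dataclasses import dataclass
from enum import Enum


class LightState(Enum):
    RED = "red"
    YELLOW = "yellow"
    GREEN = "green"
    UNKNOWN = "unknown"


@dataclass
class SignalDetection:
    """Hypothetical output of a perception module for one traffic light."""
    state: LightState
    confidence: float  # 0.0-1.0, detector confidence (illustrative)
    age_s: float       # seconds since the detection was produced


# Illustrative thresholds, not values from any real system.
MIN_CONFIDENCE = 0.9
MAX_AGE_S = 0.5


def plan_at_intersection(det: SignalDetection) -> tuple[str, bool]:
    """Return (planner_action, alert_driver) for a detected signal.

    The safety-critical property this sketch encodes: any stale or
    low-confidence detection degrades toward stopping *and* alerting
    the driver, rather than being treated as permission to proceed.
    """
    stale_or_unsure = det.confidence < MIN_CONFIDENCE or det.age_s > MAX_AGE_S
    if det.state == LightState.GREEN and not stale_or_unsure:
        return ("proceed", False)
    if det.state == LightState.GREEN and stale_or_unsure:
        # A handoff that instead returned ("proceed", False) here would
        # match the reported failure pattern: the car continues and the
        # driver is never prompted to intervene.
        return ("slow_and_reassess", True)
    # RED, YELLOW, or UNKNOWN: stop; alert only when the state is unknown.
    return ("stop", det.state == LightState.UNKNOWN)


if __name__ == "__main__":
    # Confident, fresh red light: stop, no alert needed.
    print(plan_at_intersection(SignalDetection(LightState.RED, 0.98, 0.1)))
    # Stale green: the conservative policy slows and alerts the driver.
    print(plan_at_intersection(SignalDetection(LightState.GREEN, 0.95, 1.2)))
```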