Tesla investigated over self-driving cars on wrong side of road (www.bbc.com)

🤖 AI Summary
The US National Highway Traffic Safety Administration (NHTSA) has opened a preliminary evaluation of Tesla's Full Self-Driving (FSD) "Supervised" mode after at least 58 reports of vehicles driving on the wrong side of the road, failing to stop at red lights, or crossing into the opposing lane while turning. The probe covers an estimated 2.9 million Teslas equipped with the optional FSD package and will assess the scope, frequency, and potential safety consequences of these behaviors. NHTSA's filing cites six crashes in which vehicles proceeded through intersections while the light was still red (four causing injuries), and notes that some incidents gave drivers little or no time to intervene.

For the AI/ML community, the investigation highlights critical human-machine interaction and perception/planning challenges in real-world driving: FSD's ability to detect and classify traffic signals, predict the behavior of other road users, and safely execute yields and turns appears inconsistent in some contexts, producing sudden takeover events with little warning. The case underscores the regulatory risk and liability exposure of large-scale deployments, as well as the need for rigorous edge-case testing, clearer handover cues for human supervisors, and faster patching and monitoring pipelines. The outcome could shape industry standards for validating, reporting on, and deploying advanced driver assistance systems across the automotive and AI ecosystems.