Editorial Team
World Of EV

A harrowing viral video from March 9, 2026, has once again thrust Tesla's Full Self-Driving (FSD) system into a harsh spotlight. The footage, depicting a Tesla Model 3 autonomously driving straight through lowered railroad crossing barriers in West Covina, California, not only underscores persistent safety concerns but also casts a long shadow over Tesla's claims of advanced autonomous capabilities. Alarmingly, this incident coincided with the critical deadline for Tesla to submit data to the National Highway Traffic Safety Administration (NHTSA) for an ongoing federal investigation specifically targeting FSD traffic violations, including failures at railroad crossings.
This is far from an isolated event; railroad intersections have proven to be a dangerous blind spot for Tesla's FSD, continuously highlighting the chasm between the system's ambitious name and its real-world performance.
The viral video, shared by a Threads user, showed a Tesla Model 3 in FSD mode proceeding directly through active railroad crossing barriers at approximately 23 mph. The FSD system, operating near West Covina on March 8, 2026, exhibited no discernible attempt to detect the lowered gates or initiate braking, suggesting a profound perception failure. The barriers, positioned at roughly the height of the Tesla's front-facing cameras, raise critical questions about the system's ability to recognize common physical obstacles. While the driver reportedly applied the brakes around the moment of impact, it was too late to prevent the collision.
This incident adds to a troubling history of FSD mishaps at railroad crossings. An extensive investigation by NBC News previously uncovered over 40 reports of FSD failures at such intersections, including a horrifying instance where a Tesla Model 3 on FSD was struck by a train in eastern Pennsylvania after navigating onto the tracks. These recurring failures cement railroad crossings as a known, critical vulnerability for Tesla's advanced driver-assistance system.
The timing of this latest FSD misstep is particularly damning. March 8, 2026, was precisely the deadline for Tesla to furnish crucial crash data to NHTSA as part of a sweeping investigation into FSD traffic violations. NHTSA initiated this probe in October 2025, prompted by at least 58 reported incidents linked to FSD, which included 14 crashes and 23 injuries. By December 2025, the agency's count of documented violations had climbed to 80. Significantly, failures at railroad crossings are a key area of focus within this investigation, alongside other serious infractions such as running red lights and veering into oncoming traffic.
The regulatory pressure on Tesla has been building for years. Senators Ed Markey and Richard Blumenthal previously urged NHTSA to launch a formal investigation into FSD's handling of railroad crossings, citing the severe risks of train-car collisions. Furthermore, the California Department of Motor Vehicles (DMV) has also scrutinized Tesla's marketing of FSD, leading to the company's recent adoption of the "Full Self-Driving (Supervised)" moniker to clarify its Level 2 autonomy status.
For Tesla: This incident delivers a substantial blow to public confidence in FSD's capabilities and the credibility of the "Full Self-Driving" brand, even with the "Supervised" qualifier. Occurring amidst a high-stakes NHTSA investigation, it dramatically escalates regulatory pressure, potentially leading to more stringent oversight, prolonged delays in broader FSD deployment, or even mandated recalls if safety risks are deemed unacceptable. It also places a magnifying glass on Tesla's vision-only approach to autonomy, questioning its robustness in handling complex yet common infrastructural "edge cases" like railroad crossings.
For the Autonomous Driving Industry: Each public failure of a prominent system like FSD erodes overall public trust in autonomous technology, making it a steeper climb for all players to gain widespread acceptance. It underscores the critical need for meticulous testing, transparent communication, and robust safety redundancies. Competitors adopting more cautious, geo-fenced, or sensor-redundant (e.g., Lidar-equipped) approaches might see their methodologies implicitly validated by Tesla's ongoing struggles with real-world complexities.
For EV Enthusiasts and Prospective Buyers: This serves as a stark, potentially life-threatening, reminder that FSD, despite its name and high price tag, remains a Level 2 driver-assistance system that demands constant driver vigilance and readiness to intervene. The unresolved issues in fundamental safety scenarios like railroad crossings continue to challenge the value proposition of FSD, reinforcing the reality that buyers are, in essence, participating in a sophisticated beta test rather than receiving a fully autonomous vehicle.
The latest FSD incident at a California railroad crossing represents a profound challenge for Tesla, touching upon both technological limitations and intensifying regulatory scrutiny. As the automotive industry steadily progresses towards true autonomy, the imperative for demonstrably safe, reliable, and transparent systems becomes paramount. Tesla must now comprehensively address these persistent safety vulnerabilities to rebuild consumer trust and safeguard its ambitious autonomous future. The path to truly full self-driving will undoubtedly demand far more than just software updates; it requires an unwavering commitment to safety and a realistic understanding of the technology's current limitations.