Elon Musk’s heavily touted Tesla Full Self-Driving (FSD) feature has come under renewed scrutiny as recent videos showcase the system’s struggles to navigate complex and unexpected road situations.
Futurism reports that Tesla’s Full Self-Driving (FSD) feature, despite its ambitious name, has been facing challenges in delivering on its promise of autonomous driving. Recent videos shared by Tesla enthusiasts and critics alike have highlighted instances where the system falters when confronted with real-world scenarios that deviate from ideal driving conditions.
In a video posted by Scott Woods, a Tesla fan and personality behind the Whole Mars Catalog account on X (formerly Twitter), a Tesla equipped with FSD can be seen navigating the roads between Sausalito and San Francisco. While the vehicle manages to handle most of the journey without human intervention, it encounters a perplexing situation when confronted by a construction worker wearing bright orange safety gear and holding a stop sign.
The Tesla, seemingly unsure of how to proceed, begins to steer left as if attempting to drive around the road worker, a clear violation of traffic laws, before stopping in the middle of the road, straddling the yellow median line. The construction worker repeatedly gestures for the Tesla to move off the road, but it is only when another vehicle approaches from the opposite direction that Woods intervenes, manually steering the Tesla to the right so the oncoming vehicle can pass.
This incident has drawn criticism from Tesla skeptics, with entrepreneur and Tesla critic Dan O’Dowd sharing the video on X, directly addressing Tesla CEO Elon Musk and asserting that “your defective software should be banned from public roads.”
Watch this @Tesla #FSD (Supervised) v12.4.1 ignore a construction worker’s stop sign and attempt to drive around the construction site, forcing @WholeMarsBlog to intervene. @ElonMusk your defective software should be banned from public roads. https://t.co/l2hS0Liv7V
— Dan O'Dowd (@RealDanODowd) June 10, 2024
The video adds to a series of recent incidents that have cast doubt on the reliability of Tesla’s self-driving technology. In another viral video, a Tesla with FSD activated can be seen veering into oncoming traffic just as the driver praises the software’s capabilities to a friend. A separate incident shows a Tesla on driver assist heading straight towards moving train cars, requiring the driver to take control at the last moment to avoid a collision.
Tesla Full Self-Driving 12.3.4 decided it wanted to cut in line and then heads straight for a pole. This was a critical safety takeover. pic.twitter.com/13m5FISbHs
— Miss Jilianne (@MissJilianne) April 18, 2024
A Tesla on Full Self-Driving blows through a stop sign at 35mph and nearly collides with two cars. The kicker is that this was during a livestream debate-demo-drive between FSD fan @GerberKawasaki and FSD skeptic @RealDanODowd.

This should go without saying, but for an automated… pic.twitter.com/KVhWjx1It2

— Taylor Ogan (@TaylorOgan) June 23, 2023
This is the scariest near-crash on Tesla Full Self-Driving Beta I've seen. The Tesla tries to turn directly in front of a train (light rail) in Denver. pic.twitter.com/eZE92qGD3t
— Taylor Ogan (@TaylorOgan) July 1, 2022
These incidents underscore a common theme: while self-driving cars can perform well under predictable road conditions, they often struggle when faced with unusual or visually complex situations. Construction sites, train crossings, and other atypical scenarios can cause the system to falter, potentially putting lives at risk.
Despite ongoing investigations by federal authorities and concerns raised by critics, Tesla’s FSD-equipped vehicles continue to operate on public roads. Non-Tesla drivers, pedestrians, and road workers unwittingly find themselves as participants in Elon Musk’s uncontrolled experiment, the outcome of which remains uncertain.
Read more at Futurism here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.