Year after year, Tesla engineers have failed to get their “self-driving” software to recognize railroad tracks properly and safely. Remember the Palo Alto incident?
So here we are, a decade later, with nothing fixed, as if Tesla has been lying about its capability this whole time:
Terrible reporting in the video, as their chosen “experts” are both invested in people buying Teslas and showed no expertise at all. Here, allow me to help. Let’s start with Vancouver news from THREE MONTHS AGO:
Then look at California news from SIX MONTHS AGO:
And then we can go on and on and on, where Tesla itself has been repeatedly proven to fail at train tracks…
- 2024 flyingpenguin: FSD v12.3.6 “…slammed on the brakes in the middle of the train tracks. And just sat there. It would not go…”
- 2023 Reddit: Literally just watched a guy park his Tesla on the train track
- 2022 Washington Post: Tesla tries to drive down some light-rail tracks
- 2022 Daily Mail: Tesla feature attempts to drive down some light-rail tracks, mistaking it for a road
- 2022 Reddit: Tesla thinks the train is a bunch of slow trucks
Train tracks are a known failure mode in Tesla software, which I’ve watched and confirmed myself. Notably, neither “expert” in the news segment put their Tesla to an actual test on train tracks! Driving the journalist down the same route that a Tesla is driven every day (as described by the owner) is irrelevant to this story about railroad crossings, and it should also be pointed out that the “expert” describes his Tesla as struggling and “crazy” even at that basic level. Why didn’t this reporter call BS and say “TAKE ME TO A TRAIN TRACK FOR A REAL STORY OR GTFO”?
When low-angle sunlight hits shiny metal railroad tracks, for example, this poorly designed car misreads the glare as “yellow” and “white” lane lines and tries to turn onto them, as if fresh bright lines mean a newly painted road can just magically appear out of nowhere.
And on that note, who can forget the very stark U.S. government regulator warnings and proposed technology solutions back in 2016, after a human tragically followed their navigation app onto a railroad track, causing a fatality?
On Monday, after investigating the crash for almost two years, the National Transportation Safety Board issued a safety recommendation asking technology and delivery companies to add the exact locations of more than 200,000 grade crossings into digital maps and to provide alerts when drivers encounter them. […] The accuracy of mapping data is becoming more important as driverless cars start taking to the road. It will be up to navigation apps to guide cars onto the safest routes and to warn passengers — who may not be paying attention — about potential hazards.
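For illustration only, here is a minimal sketch (in Python, not anything from Tesla or the NTSB) of the kind of grade-crossing alert the 2016 recommendation describes: keep the known crossing coordinates in the map data and warn the driver whenever the vehicle gets close to one. The crossing coordinates, alert radius, and function names below are hypothetical placeholders, not real inventory data.

```python
# Sketch of a grade-crossing proximity alert, assuming the map data
# includes crossing coordinates (the real FRA inventory lists over
# 200,000 U.S. crossings). All values below are placeholders.
from math import radians, sin, cos, asin, sqrt

# Hypothetical sample of grade-crossing locations (latitude, longitude).
KNOWN_CROSSINGS = [
    (37.4419, -122.1430),  # placeholder coordinate, not a real crossing
    (49.2827, -123.1207),  # placeholder coordinate, not a real crossing
]

ALERT_RADIUS_METERS = 200  # warn the driver this far from a crossing

def haversine_meters(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def crossing_alert(vehicle_lat, vehicle_lon):
    """Return the nearest known crossing within the alert radius, or None."""
    nearest, nearest_dist = None, ALERT_RADIUS_METERS
    for lat, lon in KNOWN_CROSSINGS:
        d = haversine_meters(vehicle_lat, vehicle_lon, lat, lon)
        if d <= nearest_dist:
            nearest, nearest_dist = (lat, lon), d
    return nearest

if __name__ == "__main__":
    hit = crossing_alert(37.4421, -122.1433)
    if hit:
        print(f"WARNING: railroad grade crossing ahead near {hit}")
```

The point of the sketch is only that a basic proximity check against a public crossing inventory is decades-old technology, not a research problem.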
Journalists should be talking about the decade-old solutions to this well-known and well-studied problem: Tesla is clearly failing at safety 101, and this incident didn’t need to happen. Without fraud there would be no Tesla.
The news reporters in the video seem to be Tesla or Musk fans. Why do they continue to insist on calling this “auto pilot”?
The entire report seems to be excusing or lauding Tesla.
“Oh, I use it every day” – “I’m taking a Tesla out for a spin with an expert on auto pilot”. This aggravates me to no end.
I’m glad you are continuing to cover these stories and exposing the fraud and how many people are being put in mortal danger. Whenever I see a Tesla on the road, I proceed with extreme caution, especially if one is approaching in the opposite lane. Thankfully, I live in a rural area and don’t see many Tesla cars.
I watched a few of the other videos – it appears the YouTuber AI Addict was fired for posting his video. It appears to me that Tesla FSD is constantly failing and woefully lacking in all situations, e.g. trying to pass a vehicle and then realizing there is a slower vehicle in the passing lane. A human would have recognized this far earlier. As you have said repeatedly, the reaction of this FSD is too late. I take a defensive driving course every three years (for the 10% insurance discount), and this FSD fails in so many ways. It just cannot anticipate danger; it is reactive.