Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
There have been 54 reported fatalities involving their software over the years.
That's around 10 billion AP miles (9 billion at end of 2024), and around 3.6 billion on the various versions of FSD (Beta / Supervised). Most of the fatal accidents happened on AP, though, not FSD.
Let's just double those fatal accidents to 108 to account for the rest of the world, but that probably skews high.
That equates to 1 fatal accident every 98 million miles.
The USA average is 1.33 deaths per 100 million miles, so even after doubling the deaths, it's less than the current national average.
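Spelling out that arithmetic as a quick sketch (using the mileage and fatality figures quoted above; note that counting the full ~13.6 billion miles gives an even lower rate than one fatality per 98 million miles, and either way it comes in under the national average):

```python
# Figures quoted above (rough estimates, not official numbers)
ap_miles = 10e9        # ~10 billion Autopilot miles
fsd_miles = 3.6e9      # ~3.6 billion FSD miles
fatalities = 108       # 54 reported, doubled to roughly cover the world

total_miles = ap_miles + fsd_miles
miles_per_fatality = total_miles / fatalities      # ~126 million miles
rate_per_100m = fatalities / (total_miles / 1e8)   # ~0.79 per 100M miles

us_average_per_100m = 1.33
print(round(miles_per_fatality / 1e6, 1))          # ~125.9
print(rate_per_100m < us_average_per_100m)         # True
```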
-
Fair enough!
At least one of the fatalities involved Full Self-Driving (it was cited by name in the police report). The remainder are Autopilot. So, both systems kill motorcyclists. Tesla requests that this data be redacted from its NHTSA reporting, which specifically makes it difficult for consumers to measure which system is safer, or whether incremental safety improvements are actually being made.
You're placing a lot of faith in the incremental updates being improvements without equivalent regressions. That data is specifically being concealed from you, and I think you should probably ask why. If there were good news behind those redactions, they wouldn't be redactions.
I didn't publish the software version data point because I agree with AA5B: it doesn't matter. I honestly don't care how it works. I care that it works well enough to safely cohabit the road with my manual-transmission Cro-Magnon self.
I'm not a "Tesla reporter," and I'm not trying to cover the incremental changes in their software versions. Plenty of Tesla fans are doing that already. It only has my attention at all because it's killing vulnerable road users, and for that analysis we don't actually need to know which self-driving system version is killing people, just the make of car it's installed on.
-
I'd say it's a pretty important distinction to know whether one or both systems have a problem, and how bad that problem is.
Also, are you referencing the one in Seattle in 2024 for FSD? The CNBC article says FSD, but the driver said AP.
And especially back then, there's also an important distinction in how they worked.
FSD on highways wasn't released until November 2024, and even then not everyone got it right away. So even if FSD was enabled, the crash may have been under AP.
-
Calamari Racing Team. It's mostly a counter-movement to r/Motorcycles, where most of the posters are seen as anti-fun. Their whole thing is pushing back on that rather than prescribing a specific way to ride; they also have a legendary commenter who pays money for pics in full leather.
-
Police report for the 2024 case attached; it's also linked in the original article: https://www.opb.org/article/2025/01/15/tesla-may-face-less-accountability-for-crashes-under-trump/
It was Full Self Driving, according to the police. They know because they downloaded the data off the vehicle's computer. The motorcyclist was killed on a freeway merge ramp.
All the rest is beyond my brief. Thought you might like the data to chew on, though.
-
The motorcyclist was killed on a freeway merge ramp.
I'd say that means there's a very good chance that, while FSD was enabled, the crash happened under the older AP mode of driving.
Also, yikes... the report says the AEB kicked in, and the driver overrode it by pressing on the accelerator!
-
There was an article where it sliced a deer in half.
-
Okay, so I'm going to edit my earlier replies, because I was wrong.
Version 11 in 2023 wasn't using the AP code; it just wasn't using the neural nets on freeways. So it was legitimately FSD, but it was running different code on freeways (non-neural-net) than on city streets (neural net).
But it was indeed FSD.
-
But muh innovation! How are genius CEOs supposed to innovate if they can't use the public at large as guinea pigs??
-
Cuz other self-driving cars use LIDAR, so it's basically impossible for them not to realize that a bike is there.
-
Claymore and trip wire?
-
And how did it compare self-driving time or miles? Because on the surface, if Tesla is responsible for 5 such accidents and Ford for zero, but Tesla has significantly more than five times the self-driving time or miles, then we just don't have the data yet... and I see an announcement that Ford expects full self-driving in 2026, so it can't have been used much yet.
-
It’s because the system has to rely on visual cues, since Tesla’s have no radar. The system looks at the tail light when it’s dark to gauge the distance from the vehicle. And since some bikes have a double light the system thinks it’s a car in front of them that is far away, when in reality it’s a bike up close. Also remember the ai is trained on human driving behavior which Tesla records from their customers. And we all know how well the average human drives around two wheeled vehicles.
-
It looks a great deal like a Royal Enfield, but I couldn't tell you which model. A Bullet, maybe?
-
That's the one! Thanks, that was un-googleable for me.
-
nice work, worth feeling a bit of pride over.
-
unless it's foggy, etc.