Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
Cybertrucks have 17 times the mortality rate of the Ford Pinto.
https://www.motherjones.com/politics/2025/02/report-cybertruck-safety-ford-pinto/
I wrote the original analysis Mother Jones is citing there. Hah, how about that! Delights me to see it cited in the wild.
-
Same goes for the other vehicles. They didn’t even try to cover miles driven and it’s quite likely Tesla has far more miles of self-driving than anyone else.
I’d even go so far as to speculate that the zero accidents for other self-driving vehicles could just be zero information: we may not have enough data to call it zero.
No, the zero accidents for other self-driving vehicles is actually zero
You may have heard of this little boutique automotive manufacturer, Ford Motor Company. They're one of the primary competitors, and they are far above the mileage where you would expect a fatal accident if they were as safe as a human.
Ford has reported self-driving crashes (many of them!). Just no fatal crashes involving motorcycles, because I guess they don't fucking suck at making self-driving software.
I linked the data, it's all public governmental data, and only the Tesla crashes are heavily redacted. You could... IDK... read it, and then share your opinion about it?
-
As a daily rider, I must add having a Tesla behind me to the list of road hazards to look out for.
-
Bahaha, that one is new to me.
Back when I worked on an ambulance, we called the no-helmet guys organ donors. This comment was brought to you by PTSD, and has been redacted in a rare moment of sobriety.
-
NHTSA collects crash data whenever the self-driving tech was active within 30 seconds of the impact.
The companies themselves do all sorts of wildcat shit with their numbers. Tesla's claimed safety factor right now is 8x human: to drive with FSD is supposedly 8x safer than your average human driver. That's what they say on their stock earnings calls. Of course, that's not true based on any data I've seen, and they haven't published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its roughly 12x-safer-than-human system).
Fascinating! I don’t know all this. Thanks
-
Propane cylinder. Mutually assured destruction.
It will do nothing. By the time a propane cylinder would rupture, even if we assume it actually ignites too, it would add very little to a massive crash that already killed everyone and disintegrated everything.
-
TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5
Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:
- The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
- This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
- The crashes are overwhelmingly Teslas rear-ending motorcyclists.
Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.
Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.
-
I remember finding a motorcycle community on reddit that called themselves "squids" or "squiddies" or something like that.
Their whole thing was putting road tyres on dirtbikes and riding urban environments like they were offroad obstacles. You know, ramping things, except on concrete.
They loved to talk about how dumb & short-lived they were. I couldn't ever find that group again, so maybe I misremembered the "squid" name, but I wanted to find them again, not to ever try it - fuck that - but because the bikes looked super cool. I just have a thing for gender-bent vehicles.
-
I mean, maybe, but previously when I've said that it's typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it's somehow suddenly too dangerous to allow owners to control their property just because software is involved.
-
Is Musk going? Because I vote to be on whatever planet he isn't.
-
Negative. I'm a meat popsicle.
-
I wonder if a state court judge could ban its use as unsafe?
They are illegal in every developed country.
-
They originally had lidar, or radar, but Musk had them disabled in the older models.
They had radar. Tesla has never had lidar, but they do use lidar on test vehicles to ground truth their camera depth / velocity calculations.
-
So to drive with FSD is 8x safer than your average human driver.
WITH a supervising human.
Once it reaches a certain quality, it should be safer if a human is properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast majority of crashes are from inattentive drivers, which is obviously a problem, and they need to keep improving the attentiveness monitoring. But it should be safer than a human alone when it has human supervision, because it can also catch things the human would otherwise miss.
Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human.
-
Or at least something other than just cameras. Even just adding ultrasonic sensors to the front would be an improvement.
The range on ultrasonics is too short. They only ever get used for parking-type situations, not driving on the roadways.
-
We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.
You mean like this Euro NCAP testing, where Tesla does stop and most others don't?
-
This sounds good until you realize how unsafe human drivers are. People won’t accept a self-driving system that’s only 50% safer than humans, because that will still be a self-driving car that kills 20,000 Americans a year. Look at the outrage right here, and we’re nowhere near those numbers. I also don’t see anyone comparing these numbers to human drivers on any per-mile basis. Waymos compared favorably to human drivers in their most recently released data. Does anyone even know where Teslas stand compared to human drivers?
There have been 54 reported fatalities involving their software over the years.
That's around 10 billion AP miles (9 billion at the end of 2024) and around 3.6 billion on the various versions of FSD (beta / supervised). Most of the fatal accidents happened on AP, though, not FSD.
Let's just double those fatal accidents to 108 to account for the rest of the world, though that probably skews high.
That equates to roughly 1 fatal accident every 126 million miles.
The US average is 1.33 deaths per 100 million miles, so even after doubling the deaths, that's below the current national average.
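As a sanity check, the arithmetic above can be run directly. Note that the mileage totals and the doubling factor are the commenter's rough estimates, not official figures; with ~13.6 billion total miles and 108 fatalities, it works out to roughly one fatality per 126 million miles:

```python
# Back-of-envelope check of the fatality-rate comparison above.
# All inputs are the commenter's estimates, not official NHTSA data.
ap_miles = 10e9           # ~10 billion Autopilot miles (estimate)
fsd_miles = 3.6e9         # ~3.6 billion FSD (beta/supervised) miles (estimate)
reported_fatalities = 54  # reported fatalities involving the software
worldwide_fatalities = reported_fatalities * 2  # doubled to roughly cover non-US miles

total_miles = ap_miles + fsd_miles
miles_per_fatality = total_miles / worldwide_fatalities
rate_per_100m = worldwide_fatalities / (total_miles / 100e6)

print(f"1 fatal accident every {miles_per_fatality / 1e6:.0f} million miles")
print(f"{rate_per_100m:.2f} deaths per 100 million miles (US average: 1.33)")
```

Even with the deaths doubled, the resulting rate (~0.79 per 100 million miles) stays under the 1.33 national average, though the comparison ignores that AP/FSD miles skew toward highways, where human fatality rates are also lower.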
-
In this case, does it matter? Both are supposed to follow a vehicle at a safe distance
I think it does matter. While both are supposed to follow at safe distances, the FSD stack is doing it in a completely different way. They haven't really been making any major updates to AP for many years now; all focus has been on FSD.
AP is looking at the world frame by frame, each individual camera on its own, while FSD is taking the input of all cameras, turning it into a 3D vector space, and then driving based off that. Doing that on city streets and highways is only a pretty recent development. Updates for doing it this way on highways and streets only went out to all cars in the past few months; for a long time it was city streets only.
I’d be more interested in how it changes over time, as new software is pushed.
I think that's why it's important to make a real distinction between AP and FSD today (and specifically which FSD versions)
They're wholly different systems, one that gets older every day and one that keeps getting better every few months. An article like this that groups them together muddies the water on what progress, if any, has been made.
Fair enough!
At least one of the fatalities is Full Self-Driving (it was cited by name in the police reports). The remainder are Autopilot. So, both systems kill motorcyclists. Tesla requests this data be redacted from their NHTSA reporting, which specifically makes it difficult for consumers to measure which system is safer or whether incremental safety improvements are actually being made.
You're placing a lot of faith that the incremental updates are improvements without equivalent regressions. That data is specifically being concealed from you, and I think you should probably ask why. If there was good news behind those redactions, they wouldn't be redactions.
I didn't publish the software version data point because I agree with AA5B, it doesn't matter. I honestly don't care how it works. I care that it works well enough to safely cohabit the road with my manual transmission cromagnon self.
I'm not a "Tesla reporter," I'm not trying to cover the incremental changes in their software versions. Plenty of Tesla fans doing that already. It only has my attention at all because it's killing vulnerable road users, and for that analysis we don't actually need to know which self-driving system version is killing people, just the make of car it is installed on.
-
I'd say it's a pretty important distinction to know whether one or both systems have a problem, and how bad that problem is.
Also, are you referencing the one in Seattle in 2024 for FSD? The CNBC article says FSD, but the driver said AP.
And especially back then, there's also an important distinction of how they work.
FSD on highways wasn't released until November 2024, and even then not everyone got it right away. So even if FSD was enabled, the crash may have been under AP.
-