Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5
Brevity is the spirit of wit, and I am just not that witty. This is a long article, here is the gist of it:
- The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
- This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
- The crashes are overwhelmingly Teslas rear-ending motorcyclists.
Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.
Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.
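The case-by-case analysis described above boils down to filtering the NHTSA Standing General Order (SGO) incident reports for fatal crashes involving motorcycles, grouped by manufacturer. A minimal sketch of that kind of tally is below; note the column names (`Make`, `Crash With`, `Highest Injury Severity Alleged`) are assumptions about the published CSV schema and may not match the real file, and the sample rows are fabricated placeholders, not real data.

```python
# Hypothetical sketch: count fatal motorcycle-involved crashes per manufacturer
# in NHTSA SGO-style incident rows. Column names are assumed, not verified
# against the actual NHTSA CSV schema.
from collections import Counter

def fatal_motorcycle_crashes(rows):
    """Count rows where the crash partner was a motorcycle and the highest
    alleged injury severity was a fatality, grouped by vehicle make."""
    counts = Counter()
    for row in rows:
        crash_with = row.get("Crash With", "").strip().lower()
        severity = row.get("Highest Injury Severity Alleged", "").lower()
        if crash_with == "motorcycle" and "fatality" in severity:
            counts[row.get("Make", "Unknown")] += 1
    return counts

# Fabricated sample rows for illustration only (not real NHTSA data):
sample = [
    {"Make": "Tesla", "Crash With": "Motorcycle",
     "Highest Injury Severity Alleged": "Fatality"},
    {"Make": "Tesla", "Crash With": "Passenger Car",
     "Highest Injury Severity Alleged": "No Injuries"},
]
print(fatal_motorcycle_crashes(sample))  # Counter({'Tesla': 1})
```

In practice the real dataset is heavily redacted, so (as the article notes) each candidate row still has to be matched by hand against news reports and police documents.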
Sounds like NHTSA needs a visit from DOGE!
-
Stop dehumanizing drivers who killed people.
The feature, wrongly called Full Self-Driving, must be supervised at all times.
I think it's important to call out inattentive drivers while also calling out the systems and false advertising that may lead them to become less attentive.
If these systems were marketed as "driver assistance systems" instead of "full self driving", certainly more people would pay attention. The fact that they've been allowed to get away with this blatant false advertising is astonishing.
They're also obviously not adequately monitoring for driver attentiveness.
-
There are at least two steps before those three:
-1. Society has been built around the needs of the auto industry, locking people into car dependency
- A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody
- A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody
That's a good thing, because the alternative would be flipping the notion of property rights on its head. Making the owner not responsible for his property would be used to justify stripping him of his right to modify it.
You're absolutely right about point -1 though.
-
Because I do journalism, and sometimes I even do good journalism!
In that case, you wouldn't happen to know whether or not Teslas are unusually dangerous to bicycles too, would you?
-
Because the march of technological advancement is inevitable?
In light of recent (and, let's face it, long-ago) cases, Tesla's "Full Self Driving" needs to be downgraded to Level 2 at best.
Level 2: Partial Automation
The vehicle can handle both steering and acceleration/deceleration, but the driver must remain engaged and ready to take control.
Pretty much the same level as other brands' self-driving features.
The other brands, such as Audi and VW, work much better than Tesla's system. Their LIDAR systems aren't blinded by fog and rain the way Tesla's is. Someone recently tested an Audi with its system against a Tesla with its system. The Tesla failed either 3/5 or 4/5 tests. The Audi passed 3/5 or 4/5. Neither system is perfect, but the one that doesn't rely on just cameras is clearly superior.
Edit: it was Mark Rober.
-
Why is self-driving even allowed?
Because the only thing worse than self driving is human driving.
-
It's helpful to remember that not everyone has seen the same stories you have. If we want something to change, like regulators not allowing dangerous products, then raising public awareness is important. Expressing surprise that not everyone knows about something can be counterproductive.
Going beyond that, wouldn't the new information here be the statistics?
like regulators not allowing dangerous products,
I include human drivers in the list of dangerous products I don't want allowed. The question is whether self-driving is safer overall (despite possible regressions like this). I don't want regulators to pick favorites. I want them to find "the truth".
-
Tesla's argument of "well human eyes are like cameras therefore we shouldn't use LiDAR" is so fucking dumb.
Human eyes have good depth perception and absolutely exceptional dynamic range and focusing ability. They also happen to be linked up to a rapid and highly efficient supercomputer far outclassing anything that humanity has ever devised, certainly more so than any computer added to a car.
And even with all those advantages humans have, we still crash from time to time and make smaller mistakes regularly.
Anyone who has driven (or walked) into a sunrise/sunset knows that human vision is not very good. I've also driven in blizzards, heavy rain, and fog - all times when human vision is terrible. I've also not seen green lights (I'm colorblind).
-
The other brands, such as Audi and VW, work much better than Tesla's system. Their LIDAR systems aren't blinded by fog and rain the way Tesla's is. Someone recently tested an Audi with its system against a Tesla with its system. The Tesla failed either 3/5 or 4/5 tests. The Audi passed 3/5 or 4/5. Neither system is perfect, but the one that doesn't rely on just cameras is clearly superior.
Edit: it was Mark Rober.
It's hard to tell, but from about 15 minutes of searching, I was unable to locate any consumer vehicles that include a LIDAR system. Lots of cars include RADAR for object detection, even multiple RADAR systems for parking. There may be some which include a time-of-flight sensor, which is like LIDAR but static, and lacks the resolution/fidelity. My Mach-E, which has level 2 automation, uses a combination of computer vision, RADAR, and GPS. I was unable to locate a LIDAR sensor for the vehicle.
The LIDAR system in Mark's video is quite clearly a pre-production device that is not affiliated with the vehicle manufacturer it was being tested on.
Adding, after more searching: it looks like the Polestar 3, some trim levels of the Audi A8, and the Volvo EX90 include a LiDAR sensor. Curious to see how the consumer-grade tech works out in the real world.
Please do not mistake this comment as "AI/computer vision" evangelism. I currently have a car that uses those technologies for automation, and I would not and do not trust my life or anyone else's to that system.
-
It's hard to tell, but from about 15 minutes of searching, I was unable to locate any consumer vehicles that include a LIDAR system. Lots of cars include RADAR for object detection, even multiple RADAR systems for parking. There may be some which include a time-of-flight sensor, which is like LIDAR but static, and lacks the resolution/fidelity. My Mach-E, which has level 2 automation, uses a combination of computer vision, RADAR, and GPS. I was unable to locate a LIDAR sensor for the vehicle.
The LIDAR system in Mark's video is quite clearly a pre-production device that is not affiliated with the vehicle manufacturer it was being tested on.
Adding, after more searching: it looks like the Polestar 3, some trim levels of the Audi A8, and the Volvo EX90 include a LiDAR sensor. Curious to see how the consumer-grade tech works out in the real world.
Please do not mistake this comment as "AI/computer vision" evangelism. I currently have a car that uses those technologies for automation, and I would not and do not trust my life or anyone else's to that system.
The way I understand it, Audi and VW have had the hardware in place for a few years. They are collecting real-world data about how we drive before they allow the systems to be used at all. There are also legal issues with liability.
-
Why is self-driving even allowed?
Robots don't get drunk, or distracted, or text, or speed...
Anecdotally, I think the Waymos are more courteous than human drivers. Though Waymo seems to be the best one out so far; I don't know about the other services.
-
Every captcha... can you see the motorcycle? I would be afraid if they wanted all the squares with small babies, or maybe just regular folks... can you pick all the hotties? Which of these are body parts?
-
Hey guys relax! It's all part of the learning experience of Tesla FSD.
Some of you may die, but that's a sacrifice I'm willing to make.
Regards,
Elon Musk
CEO of Tesla
+1 for you. However, replace "Regards" with the more appropriate words from the German language. The first with an S, and the second an H. I will not type that shit, fuck Leon, and I hope the fucking Nazi-owned Tesla factory outside of Berlin closes.
-
Robots don't get drunk, or distracted, or text, or speed...
Anecdotally, I think the Waymos are more courteous than human drivers. Though Waymo seems to be the best one out so far; I don't know about the other services.
Don't Waymos have remote drivers that take control in unexpected situations?
-
As a daily rider, I must add having a Tesla behind me to the list of road hazards to look out for.
I feel like that as a driver. Teslas do not move at a consistent speed, which drives me mad.
-
Don't Waymos have remote drivers that take control in unexpected situations?
They have remote drivers that CAN take control in very corner-case situations that the software can't handle. The vast majority of driving is done without humans in the loop.
-
TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5
Brevity is the spirit of wit, and I am just not that witty. This is a long article, here is the gist of it:
- The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
- This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
- The crashes are overwhelmingly Teslas rear-ending motorcyclists.
Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.
Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.
Elon needs to take responsibility for their deaths.
-
This is news? Fortnine talked about it two years ago.
It can't even perceive the depth of the lights?
-
It's helpful to remember that not everyone has seen the same stories you have. If we want something to change, like regulators not allowing dangerous products, then raising public awareness is important. Expressing surprise that not everyone knows about something can be counterproductive.
Going beyond that, wouldn't the new information here be the statistics?
My state allowed motorcycle filtering in 2019 (not the same as California's lane splitting). They ran a study and found a ton of motorcyclists were being severely injured or killed by being rear-ended while sitting at stop lights. Filtering allows them to move to the front of the line of traffic while the light is red and traffic is stationary. Many people are super aggravated about it, even though most of the world has been doing it basically forever.
-
How about we disallow it completely until it's proven to be SAFER than a human driver? Because why even allow it if it's only as safe?
As an engineer, I strongly agree with requirements based on empirical results rather than requiring a specific technology. The latter never ages well. Thank you.