Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
Mercedes uses LiDAR. They also operate the sole Level 3 driver-automation system in the USA, on only two models: the new S-Class and EQS sedans.
Tesla claims they'll be Level 4+ in Austin in 60 days and just skip Level 3 altogether. We'll see.
Yeah, keep in mind that Elon couldn't get Level 3 working in a closed, pre-mapped circuit. The robotaxis were just remotely operated.
-
FTFA:
Certain Tesla self-driving technologies are speed capped, but others are not. Simply pressing the accelerator will raise your speed in certain modes, and as we saw in the police filings from the Washington State case, pressing the accelerator also cancels emergency braking.
That's how you would strike a motorcyclist at such extreme speed: simply press the accelerator, and all other inputs are apparently overridden.
If the guy smashes the gas then, just like with cruise control, I would not expect the vehicle to stop itself.
The guy admitted to being intoxicated and held the gas down... what's the self-driving contribution to that?
I know what's in the article, boss. I wrote it. No need to tell me FTFA.
TACC stands for Traffic-Aware Cruise Control. If I have a self-driving technology like TACC active, and the car's sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input, because the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not operate at the maximum speed.
I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.
Here's the manual, if you're curious. It doesn't work in bright sunlight, fog, excessively curvy roads (???), situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that "The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control," so it's all that shit, and anything else - if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.
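If it helps, here's a minimal sketch of the speed arbitration I'd expect from a system like this, written as hypothetical Python. None of these function names or numbers come from Tesla; it's just the advertised behavior as I read it:

```python
# Hypothetical sketch of the speed arbitration I'd expect from a
# traffic-aware cruise control. Not Tesla's actual code; all names
# and parameters here are made up for illustration.

def commanded_speed(set_speed, pedal_speed, gap_m, safe_gap_m, follow_speed):
    """Return the speed the car should actually command.

    set_speed    -- cruise set point chosen by the driver
    pedal_speed  -- speed implied by the driver's accelerator input
    gap_m        -- measured distance to traffic ahead (None if road is clear)
    safe_gap_m   -- minimum following distance to maintain
    follow_speed -- speed that holds safe_gap_m behind the lead vehicle
    """
    # The pedal can push the request above the set point...
    requested = max(set_speed, pedal_speed)
    # ...but detected traffic should cap the command regardless,
    # rather than letting pedal input cancel the braking logic.
    if gap_m is not None and gap_m <= safe_gap_m:
        return min(requested, follow_speed)
    return requested

# Example: set point 70 mph, driver flooring it to 95, lead vehicle
# 30 m ahead doing 55 -> the car should command 55, not 95.
print(commanded_speed(70, 95, gap_m=30, safe_gap_m=50, follow_speed=55))  # 55
```

The point is that pedal input raising your speed is fine; pedal input silently disabling the collision logic is not.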
-
like regulators not allowing dangerous products,
I include human drivers in the list of dangerous products I don't want allowed. The question is whether self-driving is safer overall (despite possible regressions like this). I don't want regulators to pick favorites; I want them to find "the truth."
Sure, we're in agreement as far as that goes. My point was just that the commenter above me was implying it should be common knowledge that Tesla self-driving hits motorcycles more often than other self-driving cars do. And whether their comment was about this or some other subject, I think it's counterproductive to be like "everyone knows that."
-
I would expect that to override gas pedal input, because the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not operate at the maximum speed.
So do you expect self-driving tech to override human action? Or do you expect human action to override self-driving tech?
I expect the human to override the system, not the other way around. Nobody claims to have a system that requires no human input, aside from limited and experimental implementations that are not road-legal nationwide. I kind of expect human input to override the robot, given that the fear is of robots making mistakes, even while the humans behind them get in drunk and hold down the throttle until they turn motorcyclists into red mist. But that's my assumption.
With the Boca one specifically, the guy got in his car inebriated. That was the first mistake, and the problem should never have happened. If the car were truly self-driving and automated, with no user input, this wouldn't have happened: it wouldn't have gone nearly 2.5x the speed limit, and it would have braked long before hitting someone in the road.
I have a Ninja 650. We all know the danger comes from things we cannot control, such as other people. I'd trust an actually automated car over a human driver every time, even with today's limited tech. The second the user gets an input, though? Zero trust.
-
Good to know; I'll stay away from those damn things when I ride.
I already do. Flip a coin: Heads, the car is operating itself and is therefore being operated by a moron. Tails, the owner is driving it manually and therefore it is being operated by a moron.
Just be sure to carefully watch your six when you're sitting at a stoplight. I've gotten out of the habit of sitting right in the center of the lane, because the odds are getting ever higher that I'll have to scoot out of the way of some imbecile who's coming in hot. That's hard to do when your front tire is 24" away from the license plate of the car in front of you.
-
Just be sure to carefully watch your six when you're sitting at a stoplight.
For me it depends on which bike I'm riding. If it's my 49cc scooter, I'll sit to the very right side of the lane for a quick escape while watching my mirrors like a hawk. On my XR500, I'll just filter to the front (legal in Utah).
-
Maybe, if that two-step determination of liability is really what the parent commenter had in mind.
I'm not so sure he'd agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and the software running on the car in general) be Free Software, putting them squarely and completely within the control of the vehicle owner.
I would assume everyone here would agree with that
-
It's hardly either/or, though. What we have here is empirical data showing that cars without lidar perform worse, so mandating lidar is grounded in empirical results. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality-statistics targets.
We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.
-
I imagine bicyclists must be affected as well if they're on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.
Time to go to the Netherlands.
human-driven cars still target bicyclists on purpose, so i don't see how teslas could be any worse…
p.s. painting a couple of lines on the side of the road does not make a safe bike lane… bike lanes need a physical barrier separating them from the road… like how curbs separate the road from sidewalks…
-
TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5
Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:
- The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
- This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
- The crashes are overwhelmingly Teslas rear-ending motorcyclists.
Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.
Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.
Remember, you have the right to self-defence, against both rogue robots and rogue humans.
-
teslas aren't even worthy of the designation "self-driving". They use cheap cameras instead of LIDAR. It should be illegal to call such junk "self-driving".
-
That seems like a spectacular oversight. How is it supposed to replicate human vision without depth perception?
The video 0x0 linked to in another comment describes the likely method used to infer distance to objects without a stereoscopic setup, and why it (likely) had issues determining distance in the cases where they hit motorcycles.
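For anyone not watching the video, here's a toy version of the idea in Python. It's a generic pinhole-camera/scale-expansion sketch, not the video's exact method or Tesla's actual pipeline; the focal length and object sizes are assumed numbers:

```python
# Toy illustration of monocular (single-camera) distance estimation,
# the kind of cue a camera-only car has to lean on. Generic pinhole
# and scale-expansion math with assumed numbers, nothing vendor-specific.

FOCAL_LENGTH_PX = 1400.0  # assumed effective focal length, in pixels

def pinhole_distance_m(assumed_height_m, pixel_height):
    """Distance from apparent size -- only as good as your guess
    at the object's real-world size (i.e., its class)."""
    return FOCAL_LENGTH_PX * assumed_height_m / pixel_height

# The same 30-pixel-tall blob at night:
print(pinhole_distance_m(1.5, 30))  # ~70 m if you assume it's a car
print(pinhole_distance_m(1.1, 30))  # ~51 m if you assume it's a motorcycle

def time_to_contact_s(pixel_height, growth_px_per_s):
    """Tau from scale expansion, h / (dh/dt): needs no size guess,
    but a small, slowly growing blob (a single distant taillight)
    gives a noisy answer that firms up only when it's already close."""
    return pixel_height / growth_px_per_s
```

Misclassify the object and your distance estimate is off by the ratio of the assumed sizes, which is one plausible way a narrow, single-taillight motorcycle gets rear-ended.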
-
TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5
Makes sense: statistically, a smaller sample to be trained on. Relatively easy fix, too: just retrain with more motorcycles in the data.
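Something like this, for instance. The dataset and counts are hypothetical, and this assumes a PyTorch-style training setup; the sampler is real PyTorch, everything else is illustrative:

```python
# Sketch of the "just retrain with more motorcycles" idea: oversample
# the rare class so each training batch is roughly balanced.

import torch
from torch.utils.data import WeightedRandomSampler

# Suppose labels[i] is the class of training example i:
# 0 = car, 1 = truck, 2 = motorcycle (rare)
labels = torch.tensor([0] * 9000 + [1] * 900 + [2] * 100)

class_counts = torch.bincount(labels).float()  # tensor([9000., 900., 100.])
class_weights = 1.0 / class_counts             # rare classes weigh more
sample_weights = class_weights[labels]         # one weight per example

sampler = WeightedRandomSampler(weights=sample_weights,
                                num_samples=len(labels),
                                replacement=True)

# Hand `sampler` to a DataLoader and motorcycles show up in training
# about as often as cars, instead of 1% of the time.
```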
-
Remember, you have the right to self-defence, against both rogue robots and rogue humans.
How do you plan to self-defend against a vehicle?
-
Affectively, does it realy mater if someone has slite misstakes in there righting?
I think i had a stroke reading that. Take your upvote and get out!
-
Propane cylinder. Mutually assured destruction.
-
We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.
Those are ways to gather empirical results, though they rely on artificial, staged situations.
I think it’s fine to have both. Seat belts save lives. I see no problem mandating them. It would not be markedly better
-
I would assume everyone here would agree with that
I mean, maybe, but previously when I've said that it's typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it's somehow suddenly too dangerous to allow owners to control their property just because software is involved.
-
Shouldn't be an issue if drivers use it as a more advanced cruise control. Unless there is catastrophic mechanical or override failure, these things will always be the driver's fault.
-
For me it depends on which bike I'm riding. If it's my 49cc scooter, I'll sit to the very right side of the lane for a quick escape while watching my mirrors like a hawk. On my XR500, I'll just filter to the front (legal in Utah).
wrote 8 days ago last edited byI filter to the front on my leg powered bike, most traffic light setups here have a region for bikes at the front of the cars.