Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
I imagine bicyclists must be effected as well if they're on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.
Time to go to the Netherlands.
*affected
-
Elon needs to take responsibility for their deaths.
That's why Tesla's "Full Self-Driving" is officially still a Level 2 cruise control. But of course they promise to jump directly to Level 4 soon.
-
Thank you for your service.
-
WHY CAN'T WE JUST HAVE PUBLIC TRANSIT, FUCK! TRAINS EXIST!
-
Why? Crash rates for self-driving cars (when adjusted for crash severity) are lower.
Removing sensors to save costs on self-driving vehicles should be illegal.
-
Affectively, does it realy mater if someone has slite misstakes in there righting?
-
+1 for you. However, replace "Regards" with the more appropriate words from the German language. The first starts with an S, and the second with an H. I will not type that shit. Fuck Leon, and I hope the fucking Nazi-owned Tesla factory outside of Berlin closes.
Yes, I'm not writing that shit, even in a sarcastic post. But I get your drift.
On the other hand, VW Group is absolutely killing it on EVs recently, IMO.
They totally dominate the EV top 10 here in Denmark, with 7 of the 10 top-selling models!
They are competitively priced, and they offer the best combination of quality and range in their price brackets.
-
Because I do journalism, and sometimes I even do good journalism!
In that case, you wouldn't happen to know whether or not Teslas are unusually dangerous to bicycles too, would you?
Surprisingly, there is a data bucket for accidents with bicyclists, but hardly any bicycle crashes are reported.
That either means they are not occurring (woohoo!), or they are being lumped into one of the multiple pedestrian buckets (not woohoo!), or they are sitting in the absolutely fucking vast collection of "severity: unknown" accidents where we have no details and Tesla requested redaction to make finding the details very difficult.
-
"Critical Thinker" Yikes. Somehow the right made that a forbidden word in my mind because they hide behind that as an excuse for asking terrible questions etc.
Anyway. Allegedly the statistics are rather mediocre for self driving cars. But sadly I haven't seen a good statistic about that, either. The issue here is that automatable tasks are lower risk driving situations so having a good statistic is near impossible. E.g. miles driven are heavily skewed when you are only used on highways as a driver. There are no simple numbers that will tell you anything of worth.
That being said the title should be about the mistake that happened without fundamental statements (i.e. self driving is bad because motorcyclists die).
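To make that miles-driven skew concrete, here's a toy calculation (entirely made-up numbers, just to illustrate the trap): a system that only engages on easy highway miles can post a better overall per-mile crash rate than human drivers while still being worse on the only roads it actually drives.

```python
# Entirely made-up numbers -- NOT real crash data, just the shape of the trap.
# Miles are in millions; "crashes" are raw counts.
human = {
    "highway": {"miles_m": 500, "crashes": 500},   # 1.0 crashes per million miles
    "city":    {"miles_m": 500, "crashes": 2000},  # 4.0 crashes per million miles
}
automated = {
    "highway": {"miles_m": 100, "crashes": 150},   # 1.5 crashes per million miles
}

def crashes_per_million_miles(buckets):
    crashes = sum(b["crashes"] for b in buckets.values())
    miles = sum(b["miles_m"] for b in buckets.values())
    return crashes / miles

print(crashes_per_million_miles(human))      # 2.5 -> headline: "humans are worse!"
print(crashes_per_million_miles(automated))  # 1.5 -> "40% safer!"
# Compared like-for-like on highway miles only (1.5 vs 1.0), the automated
# system in this toy example is actually 50% worse.
```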
Did I ask a terrible question, or do you just not like anything being objective about the issue? I'm so far over on the left side ideologically that you'd be hard-pressed to find an issue I'm conservative on. I don't fit the Dem mold, though; I'm more of a Bernie type... though I am very critical in general. I don't just take things at face value. Anywho...
Saying that the statistics aren't great just underscores the point that we can't objectively determine how safe or unsafe anything is without good data.
-
Let's get this out of the way: Felon Musk is a Nazi asshole.
Anyway, it should be criminal to publish these comparisons without human-driver statistics for reference. I'm so sick of articles that leave out hard data. Show me deaths per billion miles driven for Tesla, for its competitors, and for human drivers.
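The arithmetic itself is trivial once somebody publishes the counts. A sketch, where the human baseline uses rough, ballpark US figures and the Tesla/competitor rows are exactly the placeholders these articles never fill in:

```python
# Deaths per billion vehicle-miles: the normalization the article should show.
def deaths_per_billion_miles(deaths: float, miles: float) -> float:
    return deaths / miles * 1e9

# Ballpark US human-driver baseline: roughly 40,000 deaths over roughly
# 3.2 trillion vehicle-miles per year. The per-fleet numbers for Tesla and
# its competitors are the missing data being complained about.
print(deaths_per_billion_miles(40_000, 3.2e12))  # ~12.5 deaths per billion miles
```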
Then there's shit like the Boca Raton crash, where they mention the car going 100 in a 45 and killing a motorcyclist, and then go on to say the only way to do that is to physically press the gas pedal, which also disables emergency braking. Is it really a self-driving car at that point, when a user must actively intervene to disable portions of the automation? If you take an action that overrides stopping, it's not self-driving; stopping is a key part of how self-driving tech self-drives. It's not like the car swerved into another lane and nailed someone. The driver literally did this.
Bottom line, I look at the media around self-driving tech as sensationalist. Danger drives clicks. Felon Musk is a Nazi asshole, but self-driving tech isn't made by the guy; it's made by engineers. I wouldn't buy a Tesla unless he had no stake in the business, but I do believe people are far more dangerous behind the wheel in basically all typical driving scenarios.
In Boca Raton, I've seen no evidence that the self-driving tech was inactive. According to the government, it is reported as a self-driving accident, and according to the driver in his court filings, it was active.
Insanely, you can slam on the gas in Tesla's self-driving mode, accelerate to 100MPH in a 45MPH zone, and strike another vehicle, all without the vehicle's "traffic aware" automation effectively applying a brake.
That's not sensationalist. That really is just insanely designed.
-
It's hard to tell, but after about 15 minutes of searching, I was unable to locate any consumer vehicles that include a LIDAR system. Lots of cars include RADAR for object detection, even multiple RADAR units for parking. There may be some that include a time-of-flight sensor, which is like LIDAR but static, and lacks the resolution/fidelity. My Mach-E, which has Level 2 automation, uses a combination of computer vision, RADAR, and GPS. I was unable to locate a LIDAR sensor for the vehicle.
The LIDAR system in Mark's video is quite clearly a pre-production device that is not affiliated with the manufacturer of the vehicle it was being tested on.
Adding, after more searching: it looks like the Polestar 3, some trim levels of the Audi A8, and the Volvo EX90 include a LIDAR sensor. Curious to see how the consumer-grade tech works out in the real world.
Please do not mistake this comment for "AI/computer vision" evangelism. I currently have a car that uses those technologies for automation, and I would not and do not trust my life or anyone else's to that system.
Mercedes uses LiDAR. They also operate the sole Level 3 driver automation system in the USA. Two models only, the new S-Class and EQS sedans.
Tesla alleges they'll be Level 4+ in Austin in 60 days, and just skip Level 3 altogether. We'll see.
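For anyone who hasn't memorized the jargon, the SAE J3016 levels being tossed around shake out roughly like this (an informal paraphrase, not the spec's exact wording):

```python
# Informal paraphrase of the SAE J3016 driving-automation levels.
SAE_LEVELS = {
    0: "No automation: warnings and momentary assistance only",
    1: "Driver assistance: steering OR speed control, not both",
    2: "Partial automation: steering AND speed, but a human must supervise constantly",
    3: "Conditional automation: the car drives itself in limited conditions; the human must take over on request",
    4: "High automation: no human needed within its operational domain (e.g. a geofenced robotaxi)",
    5: "Full automation: drives anywhere a human could, with no fallback driver at all",
}
```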
-
In Boca Raton, I've seen no evidence that the self-driving tech was inactive. According to the government, it is reported as a self-driving accident, and according to the driver in his court filings, it was active.
Insanely, you can slam on the gas in Tesla's self-driving mode, accelerate to 100MPH in a 45MPH zone, and strike another vehicle, all without the vehicle's "traffic aware" automation effectively applying a brake.
That's not sensationalist. That really is just insanely designed.
FTFA:
Certain Tesla self-driving technologies are speed capped, but others are not. Simply pressing the accelerator will raise your speed in certain modes, and as we saw in the police filings from the Washington State case, pressing the accelerator also cancels emergency braking.
That’s how you would strike a motorcyclist at such extreme speed, simply press the accelerator and all other inputs are apparently overridden.
If the guy smashes the gas, then just like with cruise control, I would not expect the vehicle to stop itself.
The guy admitted to being intoxicated and held the gas down... what's the self-driving contribution to that?
-
Mercedes uses LiDAR. They also operate the sole Level 3 driver automation system in the USA. Two models only, the new S-Class and EQS sedans.
Tesla alleges they'll be Level 4+ in Austin in 60 days, and just skip Level 3 altogether. We'll see.
Yeah, keep in mind that Elon couldn't get Level 3 working on a closed, pre-mapped circuit. The robotaxis were just remotely operated.
-
FTFA:
Certain Tesla self-driving technologies are speed capped, but others are not. Simply pressing the accelerator will raise your speed in certain modes, and as we saw in the police filings from the Washington State case, pressing the accelerator also cancels emergency braking.
That’s how you would strike a motorcyclist at such extreme speed, simply press the accelerator and all other inputs are apparently overridden.
If the guy smashes the gas, then just like with cruise control, I would not expect the vehicle to stop itself.
The guy admitted to being intoxicated and held the gas down... what's the self-driving contribution to that?
I know what's in the article, boss. I wrote it. No need to tell me FTFA.
TACC stands for Traffic Aware Cruise Control. If I have a self-driving technology like TACC active, and the car's sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input, because the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not operate at the maximum speed.
I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.
Here's the manual, if you're curious. It doesn't work in bright sunlight or fog, on excessively curvy roads (???), in situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that "The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control," so it's all that shit plus anything else: if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.
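Based purely on the behavior described in the article and the Washington State police filings (not on anything like actual Tesla source code), the arbitration seems to amount to the first function below; the second is what the phrase "traffic aware" would lead you to expect:

```python
# Speculative reconstruction of the described behavior -- NOT Tesla's code,
# just the logic implied by the article and the police filings.
def tacc_as_described(set_speed, pedal_speed, traffic_ahead):
    if pedal_speed > set_speed:
        # Pressing the accelerator raises the speed AND, per the filings,
        # cancels automatic emergency braking entirely.
        return pedal_speed
    return 0 if traffic_ahead else set_speed  # 0 = brake for traffic

def tacc_as_advertised(set_speed, pedal_speed, traffic_ahead):
    if traffic_ahead:
        return 0  # detected traffic should win over any pedal input
    return max(set_speed, pedal_speed)
```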
-
like regulators not allowing dangerous products,
I include human drivers in the list of dangerous products I don't want allowed. The question is whether self-driving is safer overall (despite possible regressions like this). I don't want regulators to pick favorites; I want them to find "the truth."
Sure, we're in agreement as far as that goes. My point was just that the commenter above me was implying it should be common knowledge that Tesla self-driving hits motorcycles more than other self-driving cars do. And whether their comment was about this or some other subject, I think it's counterproductive to be like "everyone knows that."
-
I know what's in the article, boss. I wrote it. No need to tell me FTFA.
TACC stands for Traffic Aware Cruise Control. If I have a self-driving technology like TACC active, and the car's sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input, because the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not operate at the maximum speed.
I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.
Here's the manual, if you're curious. It doesn't work in bright sunlight or fog, on excessively curvy roads (???), in situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that "The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control," so it's all that shit plus anything else: if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.
So do you expect self-driving tech to override human action? Or do you expect human action to override self-driving tech?
I expect the human to override the system, not the other way around. Nobody claims to have a system that requires no human input, aside from limited, experimental implementations that are not road-legal nationwide. I kind of expect human input to override the robot, given the fear of robots making mistakes, even while the humans behind them get in drunk and hold the throttle down until they turn a motorcyclist into red mist. But that's my assumption.
With the Boca Raton one specifically, the guy got into his car inebriated. That was the first mistake, the one that caused a problem that should never have happened. If the car were truly automated, with no user input, this wouldn't have happened. It wouldn't have gone nearly 2.5x the speed limit. It would have braked long before hitting someone in the road.
I have a Ninja 650. We all know the danger comes from things we cannot control, such as other people. I'd trust an actually automated car over a human driver every time, even with limited modern tech. The second the user gets an input, though? Zero trust.
-
Good to know. I'll stay away from those damn things when I ride.
I already do. Flip a coin: Heads, the car is operating itself and is therefore being operated by a moron. Tails, the owner is driving it manually and therefore it is being operated by a moron.
Just be sure to carefully watch your six when you're sitting at a stoplight. I've gotten out of the habit of sitting right in the center of the lane, because the odds are getting ever higher that I'll have to scoot out of the way of some imbecile who's coming in hot. That's hard to do when your front tire is 24" away from the license plate of the car in front of you.
-
I already do. Flip a coin: Heads, the car is operating itself and is therefore being operated by a moron. Tails, the owner is driving it manually and therefore it is being operated by a moron.
Just be sure to carefully watch your six when you're sitting at a stoplight. I've gotten out of the habit of sitting right in the center of the lane, because the odds are getting ever higher that I'll have to scoot out of the way of some imbecile who's coming in hot. That's hard to do when your front tire is 24" away from the license plate of the car in front of you.
For me it depends on which bike I'm riding. If it's my 49cc scooter, I'll sit to the very right side of the lane for a quick escape while watching my mirrors like a hawk. On my XR500, I'll just filter to the front (legal in Utah).
-
Maybe, if that two-step determination of liability is really what the parent commenter had in mind.
I'm not so sure he'd agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and the software running on the car in general) be Free Software, putting them squarely and completely within the control of the vehicle owner.
I would assume everyone here would agree with that.