Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
FTFA:
Certain Tesla self-driving technologies are speed capped, but others are not. Simply pressing the accelerator will raise your speed in certain modes, and as we saw in the police filings from the Washington State case, pressing the accelerator also cancels emergency braking.
That’s how you would strike a motorcyclist at such extreme speed: simply press the accelerator, and all other inputs are apparently overridden.
If the guy smashes the gas then, just like with cruise control, I would not expect the vehicle to stop itself.
The guy admitted to being intoxicated and held the gas down... what's the self-driving contribution to that?
-
Yeah, keep in mind that Elon couldn't get Level 3 working on a closed, pre-mapped circuit. The robotaxis were just remotely operated.
-
I know what's in the article, boss. I wrote it. No need to tell me FTFA.
TACC stands for Traffic-Aware Cruise Control. If I have a self-driving technology like TACC active and the car's sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input: the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised rather than blindly run at that maximum (rough sketch of what I mean at the end of this comment).
I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.
Here's the manual, if you're curious. It doesn't work in bright sunlight, fog, excessively curvy roads (???), situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that "The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control," so it's all that shit, and anything else - if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.
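To make the expectation concrete, here's a toy sketch (Python; every name and number is made up, this is not Tesla's actual logic) of how I'd expect a traffic-aware controller to arbitrate between the accelerator and a detected vehicle ahead:

    # Toy sketch of the behavior I'd expect from a traffic-aware cruise
    # controller. Not Tesla's code; every name and number is made up.

    def commanded_speed(set_speed, pedal_speed, lead_vehicle):
        """Return the speed the controller should actually command.

        set_speed    - cruise set point (mph)
        pedal_speed  - speed the driver is requesting via the accelerator
                       (can exceed set_speed)
        lead_vehicle - dict with 'distance' (ft) and 'speed' (mph),
                       or None if nothing is detected ahead
        """
        # The pedal can push the target above the set point...
        requested = max(set_speed, pedal_speed)

        # ...but if traffic is detected close ahead, the "traffic-aware"
        # part should still cap the commanded speed to hold a safe gap,
        # no matter what the pedal is asking for.
        if lead_vehicle is not None and lead_vehicle["distance"] < 150:
            requested = min(requested, lead_vehicle["speed"])

        return requested

The point being: the pedal raising your speed is fine, but a detected vehicle in your lane should always win.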
-
Sure, we're in agreement as far as that goes. My point was just that the commenter above me was implying it should be common knowledge that Tesla self-driving hits motorcycles more often than other self-driving systems do. And whether their comment was about this or some other subject, I think it's counterproductive to be like "everyone knows that."
-
So do you expect self-driving tech to override human action? Or do you expect human action to override self-driving tech?
I expect the human to override the system, not the other way around. Nobody claims to have a system that requires no human input, aside from limited, experimental implementations that are not road legal nationwide. I kind of expect human input to override the robot, given how much fear there is of robots making mistakes, even while the humans behind them get in drunk and hold the throttle down until they turn a motorcyclist into red mist. But that's my assumption.
With the Boca one specifically, the guy got in his car inebriated. That was the first mistake, and the whole thing should never have happened. If the car had been truly self-driving and automated, with no user input, this wouldn't have happened. It wouldn't have gone nearly 2.5x the speed limit. It would have braked long before hitting someone in the road.
I have a Ninja 650. We all know the danger comes from things we cannot control, like other people. I'd trust an actually automated car over a human driver every time, even with today's limited tech. The second the user gets an input, though? Zero trust.
-
I already do. Flip a coin: Heads, the car is operating itself and is therefore being operated by a moron. Tails, the owner is driving it manually and therefore it is being operated by a moron.
Just be sure to carefully watch your six when you're sitting at a stoplight. I've gotten out of the habit of sitting right in the center of the lane, because the odds are getting ever higher that I'll have to scoot out of the way of some imbecile who's coming in hot. That's hard to do when your front tire is 24" away from the license plate of the car in front of you.
-
For me it depends which bike I'm riding. If it's my 49cc scooter, I'll sit to the very right side of the lane for a quick escape while watching my mirrors like a hawk. On my XR500, I'll just filter to the front (legal in Utah).
-
I would assume everyone here would agree with that.
-
We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.
-
Humans driving cars still target bicyclists on purpose, so I don't see how Teslas could be any worse…
P.S. Painting a couple of lines on the side of the road does not make a safe bike lane… they need a physical barrier separating them from the road… like how curbs separate sidewalks from the road…
-
Remember, you have the right to self-defence, against both rogue robots and rogue humans.
-
Teslas aren't even worthy of the designation "self-driving". They use cheap cameras instead of LIDAR. It should be illegal to call such junk "self-driving".
-
The video 0x0 linked to in another comment describes the likely method used to infer distance to objects without a stereoscopic setup, and why it (likely) had issues determining distance in the cases where they hit motorcycles.
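For anyone who hasn't watched it: with a single (non-stereoscopic) camera, apparent size only translates into distance if you already know the object's real-world size. A made-up toy example of that pinhole-camera math (not the video's or Tesla's actual numbers or code):

    # Rough illustration only; all numbers invented for the example.
    def monocular_distance(focal_length_px, assumed_height_m, height_px):
        """Pinhole-model distance estimate from a single camera."""
        return focal_length_px * assumed_height_m / height_px

    focal = 1000.0   # hypothetical focal length, in pixels
    pixels = 40.0    # the object spans 40 pixels vertically in the frame

    # If the system assumes a car-sized object (~1.5 m of visible height)...
    print(monocular_distance(focal, 1.5, pixels))  # ~37.5 m away
    # ...but it's really a motorcycle and rider at ~1.2 m of visible height,
    # the true distance is only ~30 m, so the car thinks it has more room
    # than it actually does.
    print(monocular_distance(focal, 1.2, pixels))  # ~30.0 m

A motorcycle presents a much smaller profile than a car, so a bad size assumption makes the rider look farther away than they really are.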
-
I think I had a stroke reading that. Take your upvote and get out!
-
Propane cylinder. Mutually assured destruction.
-
Those are ways to gather empirical results, though they rely on artificial, staged situations.
I think it’s fine to have both. Seat belts save lives. I see no problem mandating them. It would not be markedly better
-
I mean, maybe, but when I've said that before, it's typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the Kool-Aid that it's somehow suddenly too dangerous to allow owners to control their own property just because software is involved.