Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
jonne@infosec.pub replied to Guest 4 days ago
Whatever it is, it's unacceptable and they should really ban Tesla's implementation until they fix some fundamental issues.
-
Lidar needs to be a mandated requirement for these systems.
-
kayleadfoot@fedia.io replied to Guest 4 days ago
I also saw that theory! That's in the first link in the article.
The only problem with the theory: Many of the crashes are in broad daylight. No lights on at all.
I didn't include the motorcycle make and model, but I did find it. Because I do journalism, and sometimes I even do good journalism!
The models I found are: Kawasaki Vulcan (a cruiser bike, just like the Harleys you describe), Yamaha YZF-R6 (a racing-style sport bike with high-mount lights), and a Yamaha V-Star (a "standard" bike, fairly low lights, and generally a low-slung bike). Weirdly, the bike models run the full gamut of the different motorcycles people ride on highways; every type is represented (sadly) in the fatalities.
I think you're onto something with the faulty depth sensors. Sensing distance is difficult with optical sensors. That's why Tesla would be alone in the motorcycle fatality bracket, and that's why it would always be rear-end crashes by the Tesla.
-
treadful@lemmy.zip replied to Guest 4 days ago
Still probably a good idea to keep an eye on that Tesla behind you. Or just let them past.
-
kayleadfoot@fedia.io replied to Guest 4 days ago
... Also accurate.
God, it really is a nut punch. The system detects the crash is imminent.
Rather than automatically try to evade... the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.
-
jonne@infosec.pub replied to Guest 4 days ago
Yep, that one was purely about hitting a certain KPI of 'miles driven on autopilot without incident'. If it turns off before the accident, technically the driver was in control and to blame.
-
excessshiv@lemmy.dbzer0.com replied to Guest 4 days ago
The ridiculous thing is, it has 3 cameras pointing forward; you only need 2 to get stereoscopic depth perception with cameras... why the fuck are they not using that!?
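For anyone curious why two cameras are enough: with a pair of forward cameras a known distance apart, the pixel offset (disparity) of an object between the two views gives its distance directly. A minimal sketch of the geometry; the focal length, baseline, and disparity values below are illustrative assumptions, not Tesla's actual camera specs:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in metres from a rectified stereo pair:
    Z = f * B / d, where f is focal length (pixels), B is the
    baseline between cameras (metres), d is disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: object at infinity or match failed")
    return focal_px * baseline_m / disparity_px

# Assumed values: 1000 px focal length, 30 cm baseline.
# A motorcycle producing a 6 px disparity would then sit at:
print(stereo_depth(1000.0, 0.3, 6.0))  # 50.0 (metres)
```

Note the trade-off the formula exposes: depth error grows as disparity shrinks, so a short baseline (cameras mounted close together) gives poor range resolution on distant objects.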
-
neonachtwaechter@lemmy.world replied to Guest 4 days ago
Even when it is just milliseconds before the crash, the computer turns itself off.
Later, Tesla brags that the autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.
-
neonachtwaechter@lemmy.world replied to Guest 4 days ago
so it won't show up in the stats
Hopefully they've wised up by now and record these stats properly...?
-
neonachtwaechter@lemmy.world replied to Guest 4 days ago
Are you saying Harley drivers are fair game?
-
buffalox@lemmy.world replied to Guest 4 days ago
Hey guys relax! It's all part of the learning experience of Tesla FSD.
Some of you may die, but that's a sacrifice I'm willing to make.
Regards,
Elon Musk
CEO of Tesla
-
Most frustrating thing is, as far as I can tell, Tesla doesn't even have binocular vision, which makes all the claims about humans being able to drive with vision only even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?
-
jonne@infosec.pub replied to Guest 4 days ago
If they ever fixed it, I'm sure Musk has fired whoever was keeping score by now. He's going to launch the robotaxi stuff soon and it's going to kill a bunch of people.
-
kayleadfoot@fedia.io replied to Guest 4 days ago
NHTSA collects data if self-driving tech was active within 30 seconds of the impact.
The companies themselves do all sorts of wildcat shit with their numbers. Tesla's claimed safety factor right now is 8x human: driving with FSD is supposedly 8x safer than the average human driver. That's what they say on their stock earnings calls. Of course, that's not true based on any data I've seen; they haven't published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its 12x-safer-than-human system).
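For reference, an "Nx safer than human" figure is just a ratio of crash rates over comparable miles, which is exactly why the disengagement-counting games matter: crashes reclassified as "driver in control" shrink the numerator. A sketch with purely illustrative numbers, not Tesla's or Waymo's actual data:

```python
def claimed_safety_factor(human_crashes_per_m_miles: float,
                          system_crashes_per_m_miles: float) -> float:
    """The marketing multiplier: baseline human crash rate divided by
    the system's reported crash rate, over comparable driving."""
    return human_crashes_per_m_miles / system_crashes_per_m_miles

# Illustrative assumption: humans at 4.0 crashes per million miles.
# If the system reports 0.5, the claim is "8x safer":
print(claimed_safety_factor(4.0, 0.5))  # 8.0

# But if half the system's crashes were excluded because autopilot
# disengaged just before impact, the honest rate is 1.0, and the
# real factor is only:
print(claimed_safety_factor(4.0, 1.0))  # 4.0
```

The point: the ratio is only as trustworthy as the crash-attribution rule behind it, which is why external verifiability matters.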
-
thegrandnagus@lemmy.world replied to Guest 4 days ago
Tesla's argument of "well human eyes are like cameras therefore we shouldn't use LiDAR" is so fucking dumb.
Human eyes have good depth perception and absolutely exceptional dynamic range and focusing ability. They also happen to be linked up to a rapid and highly efficient supercomputer far outclassing anything humanity has ever devised, certainly any computer added to a car.
And even with all those advantages humans have, we still crash from time to time and make smaller mistakes regularly.
-
lnxtx@feddit.nl replied to Guest 4 days ago
Stop dehumanizing drivers who killed people.
The feature, misleadingly called Full Self-Driving, must be supervised at all times.
-
expatriado@lemmy.world replied to Guest 4 days ago
As a daily rider, I must add having a Tesla behind me to the list of road hazards to look out for.
-
neonachtwaechter@lemmy.world replied to Guest 4 days ago
P.S. Volunteers needed for the Mars mission as well.