Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
I assume older motorcycles built before 2003 are still legal in the EU today, and that the riders are responsible for turning on the lights when riding them.
-
Bro I'm colorblind too and if you're not sure what color the light is, you have to stop. Don't put that on the rest of us.
-
Ok, maybe project managers are good for something.
-
Human vision is very, very, very good. If you think a camera installed in a car is even close to human eyesight, then you are extremely mistaken.
Human eyes are so far beyond that it's hard to even quantify.
-
build, sell and drive
You two don't seem to strongly disagree. The driver is liable but should then sue the builder/seller for "self driving" fraud.
-
Trucks in general have gotten so big they are pedestrian deathtraps
-
Not with cameras alone, no.
-
"Critical Thinker" Yikes. Somehow the right made that a forbidden word in my mind because they hide behind that as an excuse for asking terrible questions etc.
Anyway. Allegedly the statistics for self-driving cars are rather mediocre, but sadly I haven't seen a good statistic on that, either. The issue is that the automatable tasks are the lower-risk driving situations, so getting a meaningful statistic is near impossible. E.g. miles driven are heavily skewed when the system is only engaged on highways. There are no simple numbers that will tell you anything of worth.
That being said, the title should focus on the mistake that happened, without sweeping conclusions (i.e. "self-driving is bad because motorcyclists die").
-
These fatalities are a Tesla business advantage. Every one is a data point they can use to program their self-driving intelligence. No one has killed as many as Tesla, so no one knows more about what kills people than Tesla. We don’t have to turn this into a bad thing just because they’re killing people /s
-
There seem to be people/bots down-voting critical takes up and down this very thread. What chumps.
-
This sounds good until you realize how unsafe human drivers are. People won’t accept a self-driving system that’s only 50% safer than humans, because that will still be a self-driving car that kills 20,000 Americans a year. Look at the outrage right here, and we’re nowhere near those numbers. I also don’t see anyone comparing these numbers to human drivers on any per-mile basis. Waymos compared favorably to human drivers in their most recently released data. Does anyone even know where Teslas stand compared to human drivers?
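The "20,000 Americans a year" figure follows from simple arithmetic; here's a minimal sketch, assuming the roughly 40,000 annual US road deaths that the comment implies as its baseline:

```python
# Baseline implied by the comment: roughly 40,000 US road deaths per year
# with human drivers (an assumed round number, not an official statistic).
human_deaths_per_year = 40_000

# A system that is "50% safer" halves the fatality rate.
safety_improvement = 0.5
self_driving_deaths = int(human_deaths_per_year * (1 - safety_improvement))

print(self_driving_deaths)  # 20000
```

The point being: even a large relative improvement still leaves an absolute death toll that the public may find unacceptable when each incident is attributed to software.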
-
So they say
-
That seems like a spectacular oversight. How is it supposed to replicate human vision without depth perception?
-
It’s hardly either / or though. What we have here is empirical data showing that cars without lidar perform worse, so mandating lidar is grounded in empirical results. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality-statistics targets.
-
He may not be an engineer, but he's the one who made the decision to use strictly cameras rather than lidar, so yes, he's responsible for these fatalities that other companies don't have. You may not be a fan of Musk, but it sounds like you're a fan of Tesla.
-
can you pick all the hottie’s?
... the hottie's what?
-
No, emergency braking with radar is mature and cheap. Lidar is very expensive and relatively nascent
-
Good to know, I'll stay away from those damn things when I ride.
-
Maybe, if that two-step determination of liability is really what the parent commenter had in mind.
I'm not so sure he'd agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and car software generally) be Free Software, squarely and completely under the control of the vehicle owner.