Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
+1 for you. However, replace "Regards" with the more appropriate words from the German language. The first with an S, and the second an H. I will not type that shit, fuck Leon and I hope the fucking Nazi owned Tesla factory outside of Berlin closes.
-
Don't Waymos have remote drivers that take control in unexpected situations?
-
I feel like that as a driver. Teslas do not move at a consistent speed, which drives me mad
-
They have remote drivers that CAN take control in rare corner-case situations that the software can't handle. The vast majority of driving is done without humans in the loop.
-
Elon needs to take responsibility for their death.
-
It can't even perceive the depth of the lights?
-
My state allowed motorcycle filtering in 2019 (not the same as California’s lane splitting). They ran a study and found a ton of motorcyclists were being severely injured or killed while getting rear ended sitting at stop lights. Filtering allows them to move to the front of the traffic light while the light is red and traffic is stationary. Many people are super aggravated about it even though most of the world has been doing it basically forever.
-
As an engineer, I strongly agree with requirements based on empirical results rather than requiring a specific technology. The latter never ages well. Thank you.
-
Let's get this out of the way: Felon Musk is a nazi asshole.
Anyway, it should be criminal to publish these comparisons without human-driver statistics for reference. I'm so sick of articles that leave out hard data. Show me deaths per billion miles driven for Tesla, its competitors, and human drivers.
Then there's shit like the Boca Raton crash, where they mention the car going 100 in a 45 and killing a motorcyclist, then note that the only way to do that is to physically press the gas pedal, which disables emergency braking. Is it really a self-driving car at that point, when the user must actively intervene to disable part of the automation? If you take an action to override stopping, it's not self-driving; stopping is a core function of how self-driving tech self-drives. It's not like the car swerved into another lane and nailed someone. The driver literally did this.
Bottom line: I see the media coverage of self-driving tech as sensationalist. Danger drives clicks. Felon Musk is a Nazi asshole, but self-driving tech isn't made by the guy; it's made by engineers. I wouldn't buy a Tesla unless he had no stake in the business, but I do believe people are far more dangerous behind the wheel in basically all typical driving scenarios.
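The per-mile normalization the comment asks for is simple arithmetic; here's a minimal sketch. All the fleet names and figures below are made-up placeholders for illustration, not real crash data:

```python
# Hypothetical sketch of deaths-per-billion-miles normalization.
# Every number below is a made-up placeholder, NOT real crash data.

def deaths_per_billion_miles(deaths: int, miles: float) -> float:
    """Normalize a raw fatality count by exposure (total miles driven)."""
    return deaths / (miles / 1e9)

# Placeholder inputs: fleet -> (fatalities, total miles driven)
fleets = {
    "Tesla (system engaged)":   (5, 2.0e9),
    "Other ADAS fleets":        (2, 1.5e9),
    "Human drivers (baseline)": (40_000, 3.2e12),
}

for name, (deaths, miles) in fleets.items():
    rate = deaths_per_billion_miles(deaths, miles)
    print(f"{name}: {rate:.2f} deaths per billion miles")
```

The point of the exercise: a raw fatality count means nothing until it's divided by exposure, which is exactly the denominator these articles omit.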
-
I assume older motorcycles built before 2003 are still legal in the EU today, and that the riders are responsible for turning on the lights when riding those.
-
Bro I'm colorblind too and if you're not sure what color the light is, you have to stop. Don't put that on the rest of us.
-
Ok, maybe project managers are good for something.
-
Human vision is very, very, very good. If you think a camera installed in a car is even close to human eyesight, then you are extremely mistaken.
Human eyes are so far beyond it that it's hard to even quantify.
-
build, sell and drive
You two don't seem to strongly disagree. The driver is liable but should then sue the builder/seller for "self driving" fraud.
-
Trucks in general have gotten so big they are pedestrian deathtraps
-
Not with cameras alone, no.
-
"Critical Thinker" Yikes. Somehow the right made that a forbidden word in my mind because they hide behind that as an excuse for asking terrible questions etc.
Anyway. Allegedly the statistics for self-driving cars are rather mediocre. But sadly I haven't seen a good statistic either. The issue is that the automatable tasks are the lower-risk driving situations, so producing a good statistic is near impossible. E.g., miles driven are heavily skewed when the system is only engaged on highways. There are no simple numbers that will tell you anything of worth.
That being said, the headline should be about the specific mistake that happened, without sweeping claims (e.g., "self-driving is bad because motorcyclists die").
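The exposure-skew argument above can be shown with a toy calculation. All strata and numbers here are invented to illustrate the effect, not measurements:

```python
# Toy illustration (made-up numbers) of why aggregate per-mile rates
# mislead when an autonomy system mostly racks up low-risk highway miles.

def rate(crashes, miles):
    return crashes / miles * 1e9  # crashes per billion miles

# Stratum -> (crashes, miles driven). Placeholders, not real data.
human = {"highway": (10, 1.0e9), "city": (90, 1.0e9)}
auto  = {"highway": (9,  1.8e9), "city": (30, 0.2e9)}

# Aggregate rates: autonomy looks far safer overall...
human_total = rate(sum(c for c, _ in human.values()), sum(m for _, m in human.values()))
auto_total  = rate(sum(c for c, _ in auto.values()),  sum(m for _, m in auto.values()))
print(f"overall: human {human_total:.1f}, autonomy {auto_total:.1f}")

# ...but within each stratum the picture changes, because 90% of the
# autonomy miles in this toy are easy highway miles.
for stratum in ("highway", "city"):
    print(stratum, f"human {rate(*human[stratum]):.1f}",
          f"autonomy {rate(*auto[stratum]):.1f}")
```

In this toy data the overall autonomy rate beats the human rate, yet in the city stratum autonomy is worse; the headline number is driven by where the miles were accumulated, not by how safe the system is.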
-
These fatalities are a Tesla business advantage. Every one is a data point they can use to program their self-driving intelligence. No one has killed as many as Tesla, so no one knows more about what kills people than Tesla. We don’t have to turn this into a bad thing just because they’re killing people /s
-
There seem to be people/bots down-voting critical takes up and down this very thread. What chumps.