Self-Driving Teslas Are Fatally Striking Motorcyclists More Than Any Other Brand: New Analysis
-
human-driven cars still target bicyclists on purpose, so i don’t see how teslas could be any worse…
p.s. painting a couple lines on the side of the road does not make a safe bike lane… they need a physical barrier separating the road from them… like how curbs separate the road from sidewalks…
-
Remember, you have the right to self-defence, against both rogue robots and rogue humans.
-
teslas aren't even worthy of the designation "self-driving". They use cheap cameras instead of LIDAR. It should be illegal to call such junk "self-driving".
-
The video 0x0 linked to in another comment describes the likely method used to infer distance to objects without a stereoscopic setup, and why it (likely) had issues determining distance in the cases where they hit motorcycles.
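For anyone curious about the mechanics: a common way to infer distance from a single camera is the pinhole-camera relation, where an object of assumed real-world size yields a distance from its apparent size in pixels. A minimal sketch of that idea (the focal length and vehicle widths below are made-up illustrative numbers, not anything from Tesla's actual system):

```python
# Monocular distance estimate via the pinhole-camera model:
#   distance = focal_length_px * real_width_m / apparent_width_px
# All constants here are illustrative assumptions.

def estimate_distance(focal_length_px: float, real_width_m: float,
                      apparent_width_px: float) -> float:
    """Distance to an object of known physical width from its pixel width."""
    return focal_length_px * real_width_m / apparent_width_px

# A motorcycle (~0.8 m wide) spanning 40 px with a 1000 px focal length:
d_moto = estimate_distance(1000, 0.8, 40)   # 20.0 m
# The same 40 px, if misread as a car (~1.8 m wide), implies it is farther:
d_car = estimate_distance(1000, 1.8, 40)    # 45.0 m

print(d_moto, d_car)
```

This also shows the failure mode the video describes: the estimate depends on what the system thinks it is looking at, so a narrow motorcycle misclassified as a wider, more distant vehicle reads as much farther away than it is.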
-
I think I had a stroke reading that. Take your upvote and get out!
-
Propane cylinder. Mutually assured destruction.
-
Those are ways to gather empirical results, though they rely on artificial, staged situations.
I think it’s fine to have both. Seat belts save lives. I see no problem mandating them. It would not be markedly better
-
I mean, maybe, but previously when I've said that it's typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it's somehow suddenly too dangerous to allow owners to control their property just because software is involved.
-
I filter to the front on my leg-powered bike; most traffic light setups here have a region for bikes in front of the cars.
-
For what it's worth, it really isn't clear whether this is FSD or AP, given the constant mention of "self driving" even for older collisions when it would definitely have been AP.
So these may all be AP, or one or two might be FSD; it's unclear.
Every Tesla has AP as well, so the likelihood of that being the case is higher.
-
This is another reason I’ll never drive a motorcycle. Fuck that shit.
-
They don't even do that.
They can suggest what the car should do, but they aren't actually doing it. The car is in complete control.
It's a nuanced difference, but it is a difference. A Waymo employee never takes direct control of or operates the vehicle.
-
Interesting! I did not know that - I assumed the teleoperators took direct control, but that makes much more sense for latency reasons (among others)
-
It's like smoking: if you haven't started, don't XD
-
I always just assumed it was their way to ensure the vehicle was really autonomous: if you have someone remotely driving it, you could argue it isn't actually an AV. Your latency idea makes a lot of sense as well, though. Imagine taking over and causing an accident due to latency. This way, even if the operator gives a bad suggestion, it was the car that ultimately did it.
-
Unless it's a higher rate than human drivers per mile or per hour driven, I do not care. The article doesn't have those stats, so it's clickbait as far as I'm concerned.
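This is the right framing: raw fatality counts mean nothing without exposure. A quick sketch of the comparison the article should have done (all numbers below are placeholders, not real data):

```python
# Normalizing fatality counts by exposure (miles driven).
# The counts and mileage figures here are invented placeholders --
# the article provides neither, which is the commenter's complaint.

def fatalities_per_billion_miles(fatalities: int, miles: float) -> float:
    """Fatality rate normalized to one billion vehicle-miles."""
    return fatalities / (miles / 1e9)

# Hypothetical: 5 fatalities over 2.5 billion miles of automated driving
automated_rate = fatalities_per_billion_miles(5, 2.5e9)   # 2.0

# Hypothetical: 40,000 fatalities over 3.2 trillion human-driven miles
human_rate = fatalities_per_billion_miles(40_000, 3.2e12) # 12.5

print(automated_rate, human_rate)
```

With made-up numbers either system can come out "safer," which is exactly why the raw count in the headline proves nothing on its own.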