After 50 million miles, Waymos crash a lot less than human drivers
-
You’re not wrong, but arguably that doesn’t invalidate the point: they do drive better than humans because they’re so much better at judging their own limitations.
If human drivers refused to enter dangerous intersections, stopped every time things started to look dangerous, and handed off to a specialist to handle problems, driving might not produce the mountain of corpses it does today.
That said, you’re of course correct that they still have a long way to go in technical driving ability and handling of adverse conditions. But it’s interesting to consider that simple policy, effectively enforced, is enough to cancel out all the advantages that human drivers currently still have.
driving might not produce the mountain of corpses it does today.
And people wouldn't be able to drive anywhere. Which could very well be a good thing, but still
-
They're not saying general road safety is 20x better. They're comparing an automated car ONLY on surface streets with lights, intersections, pedestrians, dogs, left turns, etc... to a professional truck driver mostly on highway miles.
That's fair. Comparing regular drivers doing typical city trips to commercial big rigs is a bit apples-and-oranges. I wonder how CDL data would compare when the self-driving semi-trucks start putting on miles. Aurora is about to launch in that exact space.
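To make the apples-and-oranges point concrete, here's a toy calculation; the numbers are invented purely for illustration, not real crash statistics. If highway miles are inherently safer per mile than city miles, a fleet that drives only city miles can post a worse blended crash rate than a highway-heavy fleet even while being safer in every individual environment:

```python
# Toy numbers only: shows how road-type mix skews blended per-mile crash rates.

def rate_per_million(crashes: int, miles: int) -> float:
    """Crashes per million miles driven."""
    return crashes / miles * 1_000_000

# Hypothetical robotaxi fleet: 100% city miles.
a_city = rate_per_million(30, 50_000_000)    # 0.6 per million miles

# Hypothetical trucking fleet: 90% highway (40 crashes / 90M mi),
# 10% city (10 crashes / 10M mi), blended below.
b_city = rate_per_million(10, 10_000_000)    # 1.0 per million miles
b_blend = rate_per_million(50, 100_000_000)  # 0.5 per million miles

# The robotaxi fleet is safer on city miles (0.6 < 1.0) yet looks worse
# on the blended number (0.6 > 0.5), purely because of the mileage mix.
print(a_city, b_city, b_blend)
```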
-
If you build it they will come
Because having a bus to pick up 7 people in a day is really efficient economically and environmentally...
-
So let me make sure I understand your argument. Because nobody can be held liable for one hypothetical death of a child when an accident happens with a self driving car, we should ban them so that hundreds of real children can be killed instead. Is that what you are saying?
No, this strawman is obviously not my argument. It's curious you're asking whether you understand, and then opining afterwards, rather than waiting for the clarification you suggest you're seeking. When someone responds to a no-brainer suggestion, grounded in skepticism but perfectly sensible nevertheless, with a strawman seemingly crafted to discredit it, one has to wonder if that someone is writing in good faith. Are you?
For anyone who is reading in good faith: we're clearly not talking about one hypothetical death, since more than one real death involving driverless car technology has already occurred, and there is no doubt there will be more in the future given the nature of conducting a several-ton hunk of metal across public roads at speed.
It should go without saying that hypothetical auto wreck fatalities occurring prior to the deployment of technology are not the fault of everyone who delayed the deployment of that technology, meaning in particular that these hypothetical deaths do not justify hastening deployment. This is a false conflation regardless of how many times Marc Andreessen and his apostles preach variations of it.
Finally "ban", or any other policy prescription for that matter, appeared nowhere in my post. That's the invention of this strawman's author (you can judge for yourself what the purpose of such an invention might be). What I urge is honestly attending to the serious and deadly important moral and justice questions surrounding the deployment of this class of technology before it is fully unleashed on the world, not after. Unless one is so full up with the holy fervor of technoutopianism that one's rationality has taken leave, this should read as an anodyne and reasonable suggestion.I was asking in good faith because the way you talk is not easily comprehensible. I can barely follow whatever argument you are trying to make. I think you are trying to say that we shouldn't allow them on the road until we have fully decided who is at fault in an accident?
Also, only one death has occurred so far involving driverless cars: a speeding SUV rammed into a stopped driverless car, then continued on and hit five other cars, killing someone. That's it. The only death involved a driverless car sitting still, not moving, not doing anything... and it wasn't even the car that hit the car in which the person died. So I would say deaths that are actually the fault of a driverless car are still hypothetical.
-
@ripcord unpredictable but maybe not standard practice? Just a guess, could be a bad assumption! British driving culture is reliant on eye contact and waves and nods and flashes - you have to signal if you're giving way (to other drivers as well), and say thank you; lots of places where there's only room for one vehicle on a two way road and someone has to decide who's going. Might be my failure of imagination but I don't know how that works with no driver.
It is absolutely common for people to do something unexpected in Las Vegas, particularly near the Strip and other pedestrian-heavy, gambling/drinking-heavy areas.
Erratic driving is also higher than average for most western cities.
-
Thanks, but I am not; others on the road, however, are abysmal.
I find the scariest people on the road to be the arrogant ones that think they make no mistakes.
-
*human drivers remotely controlling cars crash less than humans directly controlling cars
But it's not like that. There's some kind of ML involved, but they also had to map out their entire service area, etc. And if something goes wrong, a human has to show up and drive your driverless car lmao
-
@dogslayeggs this is not a good solution unless you're expecting to mandate that all pedestrians, cyclists, scooter riders, guide dogs, whatever, wear them too, and that all existing cars are retrofitted with them. Kind of dystopian.
I was clearly only talking about cars, not pedestrians. Driverless cars have already shown they are pretty good at avoiding pedestrians and cyclists and scooters and dogs. Even in the case of the pedestrian hit by the Cruise car, that pedestrian was hit by another car first and then thrown into the path of the Cruise. The one case of a dog hit by a driverless car was a dog running out from behind parked cars with no time for even a human to stop, let alone the Waymo... and dogs don't usually wave and signal to drivers on the road.
As far as retrofitting cars goes, this is about improving the current system, not requiring 100% compliance. Do you ban people from driving on the roads if they don't wave at you on a one-car-wide road? No. So you don't have to ban cars that don't have this tech. But when more and more cars DO have the tech, then you get improvements over time.
-
That's fair. Comparing regular drivers doing typical city trips to commercial big rigs is a bit apples-and-oranges. I wonder how CDL data would compare when the self-driving semi-trucks start putting on miles. Aurora is about to launch in that exact space.
I'm honestly more scared of that. Professional CDL drivers are WAY better at driving than other people. But their trucks are way more dangerous and harder to handle. So putting driverless tech in that is going to be harder and more dangerous.
-
This post did not contain any content.
We always knew good quality self-driving tech would vastly outperform human skill. It's nice to see some decent metrics!
-
Because they are driving under near ideal conditions, in areas that are completely mapped out, guided away from roadworks, and kept away from "confusing" crossings and other traffic situations, like unmarked roads, that humans deal with routinely without problems.
And in a situation they can't handle, they just stop, call for help, and wait for a human driver to get them going again, regardless of whether they are blocking traffic.
I'm not blaming Waymo for doing it as safely as they can; that's great IMO. But don't make it sound like they drive better than humans yet. There is still some way to go.
What's really obnoxious is that Elon Musk claimed this would be 100% ready by 2017: full self-driving, across America, day and night, safer than a human. I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
I think "near ideal conditions" is a huge exaggeration. The situations Waymo avoids are a small fraction of the total mileage driven by Waymo vehicles or the humans they're being compared with. It's like you're saying a football team's stats are grossly wrong if they don't include punt returns.
-
I think the fair comparison would be humans that drive legally.
Idiots that drive high or drunk or without prescription glasses or whatever shouldn't count as "normal" human driving.
In the same way, a self-driving car can have issues that will make it illegal.
The problem is that a legal self-driving Tesla is not as safe as a legal person. It sees poorly at night, and it gets confused in situations people handle routinely. And Tesla is infamous for not stopping when the road is blocked from 1m and up, and for breaking without reason. I've seen videos where they demonstrated an unnecessary break every ½ hour!! And a large part of that was on the German Autobahn, which is probably some of the easiest driving in the world!!
And Tesla is infamous … for breaking without reason.
No notes!
-
I live on a 40mph road with no sidewalk or shoulder. That is connected to a 45mph road with no sidewalk or shoulder. My nearest bus stop is 3.2 miles away.
I'm not even that far out, I can drive to a major city downtown in 30 minutes.
That's great that you have all this infrastructure around you, but not everyone does. Like you said, a lack of perspective is not an excuse.
That's not out of necessity; it's a design decision. You could have one nearby with the right elected officials and public effort. You also chose where to live, with the ability to know where existing stops are. If you chose to live away from a bus stop or other public transport, then that's on you.
-
Because they are driving under near ideal conditions, in areas that are completely mapped out, guided away from roadworks, and kept away from "confusing" crossings and other traffic situations, like unmarked roads, that humans deal with routinely without problems.
And in a situation they can't handle, they just stop, call for help, and wait for a human driver to get them going again, regardless of whether they are blocking traffic.
I'm not blaming Waymo for doing it as safely as they can; that's great IMO. But don't make it sound like they drive better than humans yet. There is still some way to go.
What's really obnoxious is that Elon Musk claimed this would be 100% ready by 2017: full self-driving, across America, day and night, safer than a human. I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
RoboTaxis will also have to "navigate" the Fashla hate. Not many will be eager to risk their lives with them
-
Where? I haven't heard of any rail lines that don't have a human operator onboard or somewhere in the loop?
I was on the newly opened Thessaloniki (Greece) subway line and it was autonomous.
-
And Tesla is infamous … for breaking without reason.
No notes!
We discussed the test here on Lemmy a few days ago.
I can't find the video that was debated, but you can't be serious about not knowing about this issue?!?!?
It's a years-old issue that is still not fixed!!!
https://www.carscoops.com/2025/02/german-court-finds-teslas-autopilot-defective-after-lawsuit/
-
You are completely ignoring the "under ideal circumstances" part.
They can't drive at night AFAIK, they can't drive outside the area that is meticulously mapped out.
And even then, they often require human intervention.
If you asked a professional driver to do the exact same thing, I'm pretty sure that driver would have a way better accident record than average humans too.
Seems to me you are missing the point I tried to make, and drawing a false conclusion based on comparing apples to oranges.
Waymo can absolutely drive at night, I’ve seen them do it. They rely heavily on LIDAR, so the time of day makes no difference to them.
And apparently they only disengage and need human assistance every 17,000 miles, on average. Contrast that to something like Tesla’s “Full Self Driving” (ignoring the controversy over whether it counts or not), where the most generous numbers I could find for it are a disengagement every 71 city miles, on average, or every 245 city miles for a “critical disengagement.”
You are correct in that Waymo is heavily geofenced, and that’s pretty annoying sometimes. I tried to ride one in Phoenix last year, but couldn’t get it to pick me up from the park I was visiting because I was just on the edge of their area. I suspect they would likely do fine if they went outside of their zones, but they really want to make sure they’re going to be successful so they’re deliberately slow-rolling where the service is available.
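Taking the figures above at face value (they're rough public numbers, measured under very different conditions, so treat this as back-of-the-envelope only), the gap is easy to quantify:

```python
# Back-of-the-envelope ratio of the disengagement figures cited above.
# Inputs are the rough public numbers from the comment, not authoritative data.

waymo_miles_per_disengagement = 17_000
fsd_city_miles_per_disengagement = 71
fsd_city_miles_per_critical = 245

# ~239x more miles between disengagements than FSD's overall city figure
print(waymo_miles_per_disengagement / fsd_city_miles_per_disengagement)

# ~69x more miles between disengagements than FSD's "critical" city figure
print(waymo_miles_per_disengagement / fsd_city_miles_per_critical)
```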
-
I find the scariest people on the road to be the arrogant ones that think they make no mistakes.
I hope this is a copypasta lmao. If you actually go to a training course where you learn to handle oversteer and understeer and they spin you out, they tell you that you have about fuck-all chance of recovering; even when you have warning, you know it's coming, and you're at a fairly low speed, you have very little chance of counter-steering correctly.
Here is what you actually have to do to drive safely:
1. Don't be a dumbass that thinks you need to go through 12 years of Formula 1 training to drive on the road.
2. Don't be a dumbass: adjust your speed to driving conditions.
3. Don't be a dumbass: don't push the limits of your car on public roads.
4. Drive defensively: assume people on the road are idiots who will fuck up, and drive accordingly.
5. Learn how your car works, e.g. even if you have an e-handbrake, you can still pull on it and it will stop the car.
6. And most important, because people don't know how to do it: learn to emergency brake, meaning brake hard enough that your hazard lights come on.
-
Waymo can absolutely drive at night, I’ve seen them do it. They rely heavily on LIDAR, so the time of day makes no difference to them.
And apparently they only disengage and need human assistance every 17,000 miles, on average. Contrast that to something like Tesla’s “Full Self Driving” (ignoring the controversy over whether it counts or not), where the most generous numbers I could find for it are a disengagement every 71 city miles, on average, or every 245 city miles for a “critical disengagement.”
You are correct in that Waymo is heavily geofenced, and that’s pretty annoying sometimes. I tried to ride one in Phoenix last year, but couldn’t get it to pick me up from the park I was visiting because I was just on the edge of their area. I suspect they would likely do fine if they went outside of their zones, but they really want to make sure they’re going to be successful so they’re deliberately slow-rolling where the service is available.
Waymo can absolutely drive at night
True, I just looked it up; my information was outdated.
-
This post did not contain any content.
As a techno-optimist, I always expected self-driving to quickly become safer than human, at least in relatively controlled situations. However I’m at least as much a pessimist of human nature and the legal system.
Given self-driving vehicles that are demonstrably safer than humans, but not perfect, how can we get past humans taking advantage of them, and past the massive liability for the remaining accidents?