After 50 million miles, Waymos crash a lot less than human drivers
-
Focusing on airbag deployments and injuries ignores the obvious problem: these things are unbelievably unsafe for pedestrians and bicyclists. I curse SF for allowing AVs and always give them a wide berth, because there's no way to know if they see you, and they often behave erratically and unpredictably in crosswalks. I don't give a shit how often the passengers are injured; I care a lot more about how much they disrupt life for all the people who aren't paying Waymo for the privilege.
So the fact that after 50 million miles of driving there have been no pedestrian or cyclist deaths means they are unbelievably unsafe for pedestrians and cyclists? As far as I can tell, the only accident involving pedestrians or cyclists AT ALL after 50 million miles was when a Waymo struck a plastic crate that careened into another lane, where a scooter ran into it. And yet in your mind they are unbelievably unsafe?
-
The claim is that the remote operators do not actually drive the cars. However, they do routinely "assist" the system, not just step in when there's an emergency.
I think they've got one person watching dozens of cars, though; it's not one per car like there would be with human drivers.
-
I hate felon musk but I honestly believe their self driving tech is safer than humans.
Have you seen the average human? They're beyond dumb. When they're in cars, it seems like the majority of them are just staring at their cell phones.
I don't think self driving tech works in all circumstances, but I bet it is already much better than humans at most driving, especially highway driving.
Your username is a lie huh?
-
Could a navigator run you over twice from different companies after they get fired from the first one?
Sequel to Snow Crash right there.
-
So let me make sure I understand your argument. Because nobody can be held liable for one hypothetical death of a child when an accident happens with a self driving car, we should ban them so that hundreds of real children can be killed instead. Is that what you are saying?
As far as I know, Waymo has only been involved in one fatality. The Waymo was sitting still at a red light in traffic when a speeding SUV (according to reports, going at an extreme rate of speed) rammed it from behind into other cars. The SUV then continued into traffic, where it struck more cars, eventually killing someone. That's the only fatal accident Waymo has been involved in after 50 million miles of driving. But instead of making it safer for children, you would prefer more kids die just so you have someone to blame?
@[email protected]So let me make sure I understand your argument. Because nobody can be held liable for one hypothetical death of a child when an accident happens with a self driving car, we should ban them so that hundreds of real children can be killed instead. Is that what you are saying?
No, this strawman is obviously not my argument. It's curious you're asking whether you understand, and then opining afterwards, rather than waiting for the clarification you suggest you're seeking. When someone responds to a no-brainer suggestion, grounded in skepticism but perfectly sensible nevertheless, with a strawman seemingly crafted to discredit it, one has to wonder if that someone is writing in good faith. Are you?
For anyone who is reading in good faith: we're clearly not talking about one hypothetical death, since more than one real death involving driverless car technology has already occurred, and there is no doubt there will be more in the future given the nature of conducting a several-ton hunk of metal across public roads at speed.
It should go without saying that hypothetical auto wreck fatalities occurring prior to the deployment of technology are not the fault of everyone who delayed the deployment of that technology, meaning in particular that these hypothetical deaths do not justify hastening deployment. This is a false conflation regardless of how many times Marc Andreessen and his apostles preach variations of it.
Finally "ban", or any other policy prescription for that matter, appeared nowhere in my post. That's the invention of this strawman's author (you can judge for yourself what the purpose of such an invention might be). What I urge is honestly attending to the serious and deadly important moral and justice questions surrounding the deployment of this class of technology before it is fully unleashed on the world, not after. Unless one is so full up with the holy fervor of technoutopianism that one's rationality has taken leave, this should read as an anodyne and reasonable suggestion.
-
I hate felon musk but I honestly believe their self driving tech is safer than humans.
Have you seen the average human? They're beyond dumb. When they're in cars, it seems like the majority of them are just staring at their cell phones.
I don't think self driving tech works in all circumstances, but I bet it is already much better than humans at most driving, especially highway driving.
I honestly believe their self driving tech is safer than humans.
That's how it should be. Unfortunately, one of the main decision makers on Tesla's self-driving software is doing their best to make it perform worse with every update.
-
This post did not contain any content.
Because they are driving under near ideal conditions, in areas that are completely mapped out, and guided away from roadworks and avoiding "confusing" crosses, and other traffic situations like unmarked roads, that humans deal with routinely without problem.
And in a situation they can't handle, they just stop, call for a human driver to get them going again, and wait, regardless of whether they are blocking traffic. I'm not blaming Waymo for doing it as safely as they can; that's great IMO.
But don't make it sound like they drive better than humans yet. There is still some way to go.
What's really obnoxious is that Elon Musk claimed this would be 100% ready by 2017. Full self driving, across America, day and night, safer than a human. I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
-
I hate felon musk but I honestly believe their self driving tech is safer than humans.
Have you seen the average human? They're beyond dumb. When they're in cars, it seems like the majority of them are just staring at their cell phones.
I don't think self driving tech works in all circumstances, but I bet it is already much better than humans at most driving, especially highway driving.
Human drivers have an extremely long tail of idiocy. Most people are good (or at least appropriately cautious) drivers, but there is a very small percentage of people who are extremely aggressive and reckless. The fact that self driving tech is never emotional, reckless or impaired pretty much guarantees that it will always statistically beat humans, even in somewhat basic forms.
-
ROFL ya because taking 5 separate buses to get to work is TOTALLY going to encourage people to get rid of their cars.
Fucking brilliant.
Oh ya and I TOTALLY want to give up my car just so I can be forced to sit next to rude assholes coughing in my face.
These brilliant suggestions are amazing.
You do realize that if we invest in more mass transit, then the people who want to take the bus will. That means fewer cars on the road and less traffic for you to deal with. If you like driving your car and the freedom it gives you, advocating for more mass transit is in your favor. Imagine your commute with 90% less traffic. Doesn't that sound appealing to you? Dedicated bus lanes that keep the slow buses out of your lane, doesn't that sound appealing too? I don't know about you, but I love driving, and to me that sounds like an absolute win.
-
I hate felon musk but I honestly believe their self driving tech is safer than humans.
Have you seen the average human? They're beyond dumb. When they're in cars, it seems like the majority of them are just staring at their cell phones.
I don't think self driving tech works in all circumstances, but I bet it is already much better than humans at most driving, especially highway driving.
I think the fair comparison would be humans that drive legally.
Idiots that drive high or drunk or without prescription glasses or whatever, shouldn't count as "normal" human driving.
In the same way, a self driving car can have issues that make it illegal.
The problem is that a legal self driving Tesla is not as safe as a legal person. It sees poorly at night, and it gets confused in situations people handle routinely. Tesla is infamous for not stopping when the road is blocked at heights from 1m and up, and for braking without reason. I've seen videos demonstrating an unnecessary brake every half hour, where a large part of the route was the German Autobahn, which is probably some of the easiest driving in the world!
-
Because they are driving under near ideal conditions, in areas that are completely mapped out, and guided away from roadworks and avoiding "confusing" crosses, and other traffic situations like unmarked roads, that humans deal with routinely without problem.
And in a situation they can't handle, they just stop, call for a human driver to get them going again, and wait, regardless of whether they are blocking traffic. I'm not blaming Waymo for doing it as safely as they can; that's great IMO.
But don't make it sound like they drive better than humans yet. There is still some way to go.
What's really obnoxious is that Elon Musk claimed this would be 100% ready by 2017. Full self driving, across America, day and night, safer than a human. I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
You’re not wrong, but arguably that doesn’t invalidate the point: they do drive better than humans because they’re so much better at judging their own limitations.
If human drivers refused to enter dangerous intersections, stopped every time things started to look dangerous, and handed off to a specialist to handle problems, driving might not produce the mountain of corpses it does today.
That said, you’re of course correct that they still have a long way to go in technical driving ability and handling of adverse conditions, but it’s interesting to consider that simple policy, effectively enforced, is enough to cancel out all the advantages that human drivers currently still have.
-
This post did not contain any content.
*human drivers remotely controlling cars crash less than humans directly controlling cars
-
Unprofessional human drivers (yes, even you) are unbelievably bad at driving; it's only a matter of time. But call me when you can do it without just moving labor done by decently paid locals to labor done remotely in the third world.
Are you talking about remote controlling cars from India or something?
That last sentence makes very little sense to me. How is that relevant? I'm pretty sure the latency would be too high, so it wouldn't even work.
Ah OK, you are talking about the navigators that "help" the car when it can't figure out what to do.
That's a fair point. But still, one navigator can probably handle many cars, so from the perspective of making a self driving taxi, it makes sense.
-
"After 6 miles, Teslas crash a lot more than human drivers."
-
Unprofessional human drivers (yes, even you) are unbelievably bad at driving; it's only a matter of time. But call me when you can do it without just moving labor done by decently paid locals to labor done remotely in the third world.
Thanks, but I am not. Others on the road, however? Abysmal.
-
You’re not wrong, but arguably that doesn’t invalidate the point: they do drive better than humans because they’re so much better at judging their own limitations.
If human drivers refused to enter dangerous intersections, stopped every time things started to look dangerous, and handed off to a specialist to handle problems, driving might not produce the mountain of corpses it does today.
That said, you’re of course correct that they still have a long way to go in technical driving ability and handling of adverse conditions, but it’s interesting to consider that simple policy, effectively enforced, is enough to cancel out all the advantages that human drivers currently still have.
You are completely ignoring the under ideal circumstances part.
They can't drive at night AFAIK, they can't drive outside the area that is meticulously mapped out.
And even then, they often require human intervention. If you asked a professional driver to do the exact same thing, I'm pretty sure that driver would have a way better accident record than average humans too.
Seems to me you are missing the point I tried to make, and are drawing a false conclusion by comparing apples to oranges.
-
You’re not wrong, but arguably that doesn’t invalidate the point: they do drive better than humans because they’re so much better at judging their own limitations.
If human drivers refused to enter dangerous intersections, stopped every time things started to look dangerous, and handed off to a specialist to handle problems, driving might not produce the mountain of corpses it does today.
That said, you’re of course correct that they still have a long way to go in technical driving ability and handling of adverse conditions, but it’s interesting to consider that simple policy, effectively enforced, is enough to cancel out all the advantages that human drivers currently still have.
driving might not produce the mountain of corpses it does today.
And people wouldn't be able to drive anywhere. Which could very well be a good thing, but still.
-
They're not saying general road safety is 20x better. They're comparing an automated car ONLY on surface streets with lights, intersections, pedestrians, dogs, left turns, etc., to a professional truck driver driving mostly highway miles.
That's fair. Comparing regular drivers doing typical city trips to commercial big rigs is a bit apples-and-oranges. I wonder how CDL data would compare when the self-driving semi-trucks start putting on miles. Aurora is about to launch in that exact space.
-
If you build it they will come
Because having a bus to pick up 7 people in a day is really efficient economically and environmentally...
-
@[email protected]
So let me make sure I understand your argument. Because nobody can be held liable for one hypothetical death of a child when an accident happens with a self driving car, we should ban them so that hundreds of real children can be killed instead. Is that what you are saying?
No, this strawman is obviously not my argument. It's curious you're asking whether you understand, and then opining afterwards, rather than waiting for the clarification you suggest you're seeking. When someone responds to a no-brainer suggestion, grounded in skepticism but perfectly sensible nevertheless, with a strawman seemingly crafted to discredit it, one has to wonder if that someone is writing in good faith. Are you?
For anyone who is reading in good faith: we're clearly not talking about one hypothetical death, since more than one real death involving driverless car technology has already occurred, and there is no doubt there will be more in the future given the nature of conducting a several-ton hunk of metal across public roads at speed.
It should go without saying that hypothetical auto wreck fatalities occurring prior to the deployment of technology are not the fault of everyone who delayed the deployment of that technology, meaning in particular that these hypothetical deaths do not justify hastening deployment. This is a false conflation regardless of how many times Marc Andreessen and his apostles preach variations of it.
Finally "ban", or any other policy prescription for that matter, appeared nowhere in my post. That's the invention of this strawman's author (you can judge for yourself what the purpose of such an invention might be). What I urge is honestly attending to the serious and deadly important moral and justice questions surrounding the deployment of this class of technology before it is fully unleashed on the world, not after. Unless one is so full up with the holy fervor of technoutopianism that one's rationality has taken leave, this should read as an anodyne and reasonable suggestion.
I was asking in good faith because the way you talk is not easily comprehensible. I can barely follow whatever argument you are trying to make. I think you are trying to say that we shouldn't allow them on the road until we have fully decided who is at fault in an accident?
Also, only one death has occurred so far involving driverless cars: a speeding SUV rammed into a stopped driverless car, then continued on and hit five other cars, killing someone. That's it. The only death involved a driverless car sitting still, not moving, not doing anything... and it wasn't even the car that hit the car in which the person died. So I would say it is fair to call these deaths hypothetical when we're talking about deaths that are the fault of a driverless car.