After 50 million miles, Waymos crash a lot less than human drivers
-
I completely disagree.
Take your hand brake example. 95 percent of people (including you, evidently) don't even understand that the handbrake is not an emergency brake, or how it actually behaves, or the fact that it's meant to be used as a parking brake. I consistently see people slam their weight onto the parking pawl every time they get out of their car. (Not to mention that on most modern cars it doesn't even work while you're driving and has no modulation, since it's just a button.)
If not being an idiot was good enough to drive a car, then it wouldn't be so deadly. It's also possible to fly a plane with common sense, but you wouldn't be happy if your pilot told you they don't have training.
Driving isn't easy, it's just that we accept an absolutely catastrophic amount of accidents as a cost of doing business.
It is an emergency brake when your brake fails, you donut
-
I do think it would be much safer with zero human drivers and only autonomous vehicles on the road, for sure. But I also think it would be impractical to replace everything all at once. Even the best programmed thing would eventually encounter a human driver that defies all previously known data and freaks out the computer.
I don't know anything about how autonomous vehicles work. As far as humans doing unusual things: assuming the human driver only steers the wheel and controls the gas and brakes, it should be possible with existing technology to avoid crashing into them at least as well as any human can. So that leaves the really unusual things, like a human hopping out of their car in the middle of an intersection, as the high-hanging fruit to model. I would imagine that in most of these really strange cases, even if the autonomous vehicle can't understand what's happening, it can at least recognize that something strange is happening and pull over.
Obviously there will be truly unusual situations that cause fatal collisions. So long as that is at a lower rate, then what's the safety concern?
Safety is a red herring IMO, as better code can fix it. There are much worse potential problems that autonomous vehicles will cause than rare collisions. NotJustBikes has a lot of points I'd never considered before in the second half of this video. (The first half, though, I found aggravating; it's just about solvable safety risks.)
-
Much more so than having a car-centric infrastructure. If you start cherry-picking you'll of course find cases where a car would have been more efficient but public transportation needs to be understood as a whole.
I'm not cherry-picking, I'm just telling you my personal experience and about the town I live in.
-
The way I edited the quote, it was just a joke about braking vs. breaking.
Like, I could make a pedantic reply about spelling, but no, Teslas in fact brake unexpectedly AND break unexpectedly. So, no notes!
Ah OK, didn't notice that. English is my 2nd language.
-
I think "Veritasium" or whatever the YT channel is called made a video about those.
The car did manage to bring him to a store with a big parking lot, it actually did it.
As snarky as my initial comment may sound (even to me, I have by-proxy distrust of contemporary models due to their knobhead owners), I'm genuinely glad to hear they figured that one out! At least there's less danger for everyone around, at the VERY least.
-
They work great in parking lots.
Source: Ridden in several Waymos
Genuinely a relief to hear, thank you!
-
Just fine the one time I rode in one. It had a problem with a moving truck blocking the entire street, where it sat waiting to see if the moving truck was just stopped and going to move, or if it was parked for good. The Waymo executed a 3-point turn, then had two construction trucks pull into the street from the other direction, and they refused to back up. So the Waymo was stuck between not going forward and not going back... it just pulled forward toward the trucks and then reversed toward the moving truck. Back and forth. Then I yelled out the window for the fucking trucks to move out of the fucking road, which they couldn't drive down anyway. After that it was smooth, even getting into the parking lot.
My buddy said that at his office the Waymos have an issue with pulling too far forward at the pick-up spots, which makes it impossible for cars to go around them, but humans do dumb shit like that, too.
Yyyep, that sounds pretty standard fare (no pun intended), I've lived mostly in abstract neighborhoods in terms of infrastructure and had to chase rides in a grand majority of cases.
Plus, honestly, even the way it handled the construction jam sounds acceptable, reminds me of my first days of learning to drive. As long as they stop and stay stopped, that's way better than deciding to ignore the sensor data and just go for it, like... some other models...
-
Considering the sort of driving issues and code violations I see on a daily basis, the standards for human drivers need raising. The issue is more lax humans than it is amazing robots.
"You don't have to be faster than the bear, you just have to be faster than the other guy"
-
They're super conservative. I rode just once in one. There was an ambulance parked about 30 feet down a side street with its lights on while paramedics helped someone. The car wouldn't drive forward through the intersection. It just detected the lights and froze. I had to get out and walk. If we all drove that conservatively we'd also have fewer accidents, and we'd congest the city to undrivability.
Back in February, I took a Waymo for the first time and was at first amazed. But then in the middle of an empty four lane road, it abruptly slammed the brakes, twice. There was literally nothing in the road, no cars and because it was raining, no pedestrians within sight.
If I had been holding a drink, it would have spelled disaster.
After the second abrupt stop I was bracing for more for the remainder of the ride, even though the car generally goes quite slow most of the time. It also made a strange habit of drifting across lanes through intersections while the turn indicators flicked from left to right alternately, like it had no idea what it was doing.
Honestly it felt like being in the car with a first time driver.
-
We always knew good quality self-driving tech would vastly outperform human skill. It's nice to see some decent metrics!
My drive to work is 8 minutes. This morning I almost had a crash because a guy ran a stop sign. I don't think the bar is very high at this point.
-
There's already an autonomous metro.
Now let's do intercity trains and tramways, then.
-
Maybe the reason they crash less is that everyone around them has to be extremely careful with these cars. Just like in my country, where we put a big L on the rear of the car for first-year drivers.
-
You are completely ignoring the "under ideal circumstances" part.
They can't drive at night AFAIK, and they can't drive outside the area that has been meticulously mapped out.
And even then, they often require human intervention. If you asked a professional driver to do the exact same thing, I'm pretty sure that driver would have a way better accident record than the average human too.
Seems to me you are missing the point I tried to make, and drawing a false conclusion by comparing apples to oranges.
I specifically didn't ignore that. My entire point was that a driver who refuses to drive under anything except "ideal circumstances" is still a safer driver.
I am aware that if we banned driving at night to get the same benefit for everyone, it wouldn’t go very well, but that doesn’t really change the safety, only the practicality.
-
driving might not produce the mountain of corpses it does today.
And people wouldn't be able to drive anywhere. Which could very well be a good thing, but still
True enough; it would not be a wise economic or political move.
-
Bro, I saw a video of their car driving through a wall and handing the controls back to the driver. No, it absolutely is not.
When was the last time you saw a "wall" erected on a freeway that was perfectly painted to mimic the current time of day, road, weather, etc.? I'm not talking about in that example, I'm talking about in the real world.
The answer is never
Yes, the optical sensors are fooled by an elaborate ruse that doesn't exist in real world operating conditions on a highway.
-
But it's not like that. There's some kind of ML involved, but they also had to map out their entire service area, etc. If something goes wrong, a human has to come out and drive your driverless car lmao
Most trips require remote intervention by one of their employees at some point.
-
:Looks at the entire Midwest and Southern USA:
The bar is so low in these regions you need diamond drilling bits to go lower.
What's a zipper merge?
Screams in Midwestern
-
It's hard to change humans. It's easy to roll out a firmware update.
Raising the standards would result in 20-50% of the worst drivers being forced to do something else. If our infrastructure wasn't so car-centric, that would be perfectly fine.