After 50 million miles, Waymos crash a lot less than human drivers
Waymo has been in dozens of crashes. Most were not Waymo's fault.
Makes sense. There are fewer automated cars than human drivers. Human drivers have also been around way longer.
They accounted for that in this report. I believe you are a troll.
> I believe you are a troll.
Then you don't know what trolling actually is.
Okay, I'm sorry. Let me clarify how easy it is to account for the kind of bias you're talking about: simply divide by the population count. They divided the Waymo crash count by the number of Waymos, and the human crash count by the number of humans. That gives the Waymo crash rate and the human crash rate, which you can compare directly. (In reality it's a bit more complicated, since the human crash rate is calculated independently for each year.)
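To make the divide-by-population idea concrete, here's a minimal sketch. All the numbers are made-up placeholders, not figures from the Waymo report, and the real study normalizes more carefully (e.g., per mile driven rather than per vehicle):

```python
# Sketch of normalizing crash counts by population size.
# All counts below are hypothetical placeholders, NOT real data.

def crash_rate(crash_count: int, population: int) -> float:
    """Crashes per vehicle (or per driver) in the population."""
    return crash_count / population

waymo_rate = crash_rate(crash_count=20, population=2_000)                # hypothetical
human_rate = crash_rate(crash_count=6_000_000, population=230_000_000)  # hypothetical

print(f"Waymo rate: {waymo_rate:.4f} crashes per vehicle")
print(f"Human rate: {human_rate:.4f} crashes per driver")
```

Because both counts are divided by their own population, the comparison no longer depends on there being far more human drivers than Waymos.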
Let me clarify further: it was an attempt at humor, not meant to be taken as seriously as you're taking it.
Ah. Sorry. There are some truly braindead takes on autonomous vehicles out there, so I couldn't tell that apart from what some people have said earnestly. My bad.
I do think it would be much safer with zero human drivers and only autonomous vehicles on the road, for sure. But I also think it would be impractical to replace everything all at once. Even the best programmed thing would eventually encounter a human driver that defies all previously known data and freaks out the computer.
I don't know anything about how autonomous vehicles work. As far as humans doing unusual things: well, assuming the human driver only steers the wheel and controls the gas and brakes, it should be possible with existing technology to avoid crashing into them at least as well as any human can. That leaves the really unusual things, like a human hopping out of their car in the middle of an intersection, as the high-hanging fruit to model. I would imagine that in most of these really strange cases, even if the autonomous vehicle can't understand what's happening, it can at least recognize that something strange is happening and pull over.
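As a purely hypothetical illustration of that "pull over when confused" fallback (not how any real AV stack is structured; the function, threshold, and confidence score here are all invented for the sketch):

```python
# Hypothetical fallback policy: act on how well the scene is understood.
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()
    YIELD = auto()
    PULL_OVER = auto()

def choose_action(scene_confidence: float, threshold: float = 0.6) -> Action:
    """Pick a driving action from an invented scene-understanding score.

    scene_confidence: 0.0 (no idea what is happening) to 1.0 (fully understood).
    """
    if scene_confidence >= threshold:
        return Action.CONTINUE       # scene is well understood: keep driving
    if scene_confidence >= threshold / 2:
        return Action.YIELD          # something odd: slow down, stay cautious
    return Action.PULL_OVER          # truly strange: stop safely and wait

print(choose_action(0.9))  # Action.CONTINUE
print(choose_action(0.2))  # Action.PULL_OVER
```

The point of the sketch is just that the vehicle doesn't need to classify the weird event correctly; it only needs to notice its own confidence dropping and default to the safest available action.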
Obviously there will be truly unusual situations that cause fatal collisions. So long as those happen at a lower rate than with human drivers, what's the safety concern?
Safety is a red herring, IMO, since better code can fix it. There are much worse potential problems that autonomous vehicles will cause than rare collisions. NotJustBikes makes a lot of points I'd never considered before in the second half of this video. (The first half, though, I found aggravating; it's just about solvable safety risks.)