After 50 million miles, Waymos crash a lot less than human drivers
Waymo has been in dozens of crashes. Most were not Waymo’s fault.
That's what happens when you have a reasonable sensor suite with LIDAR, instead of trying to rely entirely on cameras like Tesla does.
At least the repair for a camera-only front is cheaper after the car crashes into a parked white bus
And they're limited to highly trained routes. There's a reason you only see them in specific neighborhoods of specific cities.
They're super conservative. I rode in one just once. There was a parked ambulance about 30 feet down a side street with its lights on while paramedics helped someone. The car wouldn't drive forward through the intersection. It just detected the lights and froze. I had to get out and walk. If we all drove that conservatively we'd also have fewer accidents and congest the city to undrivability.
Back in February, I took a Waymo for the first time and was at first amazed. But then in the middle of an empty four lane road, it abruptly slammed the brakes, twice. There was literally nothing in the road, no cars and because it was raining, no pedestrians within sight.
If I had been holding a drink, it would have spelled disaster.
After the second abrupt stop, I was bracing for more for the remainder of the ride, even though the car generally goes quite slow most of the time. It also had a strange habit of drifting between lanes through intersections and using the turn indicators like it had no idea what it was doing; it kept alternating from left to right.
Honestly it felt like being in the car with a first time driver.
Maybe the reason they crash less is because everyone around them has to be extremely careful with these cars. Just like in my country, where we put a big L on the rear of the car for first-year drivers.
How long ago was that? Last year I took a couple near Phoenix and they did great, lights or no. The hardest part was dropping me off at the front of a hotel, as people were in and out and cars were everywhere. Still didn't have issues, just slowed down to 3mph when it had 15 years left or so
just slowed down to 3mph when it had 15 years left or so
Damn, spending 15 years in a car going 3mph sounds terrible.
Why are we still doing this? Just fucking invest in mass transit like metro, buses and metrobuses. Jesus
Also, note that this is based on Waymo's own assumptions; that's like believing a 5070 gives you 4090 performance...
That doesn't solve the last mile problem, or transport for all the people who live outside of a few dense cities.
Almost all people can walk a mile. The remainder have special mobility needs.
Yes it does, if done properly. I have stops for four bus lines within walking distance. During peak hours, buses come once every 15 minutes. Trolleys in the city centre, every 10 minutes. Trams, every two minutes, and always packed. Most of the surrounding villages have bus stops. A lack of perspective is not an excuse.
Frankly the best solution I have seen is always a combination of things. At least in the city I live in, people can take bikes on buses and trains, many people walk, and for trips that require trunk space (e.g. furniture, DIY supplies, etc.) there is a car-sharing service that is cheaper than owning a car or using ride share/taxi.
I don't think Waymo is a better option than a combination of what's above. I think it can perhaps complement it, but it should not be the sole last-kilometre solution.
I would like to see Waymo-like tech provide better public transit for the disabled. As of now, people in my city with disabilities can book special routes which are serviced by specialized buses/taxis, and existing lines are all wheelchair accessible as well.
Self-driving cars give those people the opportunity for even more freedom in booking, since as of now they can't do last-minute booking for the custom routes. It wouldn't really create a traffic problem, and it would massively increase quality of life for people who are sadly disadvantaged in society.
Why are we still doing this?
Because there's a lot of money in it. 10.3% of the US workforce works in transportation and warehousing. Trucking alone is the #4 spot in that sector (1.2 million jobs in heavy trucks and trailers). Couriers and delivery also ranks highly.
The self-driving vehicles are targeting whole markets, and the value of the industry is hard to overestimate. And yes, even transit is being targeted (and being implemented; see South Korea's A21 line). There's a lot of crossover with trucking and buses, not to mention that 42% of transit drivers are 55+ in age. Hiring for metro drivers is insanely hard right now.
Taking Waymo's numbers at face value, they are almost 20x more dangerous than a professional truck driver in the EU. This is a personal convenience thing for wealthy people, that's it. Fucking over Jarvis and Mahmood so we can have fleets of automated Ubers...
People in America don't want to ride public transport because they're incredibly isolationist and have a fear of other human beings, so they prefer to drive within "their own 4 walls", in their own chassis. It's really about psychology much more than practical feasibility.
So we can have autonomous metros, buses and taxis that allow people anywhere when they need it so they don't rely on having a car?
There's already an autonomous metro.
Why sell $2 light rail fares when you can sell $40 Waymo fares? Now you’re thinking with capitalism!
Because they are driving under near-ideal conditions, in areas that are completely mapped out, guided away from roadworks and away from "confusing" crossings and other traffic situations like unmarked roads that humans deal with routinely without problem.
And in a situation they can't handle, they just stop and call and wait for a human driver to get them going again, regardless of whether they are blocking traffic.
I'm not blaming Waymo for doing it as safe as they can, that's great IMO.
But don't make it sound like they drive better than humans yet. There is still some ways to go.
What's really obnoxious is that Elon Musk claimed this would be 100% ready by 2017. Full self driving, across America, day and night, safer than a human. I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
You're not wrong, but arguably that doesn't invalidate the point: they do drive better than humans because they're so much better at judging their own limitations.
If human drivers refused to enter dangerous intersections, stopped every time things started to look dangerous, and handed off to a specialist to handle problems, driving might not produce the mountain of corpses it does today.
That said, you’re of course correct that they still have a long way to go in technical driving ability and handling of adverse conditions, but it’s interesting to consider that simple policy effectively enforced is enough to cancel out all the advantages that human drivers currently still have.
You are completely ignoring the under ideal circumstances part.
They can't drive at night AFAIK, they can't drive outside the area that is meticulously mapped out.
And even then, they often require human intervention.
If you asked a professional driver to do the exact same thing, I'm pretty sure that driver would have way better accident record than average humans too.
Seems to me you are missing the point I tried to make, and are drawing a false conclusion by comparing apples to oranges.
driving might not produce the mountain of corpses it does today.
And people wouldn't be able to drive anywhere. Which could very well be a good thing, but still
I think "near ideal conditions" is a huge exaggeration. The situations Waymo avoids are a small fraction of the total mileage driven by Waymo vehicles or the humans they're being compared with. It's like you're saying a football team's stats are grossly wrong if they don't include punt returns.
I have zero expectation that Tesla RoboTaxi will arrive this summer as promised.
RoboTaxis will also have to "navigate" the Fashla hate. Not many will be eager to risk their lives with them
This would be more impressive if Waymos were fully self-driving. They aren't. They depend on remote "navigators" to make many of their most critical decisions. Those "navigators" may or may not be directly controlling the car, but things do not work without them.
When we have automated cars that do not actually rely on human beings, we will have something to talk about.
It's also worth noting that the human "navigators" are almost always poorly paid workers in third-world countries. The system will only scale if there are enough desperate poor people. Otherwise it quickly becomes too expensive.
@CuriousCanid @vegeta this is the case for the Amazon "just walk out" shops as well. Like Waymo they frame it as the humans "just doing the hard part" but who knows what "annotating" means in this context? And notably it's clearly more expensive to run than they thought as they've decided to do Dash Carts instead which looks like it's basically a portable self-service checkout. The customer does the checking. https://www.theverge.com/2024/4/17/24133029/amazon-just-walk-out-cashierless-ai-india
Back when I was a fabricator I made some of the critical components used in Amazon stores. Amazon was incredibly particular about every little detail, even on parts that didn't call for tight tolerancing in any conceivable way. They, on several occasions, sent us one bad set of prints after another. Which we could only discover after completing a run of parts. We're talking 20-30 thousand units that ended up being scrapped because of their shitty prints. Millions of dollars set on fire, basically.
They became such a huge pain in the ass to work with we eliminated every single SKU they ordered from us.
Yeah we managed to just put the slave workers behind a further layer of obfuscation. Not just relegated to their own quarters or part of town but to a different city altogether or even continent.
Tech dreams have become about a complete lack of humanity.
I saw an article recently, I should remember where, about how modern "tech" seems to be focused on how to insert a profit-taking element between two existing components of a system that already works just fine without it.
The system will only scale if there are enough desperate poor people. Otherwise it quickly becomes too expensive.
You can also get MMORPG players to do it for pennies per hour for in-game currency or membership. RuneScape players would gladly control 5 'autonomous' cars if it meant that they could level up their farming level for free.
The game is basically designed to be an incredibly time consuming skinner box that takes minimal skill and effort in order to maximize membership fees.
"Damn, I'm sorry my car killed your kids. The Carscape person didn't get their drop"
Packaging the job as a video game side quest is genius. Make it so the gamer has to do several simulated runs before they connect to an actual car, and give expensive in-game consequences for messing it up.
Could a navigator run you over twice from different companies after they get fired from the first one?
Sequel to Snow Crash right there.
If they have to do it a second time, they aren't very good at it.
God, I hope so.
I thought the human operators only step in when the emergency button is pressed or when the car gets stuck?
Do they actually get driven by people in normal operation?
The claim is that the remote operators do not actually drive the cars. However, they do routinely "assist" the system, not just step in when there's an emergency.
Has anyone found the places where the navigators work to see how it goes? Has a navigator shared their experience on the web somewhere?
I am very curious as to what they are asked to do, for how many cars, and for how much money.
I knew it, AI is just some guy in India responding to my queries.
AI - Actually Indian
Considering the sort of driving issues and code violations I see on a daily basis, the standards for human drivers need raising. The issue is more lax humans than it is amazing robots.
It's hard to change humans. It's easy to roll out a firmware update.
Raising the standards would result in 20-50% of the worst drivers being forced to do something else. If our infrastructure wasn't so car-centric, that would be perfectly fine.
:Looks at the entire Midwest and Southern USA:
The bar is so low in these regions you need diamond drilling bits to go lower.
What's a zipper merge?
Screams in Midwestern
"You don't have to be faster than the bear, you just have to be faster than the other guy"
I am once again begging journalists to be more critical of tech companies.
But as this happens, it’s crucial to keep the denominator in mind. Since 2020, Waymo has reported roughly 60 crashes serious enough to trigger an airbag or cause an injury. But those crashes occurred over more than 50 million miles of driverless operations. If you randomly selected 50 million miles of human driving—that’s roughly 70 lifetimes behind the wheel—you would likely see far more serious crashes than Waymo has experienced to date.
[...] Waymo knows exactly how many times its vehicles have crashed. What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash. Waymo has tried to address this by estimating human crash rates in its two biggest markets—Phoenix and San Francisco. Waymo’s analysis focused on the 44 million miles Waymo had driven in these cities through December, ignoring its smaller operations in Los Angeles and Austin.
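For concreteness, here's a minimal sketch of the per-mile comparison the article is making. The Waymo figures are the ones quoted above; the human baseline below is a placeholder I've assumed just to show the arithmetic, since the real baseline is exactly the part that's hard to pin down:

```python
# Back-of-the-envelope rate comparison.
# Waymo figures are from the quoted article; the human baseline is an
# ASSUMED illustrative value, not a measured one (Waymo derives its real
# baseline from Phoenix and San Francisco crash data).

waymo_serious_crashes = 60        # airbag-deployment or injury crashes (article)
waymo_miles = 50_000_000          # driverless miles (article)

# Placeholder baseline: serious crashes per million human-driven miles.
assumed_human_rate = 4.0

waymo_rate = waymo_serious_crashes / (waymo_miles / 1_000_000)

print(f"Waymo: {waymo_rate:.2f} serious crashes per million miles")
print(f"Assumed human baseline: {assumed_human_rate:.2f} per million miles")
print(f"Ratio (human / Waymo): {assumed_human_rate / waymo_rate:.1f}x")
```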
This is the wrong comparison. These are taxis, which means they're driving taxi miles. They should be compared to taxis, not normal people who drive almost exclusively during their commutes (which is probably the most dangerous time to drive since it's precisely when they're all driving).
We also need to know how often Waymo intervenes in the supposedly autonomous operations. The latest we have from this, which was leaked a while back, is that Cruise (different company) cars are actually less autonomous than taxis, and require >1 employee per car.
edit: The leaked data on human interventions was from Cruise, not Waymo. I'm open to self-driving cars being safer than humans, but I don't believe a fucking word from tech companies until there's been an independent audit with full access to their facilities and data. So long as we rely on Waymo's own publishing without knowing how the sausage is made, they can spin their data however they want.
edit2: Updated to say that journalists should be more critical in general, not just about tech companies.
Journalists aren't even critical of police press releases anymore; most simply print whatever they're told verbatim. It may as well just be advertisement.
I agree with you so strongly that I went ahead and updated my comment. The problem is general and out of control. Orwell said it best: "Journalism is printing something that someone does not want printed. Everything else is public relations."
The meat of the true issue right here. Journalism and investigative journalism aren't just dead; their corpses have been feeding a palm tree like a pod of beached whales for decades. It's a bizarre state of affairs to read news coverage and come out the other side less informed, without reading literal disinformation. It somehow seems so much worse that they're not just off-target, but that they don't even understand why or how they're fucking it up.
I was going to say they should only be comparing them under the same driving areas, since I know they aren't allowed in many areas.
But you're right, it's even tighter than that.
These articles frustrate the shit out of me. They accept both the company's own framing and its selectively-released data at face value. If you get to pick your own framing and selectively release the data that suits you, you can justify anything.
@theluddite@lemmy.ml @vegeta@lemmy.world
To amplify the previous point, taps the sign as Joseph Weizenbaum turns over in his grave:
A computer can never be held accountable
Therefore a computer must never make a management decision
tl;dr A driverless car cannot possibly be "better" at driving than a human driver. The comparison is a category error and therefore nonsensical; it's also a distraction from important questions of morality and justice. More below.
Numerically, it may some day be the case that driverless cars have fewer wrecks than cars driven by people.(1) Even so, it will never be the case that when a driverless car hits and kills a child the moral situation will be the same as when a human driver hits and kills a child. In the former case the liability for the death would be absorbed into a vast system of amoral actors with no individuals standing out as responsible. In effect we'd amortize and therefore minimize death with such a structure, making it sociopathic by nature and thereby adding another dimension of injustice to every community where it's deployed.(2) Obviously we've continually done exactly this kind of thing since the rise of modern technological life, but it's been sociopathic every time and we all suffer for it despite rampant narratives about "progress" etc.
It will also never be the case that a driverless car can exercise the judgment humans have to decide whether one risk is more acceptable than another, and then be held to account for the consequences of their choice. This matters.
Please (re-re-)read Weizenbaum's book if you don't understand why I can state these things with such unqualified confidence.
Basically, we all know damn well that whenever driverless cars show some kind of numerical superiority to human drivers (3) and become widespread, every time one kills, let alone injures, a person no one will be held to account for it. Companies are angling to indemnify themselves from such liability, and even if they accept some of it no one is going to prison on a manslaughter charge if a driverless car kills a person. At that point it's much more likely to be treated as an unavoidable act of nature no matter how hard the victim's loved ones reject that framing. How high a body count do our capitalist systems need to register before we all internalize this basic fact of how they operate and stop apologizing for it?
(1) Pop quiz! Which seedy robber baron has been loudly claiming for decades now that full self driving is only a few years away, and depends on people believing in that fantasy for at least part of his fortune? We should all read Wrong Way by Joanne McNeil to see the more likely trajectory of "driverless" or "self-driving" cars.
(2) Knowing this, it is irresponsible to put these vehicles on the road, or for people with decision-making power to allow them on the road, until this new form of risk is understood and accepted by the community. Otherwise you're forcing a community to suffer a new form of risk without consent and without even a mitigation plan, let alone a plan to compensate or otherwise make them whole for their new form of loss.
(3) Incidentally, quantifying aspects of life and then using the numbers, instead of human judgement, to make decisions was a favorite mission of eugenicists, who stridently pushed statistics as the "right" way to reason to further their eugenic causes. Long before Zuckerberg's hot or not experiment turned into Facebook, eugenicist Francis Galton was creeping around the neighborhoods of London with a clicker hidden in his pocket counting the "attractive" women in each, to identify "good" and "bad" breeding and inform decisions about who was "deserving" of a good life and who was not. Old habits die hard.
Honestly I should just get that slide tattooed to my forehead next to a QR code to Weizenbaum's book. It'd save me a lot of talking!
So let me make sure I understand your argument. Because nobody can be held liable for one hypothetical death of a child when an accident happens with a self driving car, we should ban them so that hundreds of real children can be killed instead. Is that what you are saying?
As far as I know of, Waymo has only been involved in one fatality. The Waymo was sitting still at a red light in traffic when a speeding SUV (according to reports going at extreme rate of speed) rammed it from behind into other cars. The SUV then continued into traffic where it struck more cars, eventually killing someone. That's the only fatal accident Waymo has been involved in after 50 million miles of driving. But instead of making it safer for children, you would prefer more kids die just so you have someone to blame?
We always knew good quality self-driving tech would vastly outperform human skill. It's nice to see some decent metrics!
My drive to work is 8 minutes. This morning I almost had a crash because a guy ran a stop sign. I don't think the bar is very high at this point.
That's the beauty of it - we've only just begun to improve the situation. It's going to get better and better until eventually traffic accidents are a rarity.
Indeed
But when it does crash, will Google accept the liability?
Probably depends on who is at fault. I'd also bet that Google has insurance for this sort of thing.
They both own and operate the car. Even if it was a manned taxi, they'd be liable.
They consult Gemini. If it gives a cogent answer, they consider it a "yes". So, no.
"After 6 miles, Teslas crash a lot more than human drivers."
So only drive 5 miles. I guess that’s good advice in general
I hate felon musk but I honestly believe their self driving tech is safer than humans.
Have you seen the average human? They're beyond dumb. If they're in cars, it's like the majority of them are just staring at their cell phones.
I don't think self driving tech works in all circumstances, but I bet it is already much better than humans at most driving, especially highway driving.
I think the fair comparison would be humans that drive legally.
Idiots that drive high or drunk or without prescription glasses or whatever, shouldn't count as "normal" human driving.
In the same way a self driving car can have issues that will make it illegal.
The problem is that a legal self-driving Tesla is not as safe as a legal person. It sees poorly at night, and it gets confused in situations people handle routinely. And Tesla is infamous for not stopping when the road is blocked from 1m and up, and for braking without reason. I've seen videos where they demonstrated an unnecessary braking event every half hour!! Where a large part was the German Autobahn, which is probably some of the easiest driving in the world!!
Bro I saw a video of their car drive through a wall and hand the controls back to the driver. No, it absolutely is not.
Human drivers have an extremely long tail of idiocy. Most people are good (or at least appropriately cautious) drivers, but there is a very small percentage of people who are extremely aggressive and reckless. The fact that self driving tech is never emotional, reckless or impaired pretty much guarantees that it will always statistically beat humans, even in somewhat basic forms.
I honestly believe their self driving tech is safer than humans.
That's how it should be. Unfortunately, one of the main decision makers on Tesla's self-driving software is doing their best to make it perform worse and worse every time it gets an update.
Your username is a lie huh?
Unprofessional human drivers (yes, even you) are unbelievably bad at driving, so it's only a matter of time. But call me when you can do it without just moving labor done by decently paid locals to labor done remotely in the third world.
Are you talking about remote controlling cars from India or something?
That last sentence makes very little sense to me.
How is that relevant? I'm pretty sure the latency would be too high, so it wouldn't even work.
Ah OK you are talking about the navigators, that "help" the car when it can't figure out what to do.
That's a fair point.
But still 1 navigator can probably handle many cars. So from the perspective of making a self driving taxi, it makes sense.
Thanks, but I am not, others on the road however, abysmal.
I find the scariest people on the road to be the arrogant ones that think they make no mistakes.
I wouldn't consider anyone who hasn't done at least a dozen track days and experienced several different extreme scenarios (over/understeer, looping, wet grass at speed, airtime (or at least one or more wheels off the ground), high-speed swerving, snap oversteer, losing systems like brakes, the engine, or the steering wheel lock engaging, etc.) to be remotely prepared to handle a car going more than 25 or so mph. An extreme minority of drivers are actually prepared to handle an incoming collision in order to fully mitigate a situation. And that is only covering the mechanical skill of piloting the car; it doesn't even touch on the theoretical and practical knowledge (rules of the road, including obscure and unenforced rules), and it definitely doesn't broach the discipline that is required to actually put it all together.
If a driver has never been trained, or doesn't even have an understanding of what will happen in an extreme scenario in a car, how could we consider them trained or sufficiently skilled?
We don't let pilots fly without spending time in a simulator, going over emergency scenarios and being prepared for when things go sideways. You can't become an airline pilot if you don't know what happens when you lose power.
We let sub par people drive because restricting it too much would be seen as discrimination, but the overwhelming majority of people are ill equipped to actually drive.
I used to hate them for being slow and annoying. Now they drive like us and I hate them for being dicks. This morning, one of them made an insane move that only the worst Audi drivers in my area do: a massive left over a solid yellow, across no stop sign, with me coming right at it before it had even begun accelerating into the intersection.
No shit. The bar is low. Humans suck at driving. People love to throw FUD at automated driving, and it's far from perfect, but the more we delay adoption the more lives are lost. Anti-automation on the roads is up there with anti-vaccine mentality in my mind. Fear and the incorrect assumption that "I'm not the problem, I'm a really good driver," mentality will inevitably delay automation unnecessarily for years.
It'd probably be better to put a lot of the R&D money into improving and reinforcing public transport systems. Taking cars off the road and separating cars from pedestrians makes a bigger difference than automating driving.
In my country at least (US) that's just not going to happen.
WVU has a tram system called the "PRT". It's semi-automated cars on a track around campus and downtown. It's not great, but goddamn does it handle a large school population just fine. Very high throughput, and it keeps congestion down. ... as down as you can be with such a high density town.
That, and the inevitable bureaucratic nightmare that awaits for standardising across makes and updating the infrastructure.
Car infrastructure was a mistake. Automation isn't the solution, it's less cars and car-based spaces.
Why not both? We can automate the trains (more), the busses, and the occasional rural drive.
Automation also can be abused, which I'm very very cautious about.
"Waymo reports that Waymo cars are the best"
"Waymo reports the statistical data it has, which happens to be pretty good."
As a techno-optimist, I always expected self-driving to quickly become safer than human, at least in relatively controlled situations. However I’m at least as much a pessimist of human nature and the legal system.
Given self-driving vehicles demonstrably safer than human, but not perfect, how can we get beyond humans taking advantage, and massive liability for the remaining accidents?
How are those robot food delivery "AI boxes" by Starship doing?
I had a friend that worked for them in the past. They really aren't that impressive. They get stuck constantly. While the tech down the line might be revolutionary for people who cannot drive for whatever reason, right now it still needs a LOT of work.
@MoreFPSmorebetter @vegeta I just can't see this type of tech working in places with a more pedestrian-first culture / more unpredictable human behaviour, i.e. countries without jaywalking laws. If you tried to drive this through London and people realised it will just have to automatically stop for you (and also won't stop for you out of politeness if you wait hopefully) then everyone will just walk in front of it. What's the plan, special "don't stop the Waymo" laws?
Vegas sure has a lot of pedestrians doing a whole lot of unpredictable things.
People in London just walk in front of all cars all the time. Including me. That's not an unpredictable behaviour, that's a default and very predictable behaviour. If you're in a car - you stop.
Obviously we install a padded arm that grabs the pedestrians and throws them back onto the curb so they learn not to just walk out in front of the moving vehicles.
Idk how it is where y'all live but generally people only jaywalk when there aren't cars driving on the road at that moment. Other than crosswalks it's kinda expected that if you are going to jaywalk you are going to do it when no car will have to stop or slow down to avoid you. Obviously not everyone follows that rule but generally speaking.
What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash.
Also, I think it's worth discussing whether to include in the baseline certain driver assistance technologies, like automated braking, blind spot warnings, other warnings/visualizations of surrounding objects, cars, bikes, or pedestrians, etc. Throw in other things like traction control, antilock brakes, etc.
There are ways to make human driving safer without fully automating the driving, so it may not be appropriate to compare fully automated driving with fully manual driving. Hybrid approaches might be safer today, but we don't have the data to actually analyze that, as far as I can tell.
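A minimal sketch of what that stratified comparison would look like, with entirely made-up cohort numbers, since (as noted above) the real data isn't published in this form:

```python
# Hypothetical cohorts: (serious crashes, miles driven). These numbers are
# invented purely to show the shape of the comparison; no such breakdown is
# publicly available, which is the point being made above.
cohorts = {
    "manual, no assistance": (5_200, 1_000_000_000),
    "manual + ADAS (AEB, blind-spot warning, ...)": (3_100, 1_000_000_000),
    "fully automated": (60, 50_000_000),
}

for name, (crashes, miles) in cohorts.items():
    rate = crashes / (miles / 1_000_000)  # serious crashes per million miles
    print(f"{name}: {rate:.2f} per million miles")
```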
There's a limit to what assist systems can do. Having the car and driver fighting for control actually makes everything far less safe.
Evolution took a billion years too, so it's kinda fair to say "well, vehicles need some training".
That doesn't seem like a very high bar to achieve
How are they with parking lots, tho'?
Or yielding to emergency vehicles.
I think "Veritasium" or whatever the YT channel is called made a video about those.
It did have to bring him to a store with a big parking lot, and it managed it.
As snarky as my initial comment may sound (even to me, I have by-proxy distrust of contemporary models due to their knobhead owners), I'm genuinely glad to hear they figured that one out! At least there's less danger for everyone around, at the VERY least.
They work great in parking lots.
Source: Ridden in several Waymos
Genuinely a relief to hear, thank you!
Just fine the one time I rode in one. It had a problem with a moving truck blocking the entire street, where it sat trying to wait to see if the moving truck was just stopped and going to move or if it was parked for good. The Waymo executed a 3 point turn and then had two construction trucks pull into the street the other direction, and they refused to back up. So the Waymo was stuck between not going forward and not going back... it just pulled forward toward the trucks and then reversed toward the moving truck. Back and forth. Then I yelled out the window for the fucking trucks to move out of the fucking road, which they couldn't drive down anyway. After that it was smooth, even getting into the parking lot.
My buddy said at his office the Waymos have an issue with pulling too far forward at the pick up spots, which makes it impossible for cars to go around them, but humans do dumb shit like that, too.
Yyyep, that sounds pretty standard fare (no pun intended), I've lived mostly in abstract neighborhoods in terms of infrastructure and had to chase rides in a grand majority of cases.
Plus, honestly, even the way it handled the construction jam sounds acceptable, reminds me of my first days of learning to drive. As long as they stop and stay stopped, that's way better than deciding to ignore the sensor data and just go for it, like... some other models...
I believe it, but they also only drive specific routes.
Focusing on airbag-deployments and injuries ignores the obvious problem: these things are unbelievably unsafe for pedestrians and bicyclists. I curse SF for allowing AVs and always give them a wide berth because there's no way to know if they see you and they'll often behave erratically and unpredictably in crosswalks. I don't give a shit how often the passengers are injured, I care a lot more how much they disrupt life for all the people who aren't paying Waymo for the privilege.
The question is whether they are safer than human drivers, not whether they are safe. Cars exist, are everywhere, and are very unsafe for pedestrians. You won't be able to get rid of cars, so if Waymo really is safer we should mandate it on all cars. That is a big if though - drunk drivers are still a large percentage of crashes, so is it fair to lump sober drivers together with drunks? I don't know the real statistics to figure this out.
always give them a wide berth because there's no way to know if they see you and they'll often behave erratically and unpredictably in crosswalks
All of this applies to dealing with human drivers, too.
So the fact that after 50 million miles of driving there have been no pedestrian or cyclist deaths means they are unbelievably unsafe for pedestrians and cyclists? As far as I can tell, the only accidents involving pedestrians or cyclists AT ALL after 50 million miles is when a Waymo struck a plastic crate that careened into another lane where a scooter ran into it. And yet in your mind they are unbelievably unsafe?
And yet it's still the least efficient mode of transport.
What's more efficient?
In terms of getting to an exact location.
Public transportation can mostly only get you near your target, not right on point like a car, bike, etc.
Good transit gets you close enough (as others have said, you don't drive your car down the aisles of the supermarket). That few people have good transit is the problem that needs to be fixed. Sadly few really care - in the US the Republicans hate transit, and the Democrats only like transit for the union labor it employs - importantly, neither cares about getting people places.
Bicycles? Ride or walk to where you need to be? Why do you need to be driven to an exact point? All the space needed for parking is just wasted.
You need to create a specific scenario in order to make cars seem more efficient than alternatives. They cause more accidents, take up more space while carrying fewer people at any given time while also causing more pollution than other modes of transport.
You ever heard of legs? Mass transit gets you the bulk of the way there, and legs will handle the small bit left.
In terms of getting to an exact location, the most efficient is no vehicle, walking.
Cars are less efficient, followed by busses, then probably trains, then boats, then airplanes (unless you parachute).
Cars are the least efficient in terms of moving large numbers of people from places they can then walk from.
If someone can't walk a few blocks, that's on them. Airplanes don't get you exactly to the destination either. There's a tradeoff.
E: For all the "What about the elderly or disabled?" If they can't walk a few blocks and also can't afford a car or taxi/Uber, what should they do? Mobility devices exist. Handicap accessible buildings are federally required. Your argument is merely a thought terminating interruption. That problem can easily be addressed.
Thing is, the end goal after sorting out all the bugs in the AI is no human-driven cars, since having both will only lead to crashes due to the AI being unable to predict a human. All the AI cars would be linked to a central system to communicate with each other and always know where each other are. Then all we have to do is make sure people only use the crosswalks, and traffic accidents will be solely due to idiots.
I doubt a central system would ever be viable, but they would certainly communicate to other nearby cars with more than just blinky lights
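For what "more than just blinky lights" could look like: real V2X systems already broadcast periodic safety messages (position, speed, heading) to nearby vehicles rather than relying on one central system. The sketch below is a made-up, simplified message format for illustration, not any actual standard:

```python
# Made-up, simplified vehicle-to-vehicle broadcast message, loosely inspired
# by the basic-safety-message idea: each car periodically announces where it
# is and what it intends to do, and nearby cars listen. Not a real standard.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class V2VMessage:
    vehicle_id: str
    timestamp: float     # seconds since epoch
    lat: float           # degrees
    lon: float           # degrees
    speed_mps: float     # meters per second
    heading_deg: float   # 0 = north, clockwise
    intent: str          # e.g. "braking", "lane_change_left", "straight"

def encode(msg: V2VMessage) -> bytes:
    """Serialize a message for broadcast over whatever radio link is used."""
    return json.dumps(asdict(msg)).encode("utf-8")

# Example: a car announcing that it is braking.
print(encode(V2VMessage("car-42", time.time(), 37.7749, -122.4194, 8.3, 90.0, "braking")))
```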
I live in Phoenix, Arizona and these are all around. Honestly I feel like the future everyone will have Waymo type services and no one will own cars or even need to learn how to drive one. Who needs to worry about car repairs insurance etc.
I've ridden in them a few times, fallen asleep even. I trust a Waymo more than most human drivers. The best test of its capabilities I saw was when school let out and the side road was covered in kids and parents and cars in random spots waiting for people. It stayed in the "lane", no lane lines, and calmly navigated forward as people gave it space. I was in the car the whole time. Still, there are some issues to be ironed out, but ultimately I don't think I have ever had a bad riding experience.
Makes sense. There's less automated cars than human drivers. Human drivers have also been around way longer.
They accounted for that in this report. I believe you are a troll.
I believe you are a troll.
Then you don't know what trolling actually is.
*human drivers remotely controlling cars crash less than humans directly controlling cars
But it's not like that. There's some kind of ML involved, but they also had to map out their entire service area, etc. If something goes wrong, a human has to come up and drive your driverless car lmao
Most trips require remote intervention by one of their employees at at least some point.
Driving regulations and enforcement should just be stricter on humans, and self-driving can stay as trains, separated from cars, bikes, and pedestrians, which should all be separated from each other as well.