- cross-posted to:
- energia
Wapo journalist verifies that robotaxis fail to stop for pedestrians in marked crosswalk 7 out of 10 times. Waymo admitted that it follows “social norms” rather than laws.
The reason is likely to compete with Uber, 🤦
Wapo article: https://www.washingtonpost.com/technology/2024/12/30/waymo-pedestrians-robotaxi-crosswalks/
Cross-posted from: https://mastodon.uno/users/rivoluzioneurbanamobilita/statuses/113746178244368036
I think the reason non-tech people find this so difficult to comprehend is the poor understanding of what problems are easy for (classically programmed) computers to solve versus ones that are hard.
if ( person_at_crossing ) then { stop }
To the layperson it makes sense that self-driving cars should be programmed this way. After all, this is a trivial problem for a human to solve. Just look, and if there is a person, you stop. Easy peasy.
But for a computer, how do you know? What is a ‘person’? What is a ‘crossing’? How do we know if the person is ‘at/on’ the crossing as opposed to simply near it or passing by?
To me it’s this disconnect between the common understanding of computer capability and the reality that causes the misconception.
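To make the disconnect concrete, here is a minimal sketch (all names and thresholds invented, nothing here is a real Waymo API) of what that one-line rule expands into once you admit the computer only has noisy detections to work with:

```python
# Hypothetical sketch: `if (person_at_crossing) then { stop }` hides
# several hard perception sub-problems. Every name and number below
# is made up for illustration.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # "What is a 'person'?" -- a classifier guess
    confidence: float   # ...and only a probabilistic one
    position: tuple     # (x, y) in metres

@dataclass
class CrosswalkZone:
    center: tuple       # "What is a 'crossing'?" -- a mapped region

    def distance_to(self, pos):
        return ((pos[0] - self.center[0]) ** 2
                + (pos[1] - self.center[1]) ** 2) ** 0.5

def should_stop(detections, crosswalk_zones):
    """Naive expansion of the layperson's one-liner."""
    for det in detections:
        # The classifier never says "person"; it says "82% pedestrian".
        is_person = det.label == "pedestrian" and det.confidence > 0.8
        # "At the crossing" becomes fuzzy geometry: how many metres
        # from the painted lines still counts as 'at'?
        near_crossing = any(zone.distance_to(det.position) < 1.5
                            for zone in crosswalk_zones)
        if is_person and near_crossing:
            return True
    return False
```

Every hard-coded constant above (the 0.8 confidence cutoff, the 1.5 m radius) is exactly the kind of judgment call that has no obviously correct value, which is the point.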
I think you could liken it to training a young driver who doesn’t share a language with you. You can demonstrate the behavior you want once or twice, but unless every observation the learner makes reinforces that behavior, you can’t say “yes, we specifically told it to do that.”
Most crosswalks are marked. The vehicle is able to identify obstructions in the road, and things on the side of the road that are moving toward the road, just like cross-street traffic.
If (thing) is crossing the street, then stop. If (thing) is stationary near a marked crosswalk, stop, and if it doesn’t move within (x) seconds, go.
You know, the same way people are supposed to handle the same situation.
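Those two rules can be sketched as a small decision function (a toy sketch; the states, the 5-second limit, and the return values are all invented for illustration):

```python
# Toy sketch of the comment's rules: stop for anything crossing;
# for something stationary near a crosswalk, stop, give it a grace
# period to commit, then go. All names/values are hypothetical.

WAIT_LIMIT_S = 5.0  # the "(x) seconds" from the comment, chosen arbitrarily

def decide(thing_state, wait_started, now):
    """Return (action, wait_started) given the observed state.

    thing_state  -- "crossing", "waiting_near_crosswalk", or "clear"
    wait_started -- timestamp when we began waiting, or None
    now          -- current timestamp in seconds
    """
    if thing_state == "crossing":
        # Something is in the roadway: always yield.
        return "stop", now
    if thing_state == "waiting_near_crosswalk":
        start = wait_started if wait_started is not None else now
        if now - start < WAIT_LIMIT_S:
            return "stop", start   # keep waiting for them to commit
        return "go", None          # they stayed put long enough; proceed
    return "go", None
```

Even this toy version shows where the hard part hides: the function assumes something upstream has already produced a clean `thing_state` label, which is exactly the perception problem the earlier comment describes.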
Most crosswalks in the US are not marked, and in all places I’m familiar with vehicles must stop or yield to pedestrians at unmarked crosswalks.
At unmarked crosswalks, and at marked but uncontrolled crosswalks, we have to handle the situation with social cues: which direction does the pedestrian want to cross the street/road/highway, and will they feel safer crossing after a vehicle has passed than before? (Almost always the latter for homeless pedestrians, and frequently for pedestrians in moderate traffic.)
If Waymo can’t figure out whether something intends, or is likely, to enter the roadway, it can’t drive a car. Those can be people at crosswalks, people crossing at places other than crosswalks, blind pedestrians crossing anywhere, deafblind pedestrians crossing even at controlled intersections, kids or wildlife or livestock running toward the road, etc.
Thing? Like a garbage bin? Or a sign?
Person, dog, cat, rolling cart, bicycle, etc.
If the car is smart enough to recognize a stationary stop sign, then it should be able to ignore a permanently mounted crosswalk sign or indicator light at a crosswalk and exclude those from things that might move into the street. Or it could just stop and wait a couple of seconds if it isn’t sure.
A woman was killed by a self-driving car because she walked her bicycle across the road. The car hadn’t been programmed to understand what a person walking a bicycle is. Its AI switched between classifying her as a pedestrian, a cyclist, and “unknown.” It couldn’t tell whether to slow down, and then it hit her. The engineers forgot to add a category, and someone died.
It shouldn’t even matter what category things are when they are on the road. If anything larger than gravel is in the road the car should stop.
You can use that logic to say it would be difficult to do the right thing for all cases, but we can start with the ideal case.
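The category-agnostic rule proposed above fits in a few lines (a sketch only; the size threshold and field names are made up, and real perception would still have to estimate that size from sensor data):

```python
# Sketch of the "ideal case" rule above: ignore classification entirely
# and brake for anything in the lane bigger than gravel. The threshold
# and the obstacle representation are hypothetical.

GRAVEL_SIZE_M = 0.05  # arbitrary cutoff for "larger than gravel"

def should_brake(obstacles_in_lane):
    """True if any detected obstacle in the lane exceeds the size cutoff.

    obstacles_in_lane -- list of dicts like {"size_m": 0.4}, i.e. the
    largest dimension of each detected object in metres.
    """
    return any(ob["size_m"] > GRAVEL_SIZE_M for ob in obstacles_in_lane)
```

Note what this sidesteps and what it doesn’t: no category is needed, so the pedestrian/cyclist/“unknown” flicker becomes irrelevant, but the system still has to reliably detect that *something* of a given size is in the lane, which is itself a perception problem.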
Difference is that humans (usually) come with empathy (or at least self-preservation) built in. With self-driving cars we aren’t building in empathy and self (or at least passenger) preservation, we’re hard-coding in scenarios where the law says they have to do X or Y.
deleted by creator