• tiramichu@lemm.ee · 5 days ago

    I think the reason non-tech people find this so difficult to comprehend is a poor understanding of which problems are easy for (classically programmed) computers to solve, and which are hard.

    if ( person_at_crossing ) then { stop }
    

    To the layperson it makes sense that self-driving cars should be programmed this way. After all, this is a trivial problem for a human to solve: just look, and if there is a person, stop. Easy peasy.

    But for a computer, how do you know? What is a ‘person’? What is a ‘crossing’? How do we know if the person is ‘at/on’ the crossing as opposed to simply near it or passing by?

    To me it’s this disconnect between the common understanding of computer capability and the reality that causes the misconception.
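
    As a sketch of what the real problem looks like, assuming a typical ML detector that reports confidence scores rather than certainties (all names here are made up):

    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str         # best-guess class, e.g. "pedestrian"
        confidence: float  # 0.0-1.0, and never exactly 1.0 in practice
        distance_m: float  # estimated distance to the object

    BRAKE_CONFIDENCE = 0.3  # assumed policy: brake even on weak evidence

    def should_stop(detections: list[Detection]) -> bool:
        """Conservative policy: any plausibly-human detection nearby means stop."""
        return any(
            d.label == "pedestrian"
            and d.confidence >= BRAKE_CONFIDENCE
            and d.distance_m < 30.0
            for d in detections
        )

    # The naive "if (person_at_crossing)" hides all the hard work inside the
    # condition: producing these Detection objects from sensor data is the
    # part that takes millions of examples and still returns a probability.
    print(should_stop([Detection("pedestrian", 0.41, 12.0)]))  # True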

    • Starbuck@lemmy.world · 5 days ago

      I think you could liken it to training a young driver who doesn’t share a language with you. You can demonstrate the behavior you want once or twice, but unless every observation demonstrates that behavior, you can’t say “yes, we specifically told it to do that.”
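
      A toy version of that idea, assuming the simplest possible imitation setup (entirely hypothetical): the behavior comes from whatever the demonstrations happened to contain, never from a rule anyone wrote down.

      demonstrations = [
          # (distance_to_person_m, person_moving) -> action the teacher showed
          ((5.0, True), "stop"),
          ((50.0, False), "go"),
          ((10.0, True), "stop"),
      ]

      def learned_policy(distance_m: float, moving: bool) -> str:
          """Copy the action from the most similar demonstration (1-NN imitation)."""
          def dissimilarity(demo):
              (d, m), _action = demo
              return abs(d - distance_m) + (0 if m == moving else 100)
          _situation, action = min(demonstrations, key=dissimilarity)
          return action

      print(learned_policy(7.0, True))    # "stop" - close to a demo, looks right
      print(learned_policy(30.0, False))  # "go" - but nobody ever "told it" that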

    • snooggums@lemmy.world · 5 days ago

      But for a computer, how do you know? What is a ‘person’? What is a ‘crossing’? How do we know if the person is ‘at/on’ the crossing as opposed to simply near it or passing by?

      Most crosswalks are marked. The vehicle can already identify obstructions in the road, and things at the side of the road that are moving toward it, just as it handles cross-street traffic.

      If (thing) is crossing the street, stop. If (thing) is stationary near a marked crosswalk, stop and wait; if it doesn’t move within (x) seconds (a reasonable amount of time), go.

      You know, the same way people are supposed to handle the same situation.
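
      Written out, that rule really is only a few lines; here’s a sketch with made-up names. The catch is that the hard part is computing the inputs, not the if-statements.

      import time

      WAIT_SECONDS = 3.0  # the "(x) seconds" above - an assumption

      def crosswalk_policy(is_crossing: bool, near_marked_crosswalk: bool,
                           stationary_since: float | None) -> str:
          """The stated rule: stop for crossers, wait briefly for waiters."""
          if is_crossing:
              return "stop"
          if near_marked_crosswalk and stationary_since is not None:
              if time.monotonic() - stationary_since < WAIT_SECONDS:
                  return "stop"  # give them a chance to step out
              return "go"        # they stayed put for a reasonable time
          return "go"

      # Every argument here (is_crossing, near_marked_crosswalk,
      # stationary_since) is itself the output of the hard perception
      # problem described upthread.
      print(crosswalk_policy(True, False, None))  # "stop"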

      • hissing meerkat@sh.itjust.works · 5 days ago

        Most crosswalks in the US are not marked, and in all places I’m familiar with vehicles must stop or yield to pedestrians at unmarked crosswalks.

        At unmarked crosswalks, and at marked but uncontrolled ones, we have to handle the situation with social cues: which direction the pedestrian wants to cross the street/road/highway, and whether they will feel safer crossing after a vehicle has passed than before (almost always for homeless pedestrians, and frequently for pedestrians in moderate traffic).

        If Waymo can’t figure out whether something intends, or is likely, to enter the highway, it can’t drive a car. That something could be a person at a crosswalk, a person crossing somewhere other than a crosswalk, a blind pedestrian crossing anywhere, a deaf-blind pedestrian crossing even at a controlled intersection, or kids, wildlife, or livestock running toward the road.
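
        For what it’s worth, the usual approximation of “likely to enter the road” is extrapolating a tracked object’s recent motion; a very rough sketch, with made-up numbers:

        HORIZON_S = 3.0    # how far ahead to project (assumption)
        LANE_EDGE_M = 0.0  # lateral coordinate of the lane boundary (assumption)

        def likely_to_enter_road(lateral_m: float, lateral_speed_mps: float) -> bool:
            """True if constant-velocity extrapolation puts the object in the lane."""
            projected = lateral_m + lateral_speed_mps * HORIZON_S
            return projected <= LANE_EDGE_M

        print(likely_to_enter_road(2.0, -1.0))  # True: drifting toward the lane, yield
        print(likely_to_enter_road(2.0, 0.0))   # False: standing still. Safe to pass?
        # Note what this can't capture: a glance, a wave, hesitation - exactly
        # the social cues above.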

        • snooggums@lemmy.world · 5 days ago

          Person, dog, cat, rolling cart, bicycle, etc.

          If the car is smart enough to recognize a stationary stop sign, then it should be able to ignore a permanently mounted crosswalk sign or indicator light at a crosswalk, excluding those from the set of things that might move into the street. Or it could just stop and wait a couple of seconds if it isn’t sure.
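
          As a sketch (label set made up), excluding fixed infrastructure combines the class guess with how the object has actually behaved, and “not sure” falls back to caution:

          STATIC_CLASSES = {"sign", "signal_light", "pole"}  # assumed labels
          MOVE_EPSILON_MPS = 0.1

          def might_enter_street(label: str, max_speed_seen_mps: float) -> bool:
              """Candidates to enter the street: mobile things, or anything unclear."""
              if label in STATIC_CLASSES:
                  return False  # permanently mounted: exclude it
              return max_speed_seen_mps > MOVE_EPSILON_MPS or label == "unknown"

          print(might_enter_street("sign", 0.0))        # False
          print(might_enter_street("pedestrian", 1.2))  # True
          print(might_enter_street("unknown", 0.0))     # True: stop and wait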

          • Dragon Rider (drag)@lemmy.nz · 5 days ago

            A woman was killed by a self-driving car because she was walking her bicycle across the road. The car hadn’t been programmed to understand what a person walking a bicycle is. Its AI kept switching between classifying her as a pedestrian, a cyclist, and “unknown”; it couldn’t tell whether to slow down, and then it hit her. The engineers forgot to add a category, and someone died.
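
            The detail that reportedly made the flip-flopping fatal is that each reclassification threw away the object’s motion history. A sketch of that failure mode:

            class Track:
                def __init__(self) -> None:
                    self.label: str | None = None
                    self.positions: list[float] = []  # lateral position over time

                def update(self, label: str, position: float) -> None:
                    if label != self.label:
                        self.positions = []  # the bug: a new label discards history
                    self.label = label
                    self.positions.append(position)

                def estimated_speed(self) -> float | None:
                    if len(self.positions) < 2:
                        return None  # not enough history to predict motion
                    return self.positions[-1] - self.positions[-2]

            track = Track()
            for label, pos in [("cyclist", 5.0), ("pedestrian", 4.0), ("unknown", 3.0)]:
                track.update(label, pos)
                print(label, track.estimated_speed())
            # Prints None every time: she is steadily moving toward the lane,
            # but the planner never sees a trajectory to react to.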

            • snooggums@lemmy.world · 5 days ago

              It shouldn’t even matter what category things are when they are on the road. If anything larger than gravel is in the road, the car should stop.
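
              That is essentially a class-agnostic obstacle gate; as a sketch (threshold made up):

              MIN_SIZE_M = 0.05  # "larger than gravel" (assumption)

              def must_stop(size_m: float, in_drivable_corridor: bool) -> bool:
                  """Stop for anything over the size threshold in our path."""
                  return in_drivable_corridor and size_m > MIN_SIZE_M

              print(must_stop(0.5, True))   # True: stop, no classification needed
              print(must_stop(0.5, False))  # False: it's off the roadway
              # The hard part is still perception: knowing size and position
              # reliably, in rain, at night, at highway speed.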

    • AA5B@lemmy.world · 5 days ago

      You can use that logic to argue it would be difficult to do the right thing in every case, but we can start with the ideal cases:

      • For a clearly marked crosswalk with a pedestrian in the street, stop.
      • For a pedestrian in the street, stop.
    • Iceblade@lemmy.world · 5 days ago

      The difference is that humans (usually) come with empathy, or at least self-preservation, built in. With self-driving cars we aren’t building in empathy and self (or at least passenger) preservation; we’re hard-coding the scenarios where the law says they have to do X or Y.