In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning, with slow-motion clips showing the car crashing through not only the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • @Soleos@lemmy.world
    2 months ago

    The bar set for self-driving cars: Can it recognize and respond correctly to a deliberate optical illusion?

    The bar set for humans: https://youtu.be/ks11nuGGupI

    For the record, I do want the bar for self-driving safety to be high. I also want human drivers to be better… Because even not-entirely-safe self-driving cars may still be safer than humans at a certain point.

    Also, fuck Tesla.

    • @legion02@lemmy.world
      2 months ago

      I mean it also plowed through a kid because it was foggy, then rainy. The wall was just one of the tests the Tesla failed.

      • @Fermion@feddit.nl
        2 months ago

        Right, those were the failures that really matter, and Rober included the Looney Tunes wall to get people sharing and talking about it. A scene painted on a wall is a contrived edge case, but pedestrians and obstacles in precipitation are common.

        • Possibly linux
          2 months ago

          I think it does highlight the issue with the real world: there will always be edge cases and situations that lead to odd visuals.