Last July, San Jose issued an open invitation to technology companies to mount cameras on a municipal vehicle that began periodically driving through the city’s District 10 in December, collecting footage of streets and public spaces. The images are fed into computer vision software and used to train the companies’ algorithms to detect unwanted objects, according to interviews and documents the Guardian obtained through public records requests.

  • @Anyolduser@lemmynsfw.com
    9 months ago
    In terms of legal precedent this may be a good thing in the long run.

    The software billed as “AI” these days is half baked. If one or more law enforcement agencies point to this new software the city deployed as their probable cause for an arrest, it won’t take long for that to get challenged in court.

    This sets the stage for the software’s legality to be challenged now, while it’s still half baked, and for courts to set a legal standard demanding high accuracy and/or human assessment before an arrest is made.

        • Justin
          9 months ago
          Yes, but the EU is setting legal precedent here that American legislation should follow.

    • TurtleJoe
      9 months ago
      I think you’re more optimistic than I am about a conservative appeals court judge first being able to understand how well the technology actually works, and then actually giving a shit if they do.

      • @Anyolduser@lemmynsfw.com
        9 months ago
        I’m arguing against the technology. I believe that the decision to make an arrest should fall to a human being, and that individual should be allowed to override a bad call by the shit being billed as AI.

        There’s a real possibility that law enforcement agencies may try to foist responsibility for decisions onto software and require officers to abide by the recommendations of said software. That would be a huge mistake.