Sure, porn-trained AI seems a core function.

Porn sites may have blown up Meta’s key defense in a copyright fight with book authors who earlier this year said that Meta torrented “at least 81.7 terabytes of data across multiple shadow libraries” to train its AI models.

Meta has defeated most of the authors’ claims, arguing there is no proof that it ever uploaded pirated data through seeding or leeching on the BitTorrent network it used to download training data. But the authors still have a chance to prove that Meta may have profited off its massive piracy, and a new lawsuit filed by adult sites last week appears to contain evidence that could help them win their fight, TorrentFreak reported.

The new lawsuit was filed last Friday in a US district court in California by Strike 3 Holdings—which says it attracts “over 25 million monthly visitors” to sites that serve as “ethical sources” for adult videos that “are famous for redefining adult content with Hollywood style and quality.”

After the authors revealed Meta’s torrenting, Strike 3 Holdings checked its proprietary BitTorrent-tracking tools, which are designed to detect infringement of its videos, and alleged that it found evidence that Meta has been torrenting and seeding its copyrighted content for years, since at least 2018. Some of the IP addresses were clearly registered to Meta, while others appeared to be “hidden,” and at least one was linked to a Meta employee, the filing said.
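
Strike 3 doesn’t describe how its tracking tools work, but the core attribution step such tools are generally understood to perform is straightforward: collect the IP addresses of peers seeding a monitored torrent, then check them against address ranges registered to an organization. A minimal sketch of that step, with all IPs and ranges as hypothetical placeholders rather than Strike 3’s data or Meta’s real ranges:

```python
# Sketch of the IP-attribution step a BitTorrent monitoring tool might
# perform. Peer IPs and CIDR blocks below are hypothetical placeholders.
import ipaddress

# Address ranges registered to the organization of interest (made up here).
CORPORATE_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def attribute_peers(observed_peers):
    """Return swarm peers whose IPs fall inside the monitored ranges.

    `observed_peers` would come from tracker announces or DHT crawls;
    here it is just a list of (ip, torrent_infohash) tuples.
    """
    hits = []
    for ip, infohash in observed_peers:
        addr = ipaddress.ip_address(ip)
        if any(addr in net for net in CORPORATE_RANGES):
            hits.append((ip, infohash))
    return hits

# Two fake peers in one swarm; only the first is inside a monitored range.
print(attribute_peers([("203.0.113.42", "abc123"), ("192.0.2.7", "abc123")]))
```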

  • zero

    For research purposes, which tracker did Meta use?

  • @pineapple_pizza

    How funny would it be if that employee was on a different team and was torrenting for personal use and got caught up in this lol.

  • Top notch journalism. Even today, the “legit” sites have an “I am over 18” button at best, and in general they just block users from states with more stringent requirements. Are we really supposed to hate seeders just because arstechnica says so?

    • @Midnitte@beehaw.org

      They’re pointing out the double standard.

      If you seed porn, it’s a federal offense.

      If Meta does it, it’s capitalism.

      • @MachineFab812@discuss.tchncs.de

        While I agree with you to a point, they didn’t stop there, or even really bother to make that point at all. They’re escalating the seeding of porn to the willful distribution of porn to children. The fact that it’s a corporation doing the seeding just makes for an easy target for such escalation.

    • @t3rmit3@beehaw.org

      Ars Technica is not asserting that themselves; that’s the argument Strike 3 is making. Strike 3 and other porn companies attack non-professional porn on these grounds as well, to try to kill their competition.

  • @Gork@sopuli.xyz

    So what are they doing with the data? Is this all being fed into the LLM or image-generating AI to create ultra-realistic porn? To what end? I don’t see their endgame unless it involves sexbots.

    • tiredofsametab

      Pure speculation: possibly to identify sexual nudity and “inappropriate” content, as some kind of legitimate use case. What was actually done, I have no idea.

      • AmbitiousProcess (they/them)

        This feels most likely to me.

        Meta doesn’t exactly want to taint their brand image with purely sexual content generated by their base models, so it’s probably for content classification, and/or for fine-tuning their LLMs and other generative models in reverse, that is, fine-tuning them to not create content like what they’re being fed.
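
        As a toy illustration of the classification half of that guess (nothing like Meta’s actual pipeline, which would use large multimodal models), here’s a sketch of training a tiny text classifier to flag explicit content, with made-up examples and labels:

        ```python
        # Toy content classifier: learn to flag explicit text from labeled
        # examples. Data and labels are invented for illustration.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        texts = [
            "explicit scene description ...",  # hypothetical explicit sample
            "cooking recipe for fresh pasta",
            "adult video title ...",           # hypothetical explicit sample
            "review of a local hiking trail",
        ]
        labels = [1, 0, 1, 0]  # 1 = explicit, 0 = safe

        clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
        clf.fit(texts, labels)
        print(clf.predict(["new adult clip ..."]))  # 1 = flag for filtering
        ```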

    • @megopie@beehaw.org

      A lot of artists will practice anatomy by drawing people nude, largely because it’s hard to get a good understanding of anatomy by only drawing people with clothes on.

      If you wanted to add examples of bare human anatomy in odd positions to expand the range the model is capable of, well, there aren’t many corpora of that larger than porn.

      Also, even if they don’t want it to make explicit content, they probably want it to make “suggestive” or “appealing” content, and they just assume they can guardrail it away from making actual explicit content. That’s probably pretty shortsighted, though, given how weak guardrails really are.
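
      To see why the simplest form of such a rail is weak, here’s a sketch of a blocklist filter wrapped around a stubbed generator; any output phrased without the exact blocked terms sails through. (Everything here is hypothetical; real systems use learned safety classifiers, but they fail in the same basic way.)

      ```python
      # Minimal output guardrail: a blocklist check around a stub generator.
      # Trivially bypassed by rephrasing, which is the point being made.
      BLOCKLIST = {"explicit_term_a", "explicit_term_b"}  # hypothetical terms

      def generate(prompt: str) -> str:
          return f"model output for: {prompt}"  # stand-in for a real model call

      def guarded_generate(prompt: str) -> str:
          output = generate(prompt)
          if any(term in output.lower() for term in BLOCKLIST):
              return "[content removed by safety filter]"
          return output

      print(guarded_generate("draw a figure study"))
      ```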

    • Cricket [he/him]

      Let’s be honest now… Zuckerberg is building a globally-distributed, industrial-scale, disaster-proof spank bank for himself.

    • @WalnutLum@lemmy.ml

      Well, Stable Diffusion 3’s training data supposedly had all porn purposely removed, and the model was negatively trained on porn, which apparently destroyed its ability to generate proper anatomy.

      Regardless, image generation models need some porn in training to at least know what porn is so that they know what porn is not.

      It’s part of a process called regularization, i.e., preventing any particular model from overfitting.
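
      Strictly, “regularization” usually names explicit training penalties; the textbook example is L2 weight decay, sketched below in PyTorch as a toy. Diverse training data plays a related role in keeping a model from overfitting, which seems to be the sense meant here.

      ```python
      # Classic explicit regularization: L2 weight decay penalizes large
      # parameters so the model can't simply memorize its training set.
      # Toy model and data; illustrates the concept only.
      import torch

      model = torch.nn.Linear(10, 1)
      opt = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

      x, y = torch.randn(32, 10), torch.randn(32, 1)
      loss = torch.nn.functional.mse_loss(model(x), y)
      loss.backward()
      opt.step()  # update includes the weight-decay (L2) penalty gradient
      print(f"one-step training loss: {loss.item():.4f}")
      ```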