My daughter was talking with a friend over iMessage. They were playing a game where they search for all the emojis of a certain color. When she tried to send the message in this picture, the iPhone rejected it, with an alert saying she was sending a nude. There’s nothing there, only emojis. What could’ve triggered the warning?

  • @over_clox@lemmy.world
    31
    edit-2
    13 days ago

    Probably the fact that it contains the ‘tie the knot’ emoji and the ‘squirt’ emoji…

    Edit: They’re in the last 3 rows, if that helps locate them.

      • @over_clox@lemmy.world
        4
        12 days ago

        It all depends on how one interprets ‘tie the knot’, and I have no idea how Apple’s software interprets it…

        🐕🪢🐩…

        • @blackbrook@mander.xyz
          2
          12 days ago

          You could say that about anything. That you called out that emoji suggests you know of some particular interpretation of it that relates to nudity. Care to share?

          • @over_clox@lemmy.world
            3
            12 days ago

            I am not here to teach people about how K9 reproduction works. I posted enough with those 3 emojis.

            If you don’t already know, then you don’t need to know.

            • @blackbrook@mander.xyz
              2
              11 days ago

              Ha, I hadn’t paid any attention to those dog emojis (I think I tend to filter those out). OK, seems a stretch, but I guess who knows what weirdness you may get with overuse of AI.

              • @over_clox@lemmy.world
                1
                11 days ago

                Indeed, hard to say what the hell triggered Apple or their AI shit, just spouting a random guess. Clearly it’s not a nude though.

  • @BCsven@lemmy.ca
    23
    13 days ago

    Is it one of those images where you unfocus your eyes and it appears, lol. Could be the image hash happens to match a known nude’s hash??
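
    Purely to illustrate the hash-collision guess above: perceptual hashes (here a minimal "average hash" over an 8x8 grayscale grid) reduce an image to a short bit string, and two unrelated images can in principle land close enough in Hamming distance to be treated as a match. The grids and the threshold mentioned in the comments are invented; this is not Apple's actual algorithm.

```python
def average_hash(pixels):
    """Compute a simple average hash from an 8x8 grayscale grid (values 0-255)."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # One bit per pixel: 1 if brighter than average, else 0.
    return [1 if p > avg else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Two toy 8x8 "images" with unrelated content.
img_a = [[(x * y) % 256 for x in range(8)] for y in range(8)]
img_b = [[(x + y * 17) % 256 for x in range(8)] for y in range(8)]

ha, hb = average_hash(img_a), average_hash(img_b)
# A real matcher would flag the pair if the distance fell under some
# threshold (e.g. <= 5 bits out of 64) -- which is where false
# positives like the one speculated about here could come from.
print(hamming_distance(ha, hb))
```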

    • @argh_another_username@lemmy.ca (OP)
      9
      12 days ago

      The phone protects kids from sending nudes. (AFAIK it happens on the phone, not on a server.) The phone has a series of protections for kids. She can’t message someone I don’t approve; no message or call from people not on the contact list can get through. But still, someone could grab her friend’s phone and ask for nudes. In this particular case, I entered my parent code and the message was sent. I fail to see how this is bad.

      • atro_city
        -5
        12 days ago

        That entire context was missing from your post: you have parental controls activated, aka you opted into scanning.

        The post was presented as if your daughter just had a normal iPhone, and not everybody uses an iPhone. You can’t expect people to know that the feature exists and that it’s activated on your daughter’s phone. I read “daughter” and thought she was an adult.

        • @over_clox@lemmy.world
          3
          11 days ago

          What does that have to do with the price of weed in Colorado?

          The post isn’t even a photo, it’s just a really long string of emojis. So, how in the holy hell did it somehow detect a nude photo out of that in the first place?

          • atro_city
            2
            11 days ago

            Price of weed? What?

            As for the nude, it probably detected the “💦” which I imagine is often sent with nudes or in discussions surrounding nudes. But who knows. Malus will never reveal its sauce.

            • @over_clox@lemmy.world
              2
              11 days ago

              Colorado was the first state to allow recreational marijuana use.

              It’s my own twist on ‘What does that have to do with the price of eggs in Spain?’

              I have no idea where that saying came from, but it all adds up to what’s that got to do with anything?

    • @KoalaUnknown@lemmy.world
      14
      edit-2
      12 days ago

      It’s a feature that parents can toggle on to help prevent their kids from sending CSAM, and receiving unwanted dick pics. The choice of censorship is entirely on the parent.

      The emojis are a little far tho…

  • Bezier
    13
    12 days ago

    iPhones may refuse to send “inappropriate” messages? Fuck that.

    The problem is that you can’t detect such things accurately and that some clowns still want to force it on everyone.

  • @tal@lemmy.today
    English
    9
    13 days ago

    I don’t know, but start shaving off emojis to see when it’ll accept it and you might get some hints.
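
    The shave-off-emojis approach above can be sped up with bisection instead of removing one emoji at a time. This is just a sketch: `is_flagged` is a hypothetical stand-in for the phone’s classifier (which you’d exercise by hand, one send attempt per test), and it assumes a single emoji trips the filter on its own.

```python
def find_trigger(emojis, is_flagged):
    """Binary-search for one emoji that trips the filter.

    Assumes exactly one emoji in the list triggers it independently,
    so only O(log n) send attempts are needed.
    """
    lo, hi = 0, len(emojis)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        # If the first half alone still gets flagged, the culprit is there;
        # otherwise it must be in the second half.
        if is_flagged(emojis[lo:mid]):
            hi = mid
        else:
            lo = mid
    return emojis[lo]

# Toy stand-in for the classifier: flags any message containing the
# sweat-droplets emoji, per the guess elsewhere in this thread.
flagged = lambda msg: "💦" in msg
message = list("🐕🪢🐩💦🌊🔵")
print(find_trigger(message, flagged))  # → 💦
```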

  • SnekZone
    6
    12 days ago

    I bet there are programs out there that can convert pictures into rows of emojis, just like ASCII art. Maybe that’s what they’re trying to guard against 🤷‍♂️
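
    Such a converter is easy to sketch, in the spirit of ASCII art: map each pixel’s brightness to an emoji from a dark-to-light palette. The palette and the tiny gradient “image” below are invented for illustration; real tools would also handle color, not just brightness.

```python
# Darkest to lightest; one emoji per brightness band.
PALETTE = ["⬛", "🟫", "🟧", "🟨", "⬜"]

def to_emoji_art(pixels):
    """Render a grid of 0-255 grayscale values as rows of emojis."""
    rows = []
    for row in pixels:
        rows.append("".join(
            # Scale 0-255 down to a palette index, clamped to the last entry.
            PALETTE[min(p * len(PALETTE) // 256, len(PALETTE) - 1)]
            for p in row))
    return "\n".join(rows)

# A tiny 4x4 diagonal gradient.
image = [[(x + y) * 32 for x in range(4)] for y in range(4)]
print(to_emoji_art(image))
```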

  • Resol van Lemmy
    English
    5
    11 days ago

    The flag of South Sudan uses the wrong shade of blue. Or at least I think that’s what’s wrong since I’m not sure software vendors care about the exact color values of flags. Some operating systems still use the darker blue for Honduras for example.

  • @creamlike504@jlai.lu
    English
    1
    edit-2
    12 days ago

    Slight tangent: Would blocking still work if they tried to send nudes via Snapchat?

    I’m not “asking for a friend,” I’m just curious how effective this feature could possibly be.