@misk@sopuli.xyz to Technology@lemmy.world • English • 1 year ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
@Mahlzeit@feddit.de • 1 year ago
Oh. I see. The attempts to extract training data from ChatGPT may be criminal under the CFAA. Not a happy thought. I did say “making available” to exclude “hacking”.
JackbyDev • 1 year ago
The point I’m illustrating is that plenty of things reasonable people would assume are fine, the law can nonetheless call hacking.