@misk@sopuli.xyz to Technology@lemmy.world · English · 10 months ago
Asking ChatGPT to Repeat Words 'Forever' Is Now a Terms of Service Violation
www.404media.co
@Mahlzeit@feddit.de · English · 10 months ago
Oh, I see. The attempts to extract training data from ChatGPT may be criminal under the CFAA. Not a happy thought.
I did say "making available" to exclude "hacking".
@JackbyDev · English · 10 months ago
The point I'm illustrating is that plenty of things reasonable people would assume are fine, the law can call hacking.