• @drislands@lemmy.world
    4 · 1 year ago

    calculate pi

    Isn’t that beyond an LLM’s capabilities anyway? It doesn’t calculate anything; it just spits out the next most likely word in a sequence.
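
    (A toy sketch of what “next most likely word” means, with a made-up distribution; real models work over far larger vocabularies, but the mechanic is the same.)

    ```python
    # Illustrative only: an LLM maps a context to a probability distribution
    # over possible next tokens and emits one of the likeliest, rather than
    # running any actual calculation.

    # Hypothetical, hand-picked probabilities for the prompt "calculate pi"
    next_token_probs = {
        "3.14159": 0.40,  # a memorized prefix of pi, not a computed value
        "3.14": 0.35,
        "pi": 0.15,
        "...": 0.10,
    }

    # Greedy decoding: pick the single most likely next token
    next_token = max(next_token_probs, key=next_token_probs.get)
    print(next_token)  # -> 3.14159
    ```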

    • @hex_m_hell@slrpnk.net
      4 · edited · 1 year ago

      Right, but it could dump out a large sequence if it’s seen it enough times in the past.

      Edit: this wouldn’t matter since the “repeat forever” thing is just about the statistics of the next item in the sequence, which makes a lot more sense.

      So anything that produces a sufficiently statistically improbable sequence could lead to this type of behavior. The size of the content is a red herring.

      https://chat.openai.com/share/6cbde4a6-e5ac-4768-8788-5d575b12a2c1
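
      (Rough sketch with made-up numbers of why a long repetition becomes a “statistically improbable sequence”: per-step probabilities multiply, so the likelihood of staying on script collapses as the repeat count grows.)

      ```python
      import math

      # Assumed, illustrative value: the chance the model keeps emitting the
      # same word at each step while following a "repeat this forever" prompt.
      p_repeat = 0.9
      steps = 1000

      # Sequence probability is the product of per-step probabilities,
      # i.e. p_repeat ** steps, computed in log space to avoid underflow.
      log_prob = steps * math.log(p_repeat)
      print(f"log-probability after {steps} repeats: {log_prob:.1f}")
      print(f"probability: {math.exp(log_prob):.2e}")  # ~1.7e-46
      ```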