• @cyd@lemmy.world
      50 points · 1 year ago

      That’s not such a big deal. Their objective is to get people hooked on the system. After that, they’ll jack up the price. Microsoft can easily afford to lose money for several years in pursuit of that target.

      (One way this plan could fall through is if LLM tech progresses to the extent that free and open source copilots, run locally, can give results that are just as good.)

      • @Pechente@feddit.de
        22 points · 1 year ago

        One way this plan could fall through is if LLM tech progresses to the extent that free and open source copilots, run locally, can give results that are just as good.

        MS might be in trouble then.

        Performance is not great, but apparently it's not optimized at all right now.
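
        For context, running a model locally is already easy to prototype. A minimal sketch using llama-cpp-python; the model file, prompt, and generation parameters are placeholder assumptions, not recommendations:

        ```python
        # Minimal local code completion with llama-cpp-python
        # (https://github.com/abetlen/llama-cpp-python).
        # Model path and parameters below are placeholder assumptions.
        from llama_cpp import Llama

        llm = Llama(model_path="./codellama-7b.Q4_K_M.gguf", n_ctx=2048)

        prompt = "def fibonacci(n):"
        out = llm(prompt, max_tokens=64, temperature=0.2, stop=["\ndef "])
        print(prompt + out["choices"][0]["text"])
        ```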

      • HidingCat
        6 points · 1 year ago

        Not familiar with the tech, but wouldn't server-side LLMs still have an advantage regardless, given the greater power available on tap? Anything that improves local LLMs would also benefit server-side LLMs, wouldn't it?

        • my_hat_stinks
          10 points · 1 year ago

          Not necessarily; as local inference gets faster, the network latency between your machine and the remote one becomes a bigger fraction of the total time taken to process anything. If your local machine processes a request in 50 ms and the remote machine does it in 5 ms, a round-trip latency of just 45 ms already makes your machine faster.
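
          To make that arithmetic explicit, a back-of-envelope sketch (all numbers are illustrative assumptions, not measurements):

          ```python
          # Back-of-envelope local vs. remote timing comparison.
          local_ms = 50.0      # local inference time per request (assumed)
          remote_ms = 5.0      # remote inference time per request (assumed)
          latency_ms = 45.0    # round-trip network latency (assumed)

          remote_total_ms = remote_ms + latency_ms
          print(f"local: {local_ms} ms, remote: {remote_total_ms} ms")

          # Local wins whenever round-trip latency exceeds the gap
          # in raw inference time:
          print(f"break-even latency: {local_ms - remote_ms} ms")
          ```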

          Running locally also cuts out a lot of potential security issues inherent to sending data over a network, and not sending your data to a third party is a bonus too.

        • @bamboo@lemm.ee
          10 points · 1 year ago

          Possibly, but given the choice between paying $20/month for a marginally better version of something that's free and probably built into your editor by then, most people would take the free option. At that point, paid LLMs will need to find niches beyond simply existing.

    • @worldsayshi@lemmy.world
      12 points · edited · 1 year ago

      That seems so weird when you consider the pricing of the OpenAI API. It feels at least an order of magnitude cheaper than a ChatGPT Plus subscription, which is $20/month. If Copilot is losing money, OpenAI must be burning money by the truckload.
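
      For rough scale, a back-of-envelope sketch, assuming gpt-3.5-turbo's launch price of about $0.002 per 1K tokens (an assumption; pricing varies by model and has changed over time):

      ```python
      # Rough cost comparison: API pay-per-token vs. ChatGPT Plus.
      price_per_1k_tokens_usd = 0.002  # assumed gpt-3.5-turbo launch price
      plus_monthly_usd = 20.00         # ChatGPT Plus subscription

      tokens_per_month = plus_monthly_usd / price_per_1k_tokens_usd * 1_000
      print(f"${plus_monthly_usd:.0f} buys ~{tokens_per_month:,.0f} tokens")
      # ~10 million tokens/month; most chat users consume far fewer,
      # which is why the API feels an order of magnitude cheaper per use.
      ```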

    • @pineapple_pizza
      9 points · 1 year ago

      We're in the Sliceline era of generative AI; enjoy it before prices get hiked.

      • nicetriangle
        6 points · 1 year ago

        Yep, everyone's trying to capture market share and stamp out competitors with shorter funding runways until they achieve some degree of monopoly over the customer base. Then come the price hikes and other anti-consumer bullshit.

  • Phanatik
    7 points · 1 year ago

    Yeah, I’m sure Microsoft is happy with the theft of copyrighted works and people’s personal information.