I’m kind of new to local AI and wondering what’s the move here? Are they trying to pull off a Chrome/Android situation? Obviously I don’t trust any of these GAFAM giants, but I would be really interested in running a local LLM on my M1 Max (briefly used DeepSeek last year). My use case would be mostly chat functions to help with academic and text analysis tasks (don’t worry, I don’t just blindly trust LLMs, I know what I’m doing), so recommendations are welcome.

    • Yerbouti@sh.itjust.worksOP · 19 days ago

      co-opt em, set the de-facto standards and own the “attention”

      That’s what I’m suspecting too. They’re trying to pull a Chrome-like situation and become some sort of standard, so devs eventually get stuck with their tech and whatever bullshit “manifest” update they release.

      • Jakeroxs@sh.itjust.works · 19 days ago

        While it’s not impossible, there are a lot of variables when it comes to LLMs, and I don’t really see how they’d pull that off, especially since their stuff isn’t particularly better than the competition.

        • Yerbouti@sh.itjust.worksOP · 19 days ago

          I agree. I’m thinking they’re trying anything hoping it will stick because they’re clearly losing the AI race, so offering models that can run locally was their best bet. And DeepSeek might have just fucked their shit up a little more with V4.