• Womble@lemmy.world · 20 days ago

    Microsoft’s penchant for making up names for things that already have names is neither here nor there. It is an LLM; in fact it’s already twice as large as GPT-2 (1.5B params).

    • habanhero@lemmy.ca · 20 days ago

      I do think it’s a useful distinction, considering open models can exceed 100B parameters nowadays and GPT-4 is rumored to be 1.7T params. Plus, this class of models is far more likely to run on-device.
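
      For a rough sense of why the parameter gap matters for on-device use, here is a back-of-the-envelope sketch (my own illustration, not from either commenter) that just multiplies parameter count by bytes per weight, using the sizes quoted above; the ~3B entry stands in for the "twice GPT-2" model under discussion:

```python
# Back-of-the-envelope weight storage: parameters x bytes per weight.
# Ignores KV cache, activations, and runtime overhead. The sizes below are
# the ones quoted in this thread; GPT-4's parameter count is only a rumor.
def approx_weight_gb(params_billions: float, bytes_per_weight: float = 2.0) -> float:
    """Storage for the weights alone, in GB (fp16 = 2 bytes, 4-bit = 0.5)."""
    # 1e9 params per billion cancels 1e9 bytes per GB, so this simplifies to:
    return params_billions * bytes_per_weight

for name, size_b in [("GPT-2", 1.5), ("~3B model (twice GPT-2)", 3.0),
                     ("100B open model", 100.0), ("GPT-4 (rumored)", 1700.0)]:
    print(f"{name}: ~{approx_weight_gb(size_b):.1f} GB at fp16, "
          f"~{approx_weight_gb(size_b, 0.5):.1f} GB at 4-bit")
```

      At these scales a ~3B model fits in phone-class memory even before quantization, while anything in the 100B+ range needs server hardware or aggressive sharding, which is the practical difference the size distinction is gesturing at.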