• Blue_Morpho@lemmy.world
    20 days ago

    > Nothing AI about it.

    Voice processing is AI, and it was done by Apple servers. Previously, only the wake phrase “Hey Siri” was detected locally. Onboard AI chips will allow this processing to be local; the actual queries will still go to the servers. Phones don’t have the power to run a useful LLM locally, at least not with the near-instantaneous response times phone users expect. A 56-watt M3 Max with 128 GB of RAM does around 8.5 tokens/second.

    https://www.nonstopdev.com/llm-performance-on-m3-max/
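To put that throughput in perspective, here is a rough back-of-the-envelope sketch. The 8.5 tokens/second figure is from the linked benchmark; the reply lengths are illustrative assumptions, not measurements:

```python
# Rough latency estimate for generating an LLM reply on-device.
# 8.5 tok/s is the M3 Max decode rate from the linked benchmark;
# the reply lengths below are assumed for illustration only.
TOKENS_PER_SECOND = 8.5

def reply_seconds(reply_tokens: int, tps: float = TOKENS_PER_SECOND) -> float:
    """Seconds to generate a full reply at a given decode rate."""
    return reply_tokens / tps

for tokens in (20, 100, 300):
    print(f"{tokens:>3}-token reply: ~{reply_seconds(tokens):.1f} s")
```

Even a short ~100-token answer would take over ten seconds on hardware far faster than a phone SoC, which is the point about response times above.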

    • PassingThrough@lemmy.world
      20 days ago

      > Onboard AI chips will allow this to be local.

      > Phones do not have the power to …

      Perhaps this is why these features will only be available on iPhone 15 Pro/Max and newer? Gotta have those latest and greatest chips.

      It will be fun to see how it all shakes out. If the AI can’t actually run most queries on the phone after all this advertising of local processing… there’ll be one hell of a lawsuit coming.

      EDIT: Finished looking for what I thought I remembered…

      Additionally, Siri has been locally processed since iOS 15.

      https://www.macrumors.com/how-to/use-on-device-siri-iphone-ipad/

      • Blue_Morpho@lemmy.world
        20 days ago

        > Perhaps this is why these features will only be available on iPhone 15 Pro/Max and newer?

        I’m not guessing. I linked to an article about the M3 Max, which is much more powerful than the A17 Pro in the 15 Pro yet has the same NPU.