• tal@lemmy.today
    1 month ago

    So, first, that’s just a reduction. But set that aside, and let’s talk big picture here.

    My GPU can use something like 400 watts.

    A human is about 100 watts constant power consumption.

    So even setting aside all other costs of a human and only paying attention to direct energy costs: if an LLM running on my GPU can do something in under a quarter of the time it would take me, then it’s more energy-efficient.

    I won’t say that that’s true for all things, but there are definitely things that Stable Diffusion or the like can do today in a whole lot less than a quarter the time it would take me.
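    The break-even arithmetic works out like this. A minimal sketch — the 400 W and 100 W figures are from the comment above; the task times are made-up illustrations:

    ```python
    GPU_POWER_W = 400    # peak draw of the GPU (figure from the comment)
    HUMAN_POWER_W = 100  # rough constant human power consumption

    def gpu_uses_less_energy(gpu_seconds: float, human_seconds: float) -> bool:
        """True when the GPU's energy for the task is below the human's.

        Energy = power * time, so the GPU wins exactly when it finishes in
        under HUMAN_POWER_W / GPU_POWER_W = 1/4 of the human's time.
        """
        return GPU_POWER_W * gpu_seconds < HUMAN_POWER_W * human_seconds

    print(gpu_uses_less_energy(60, 300))  # 60 s vs 5 min -> True, GPU wins
    print(gpu_uses_less_energy(60, 200))  # 60 s vs ~3.3 min -> False, human wins
    ```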

    • milicent_bystandr@lemm.ee
      1 month ago

      That said, the LLM isn’t running an array of bonus functions like breathing, and wondering why you said that stupid thing to your aunt’s cousin 15 years ago, and keeping tabs on your ambient noise for possible phone calls from that nice boy who promised to call you back.

    • wischi@programming.dev
      1 month ago

      The problem is that using those tools, no matter how energy-efficient they are, will add to the total amount of energy humans use, because even if an AI generates an image faster than a human could, the human still needs their 100 W constantly.

      This doesn’t mean we shouldn’t make it more efficient, but let’s be honest: more energy-efficient AI just means we’d use even more AI everywhere.

      • derpgon@programming.dev
        1 month ago

        Speaking of efficiency, though: a human can do other useful tasks while the AI is crunching numbers. But that is very subjective.