• SuperSpruce@lemmy.zip
    5 months ago

    I don’t like that first article; it gives contradictory information about the energy usage per image, saying 0.29 kWh/image and then 0.29 kWh/1000 images.

    • merari42@lemmy.world
      5 months ago

      Good point. I just tried it on my M1 MacBook with 16 GB RAM to get better data (though I used SD 1.5). It took roughly 2 minutes to create an image at roughly full power load (which I would conservatively assume to be roughly identical to the 96 W max load of my power adapter). So it’s 3.2 watt-hours per image (with my inefficient setup).
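
      That arithmetic checks out. As a quick sanity check (assuming the full 96 W draw for the whole 2 minutes, which is the conservative assumption from the comment above):

      ```python
      # Back-of-envelope energy per image from the numbers above:
      # ~96 W sustained load, ~2 minutes per image on an M1 MacBook.
      power_w = 96            # assumed adapter max load, watts
      minutes_per_image = 2

      wh_per_image = power_w * (minutes_per_image / 60)
      print(wh_per_image)     # 3.2 Wh per image
      ```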

    • Riskable@programming.dev
      5 months ago

      The article is way, waaaaaaay off. My PC generates images at a rate of about one per second (SDXL Turbo) with an Nvidia 4060 Ti, which uses 160 W under load (~12 W when idle). Let’s assume I have it generate images constantly for one hour:

      • 3,600 images per hour, consuming 0.16 kWh
      • About 22,500 images per kWh.

      In other words, generating a single image is a trivial amount of power.

      • SuperSpruce@lemmy.zip
        5 months ago

        How are you able to generate images so quickly? When I tried to run Stable Diffusion on my nominally comparable GPU (RX 6800M, similar to a 6700 XT, 12 GB VRAM), it took over a minute to get halfway through generating a medium-sized image before my entire computer crashed.

        • merari42@lemmy.world
          5 months ago

          SDXL Turbo, I guess. It needs only one diffusion pass per image, while SD 1.5 needs 8 for most images.
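
          If per-image time scales roughly with step count, per-image energy scales the same way. A rough sketch using the step counts above; the 160 W / one-image-per-second figures come from the 4060 Ti comment upthread, and the linear-scaling assumption is mine:

          ```python
          # Assumed: time (and so energy) per image scales linearly with
          # diffusion steps. Step counts are the ones claimed above.
          turbo_steps = 1
          sd15_steps = 8

          turbo_wh_per_image = 160 / 3600   # ~0.044 Wh at 160 W, 1 s/image
          sd15_wh_per_image = turbo_wh_per_image * (sd15_steps / turbo_steps)
          print(round(sd15_wh_per_image, 3))  # 0.356 Wh per SD 1.5 image
          ```

          Even with that 8x multiplier, it's still a long way from the article's 0.29 kWh/image figure.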