• merari42@lemmy.world · 5 months ago

    I call bullshit. Stable Diffusion XL has an energy footprint of about 0.29 watt-hours per image while generating. That is roughly equivalent to running a 0.5 W LED light bulb for slightly less than 35 minutes. Even the training costs are not that extreme. Stable Diffusion needed 150,000 GPU-hours. At 300 W for an A100 at full load, that would be 45,000 kWh. This is roughly the energy needed to drive an electric car for 180,000 miles, which is a lot, but still on a reasonable scale.
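The arithmetic above can be checked in a few lines. This is a sketch using only the figures quoted in the comment (0.29 Wh/image, 150,000 GPU-hours at 300 W); the 0.25 kWh/mile EV efficiency is an assumption that makes the 180,000-mile figure come out:

```python
# Check the per-image and training-energy figures from the comment above.

IMAGE_WH = 0.29        # Wh per SDXL image (figure from the comment)
LED_W = 0.5            # a 0.5 W LED bulb
minutes_of_led = IMAGE_WH / LED_W * 60
print(f"One image ≈ a {LED_W} W LED running for {minutes_of_led:.0f} minutes")  # 35

GPU_HOURS = 150_000    # reported Stable Diffusion training time
GPU_WATTS = 300        # A100 at full load
training_kwh = GPU_HOURS * GPU_WATTS / 1000
print(f"Training ≈ {training_kwh:,.0f} kWh")  # 45,000 kWh

EV_KWH_PER_MILE = 0.25  # assumed typical EV efficiency
print(f"≈ {training_kwh / EV_KWH_PER_MILE:,.0f} EV miles")  # 180,000
```

The numbers in the comment are internally consistent under that EV-efficiency assumption.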

    • SuperSpruce@lemmy.zip · 5 months ago

      I don’t like that first article; it gives contradictory information about the energy usage per image, saying 0.29 kWh/image in one place and 0.29 kWh per 1,000 images in another.

      • Riskable@programming.dev · 5 months ago

        The article is way, waaaaaaay off. My PC generates images at a rate of about one per second (SDXL Turbo) with an Nvidia 4060 Ti which uses 160W (~12W when idle). Let’s assume I have it generate images constantly for one hour:

        • 3,600 images per 0.16 kWh
        • About 22,500 images per kWh.

        In other words, generating a single image is a trivial amount of power.
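A quick sketch of the throughput math above, using the comment's own figures (one SDXL Turbo image per second on a 4060 Ti drawing 160 W):

```python
# Sanity-check the "images per kWh" claim from the comment above.
IMAGES_PER_SECOND = 1   # observed SDXL Turbo rate (figure from the comment)
GPU_WATTS = 160         # 4060 Ti power draw under load

images_per_hour = IMAGES_PER_SECOND * 3600
kwh_per_hour = GPU_WATTS / 1000
print(f"{images_per_hour} images per {kwh_per_hour} kWh")          # 3600 per 0.16 kWh
print(f"{images_per_hour / kwh_per_hour:,.0f} images per kWh")     # 22,500
print(f"{kwh_per_hour / images_per_hour * 1000:.3g} Wh per image")  # ~0.044 Wh
```

At ~0.044 Wh per image, this setup lands well below the 0.29 Wh/image figure quoted upthread, which supports the "trivial amount of power" conclusion.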

        • SuperSpruce@lemmy.zip · 5 months ago

          How are you able to generate images so quickly? When I tried to run Stable Diffusion on my nominally comparable GPU (RX 6800M, similar to a 6700 XT, 12 GB VRAM), it took over a minute to get halfway through generating a medium-sized image before my entire computer crashed.

          • merari42@lemmy.world · 5 months ago

            SDXL Turbo, I guess. It needs only one diffusion pass per image, while SD 1.5 needs 8 for most images.

      • merari42@lemmy.world · 5 months ago

        Good point. I just tried it on my M1 MacBook with 16 GB of RAM to get better data (but used SD 1.5). It took roughly 2 minutes to create an image at roughly full power load (which I would conservatively assume to be roughly identical to the 96 W max load of my power adapter). So it’s 3.2 watt-hours per image (with my inefficient setup).
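The MacBook estimate above is a one-liner to verify (a sketch; it assumes, as the comment does, that the machine draws the adapter's full 96 W for the entire 2 minutes):

```python
# Verify the 3.2 Wh/image estimate from the comment above.
ADAPTER_WATTS = 96   # max load of the MacBook power adapter (assumed equal to draw)
MINUTES = 2          # observed generation time for one SD 1.5 image

wh_per_image = ADAPTER_WATTS * MINUTES / 60
print(f"{wh_per_image:.1f} Wh per image")  # 3.2
```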

    • Marcbmann@lemmy.world · 5 months ago

      You are kinda missing the point.

      It’s not about how energy efficient or inefficient a single ChatGPT prompt is.

      It’s that A/C is arguably more important to an individual than your ability to use AI. But while the government asking people to reduce AC usage is not new, AI is.

      So we’re introducing new and unnecessary ways to draw power while asking people to tolerate higher temperatures within their homes.

      My personal take is that we should be investing in nuclear power so we can continue evolving as a society. But I guess we can hold back progress in the name of puttering along with other technology as the world slowly burns and people cook inside their homes.

    • BougieBirdie@lemmy.blahaj.zone · 5 months ago

      Okay, but corpos aren’t training one model and being done with it. They’re training thousands of models, tweaking hyperparameters to find the right fine-tuning.

      Also, putting the scale at 180,000 miles of driving makes it sound more insane to me. The earth’s circumference is like 25,000 miles. If you could drive on the ocean, you could circumnavigate the globe seven times over!

    • meowMix2525@lemm.ee · 5 months ago

      At 300 W for an A100 at full load, that would be 45,000 kWh. This is roughly the energy needed to drive an electric car for 180,000 miles, which is a lot, but still on a reasonable scale.

      My guy. That is over 15 years of daily driving and the occasional long-haul trip, 1.5x the average lifespan of an EV. Consumed in under 2 years. For ONE iteration of ONE AI model. Never mind how many thousands of people are running that “light bulb for slightly less than 35 minutes” every second, with the vast majority of what it spits out not even being used for anything of value except to tell the prompt writer what they need to tweak in order to get their perfect anime waifu out of it.
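The "over 15 years" figure above checks out as a sketch; the ~12,000 miles/year annual mileage is an assumption made here to match the comment's claim, not a figure from the thread:

```python
# Rough check of the "15 years of daily driving" claim above.
TRAINING_KWH = 45_000     # one SDXL training run (figure from the thread)
EV_KWH_PER_MILE = 0.25    # assumed typical EV efficiency
MILES_PER_YEAR = 12_000   # assumed average annual mileage

ev_miles = TRAINING_KWH / EV_KWH_PER_MILE
print(f"{ev_miles:,.0f} miles ≈ {ev_miles / MILES_PER_YEAR:.0f} years of driving")
# 180,000 miles ≈ 15 years of driving
```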

      • merari42@lemmy.world · 5 months ago

        Still not much on an industrial scale. For example, you can compare it to the aviation industry. There are roughly 550 transatlantic flights per day, and each one consumes about 5,000 kg of fuel per hour for 6 to 10 hours straight. A kilogram of Jet A-1 holds roughly 11 kWh, so a single 7-hour transatlantic flight consumes roughly 385,000 kWh of energy. Training one model thus still consumes a lot less energy than a single one of the 550 daily transatlantic flights.
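The flight comparison above can be sketched directly from the comment's figures (the 7-hour flight duration is an assumption picked from the quoted 6-to-10-hour range; it is what makes the 385,000 kWh total come out):

```python
# Check the transatlantic-flight energy figure from the comment above.
FUEL_KG_PER_HOUR = 5_000   # fuel burn (figure from the comment)
FLIGHT_HOURS = 7           # assumed, within the quoted 6-10 h range
KWH_PER_KG = 11            # approximate energy density of Jet A-1

flight_kwh = FUEL_KG_PER_HOUR * FLIGHT_HOURS * KWH_PER_KG
print(f"{flight_kwh:,} kWh per transatlantic flight")  # 385,000
print(f"{flight_kwh / 45_000:.1f}x one SDXL training run")  # 8.6x
```

So one flight carries roughly 8 to 9 training runs' worth of energy, before multiplying by 550 flights per day.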

        • Bytemeister@lemmy.world · 5 months ago

          Not sure why people rip on commercial air travel so much.

          Some “back of the napkin math here”.

          An A380 can hold 84,545 gallons of fuel and has a range of 9,200 miles, giving it a fuel economy of roughly 0.1 MPG…

          Except it can carry 853 people at a time. Even at 1/3 capacity, its fuel economy per person beats the US average for a car with a single occupant (26 mpg). At full capacity it’s around 85 passenger-miles per gallon.
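The per-passenger math above can be sketched as follows (using the comment's capacity and range figures; note that using the unrounded MPG rather than the rounded 0.1 gives a slightly higher full-capacity number than the ~85 quoted):

```python
# Passenger-miles per gallon for an A380, per the figures in the comment above.
FUEL_GALLONS = 84_545   # A380 fuel capacity (figure from the comment)
RANGE_MILES = 9_200     # A380 range (figure from the comment)
SEATS = 853             # maximum passenger capacity

mpg = RANGE_MILES / FUEL_GALLONS
print(f"{mpg:.2f} MPG for the airframe")                      # 0.11
print(f"{mpg * SEATS:.0f} passenger-mi/gal at full capacity")  # 93
print(f"{mpg * SEATS / 3:.0f} at one-third capacity")          # 31 (beats 26 mpg solo car)
```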