The article is way, waaaaaaay off. My PC generates images at a rate of about one per second (SDXL Turbo) with an Nvidia 4060 Ti which uses 160W (~12W when idle). Let’s assume I have it generate images constantly for one hour:
3,600 images per hour, using 0.16 kWh
About 22,500 images per kWh.
In other words, generating a single image takes a trivial amount of energy.
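Spelling out the arithmetic (a quick sketch using only the figures above, treating the 160 W GPU draw as the whole cost; wall power and idle overhead would add a bit):

```python
# Back-of-envelope energy per image at the stated rate and draw.
power_w = 160          # GPU draw while generating (W), per the comment
images_per_second = 1  # SDXL Turbo on a 4060 Ti, per the comment

images_per_hour = images_per_second * 3600            # 3,600 images
kwh_per_hour = power_w / 1000                          # 0.16 kWh
images_per_kwh = images_per_hour / kwh_per_hour        # 22,500 images/kWh
wh_per_image = power_w / (images_per_second * 3600)    # ~0.044 Wh (~160 J)

print(f"{images_per_kwh:.0f} images/kWh, {wh_per_image * 1000:.1f} mWh per image")
```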
How are you able to generate images so quickly? When I tried to run Stable Diffusion on my nominally comparable GPU (RX6800M, similar to 6700XT, 12GB VRAM), it took over a minute to get halfway through generating a medium-sized image before my entire computer crashed.
SDXL Turbo, I guess. It needs only one diffusion pass per image, while SD 1.5 needs 8 for most images.
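For anyone curious, here's a minimal sketch of that single-step setup using Hugging Face diffusers (assuming that's the stack in use; the model ID and parameters follow the stock SDXL Turbo example, not anything specific from the comment above):

```python
import torch
from diffusers import AutoPipelineForText2Image

# SDXL Turbo is distilled so that a single denoising step yields a usable image.
pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/sdxl-turbo", torch_dtype=torch.float16, variant="fp16"
)
pipe.to("cuda")

# One step and no classifier-free guidance (guidance_scale=0.0) is what makes
# roughly one image per second plausible on a mid-range card.
image = pipe(
    prompt="a photo of a red fox in the snow",
    num_inference_steps=1,
    guidance_scale=0.0,
).images[0]
image.save("fox.png")
```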