• stonehopper@lemmy.world
    5 months ago

    Wait a sec, how do they consume water for cooling? I thought it was a closed loop, since its purpose is only transferring heat.

    • thunderfist@lemmy.world
      5 months ago

      Some facilities do this. They're not 100% efficient, though: some water is lost to evaporation, and some must be dumped because it picks up too much mineral content (and too much conductivity) to go back through the cooling system. Reuse is only about 50% efficient (according to Google's numbers).
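
      A minimal sketch of what ~50% reuse efficiency implies for fresh-water draw; the loop volumes here are made-up numbers for illustration, not Google's data:

      ```python
      # Toy model of a cooling loop where only about half the water can be reused
      # each pass; the rest is lost to evaporation or dumped as mineral-heavy blowdown.
      # All specific quantities are illustrative assumptions.

      fresh_water_l = 1_000   # litres of fresh makeup water drawn (assumption)
      reuse_fraction = 0.5    # fraction that survives a pass and recirculates

      total_circulated = 0.0
      remaining = float(fresh_water_l)
      while remaining > 1:                 # follow the water until it is effectively gone
          total_circulated += remaining
          remaining *= reuse_fraction

      print(f"{fresh_water_l} L drawn -> ~{total_circulated:,.0f} L of total cooling passes")
      # The geometric series sums to fresh / (1 - reuse), so 50% reuse means each
      # litre drawn only does about two litres' worth of work before it's gone.
      ```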

    • Krauerking@lemy.lol
      5 months ago

      Half a liter per kilowatt-hour. That's the average water use.

      It’s like the idea of recycling plastics, but with water: not all of it is reusable to the same degree. A good portion of the water has to be evaporated off to cool the exterior towers, and the rest isn’t infinitely reusable in these loops either; it gets gross or full of dissolved material.

      Another thing people need to remember is that generating the electricity uses water too, since few generation methods avoid it. We are not on a green grid, and for the most part neither are these huge data centers. We boil water to make the electricity, then have to use more to clean the system afterwards.
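
      A rough back-of-the-envelope from the half-litre-per-kWh figure above; the facility size is a hypothetical assumption, not a real data center:

      ```python
      # Cooling water implied by ~0.5 L/kWh for a hypothetical facility.
      # The 20 MW draw and 24/7 operation are assumptions for illustration only.

      WATER_PER_KWH_L = 0.5        # average cooling water use cited above
      FACILITY_POWER_MW = 20       # hypothetical data center power draw (assumption)
      HOURS_PER_DAY = 24

      energy_kwh_per_day = FACILITY_POWER_MW * 1_000 * HOURS_PER_DAY
      water_l_per_day = energy_kwh_per_day * WATER_PER_KWH_L

      print(f"Energy: {energy_kwh_per_day:,.0f} kWh/day")
      print(f"Cooling water: {water_l_per_day:,.0f} L/day (~{water_l_per_day / 1_000:,.0f} m³/day)")
      # And that's just the cooling side; as noted above, generating the
      # electricity consumes water as well.
      ```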

    • scutiger@lemmy.world
      5 months ago

      On a standard PC, you can easily run a closed loop because the radiator is big enough to exhaust all that heat. But when your computer or cluster puts out multiple thousands of watts of heat, eventually you need to get rid of the hot water and replace it with cold water. And when it gets even hotter, you need a steady stream of cold water that immediately gets dumped.
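
      To put rough numbers on that, here's a minimal sketch of the flow a once-through setup would need, using Q = m_dot * c * dT; the 100 kW load and 10 °C temperature rise are assumed example values:

      ```python
      # Water flow needed to carry away a given heat load in a once-through setup,
      # from Q = m_dot * c * dT. The load and temperature rise are assumptions.

      SPECIFIC_HEAT_WATER = 4186   # J/(kg*K)
      heat_load_w = 100_000        # 100 kW cluster (assumption)
      temp_rise_c = 10.0           # allowed coolant temperature rise (assumption)

      flow_kg_per_s = heat_load_w / (SPECIFIC_HEAT_WATER * temp_rise_c)
      print(f"Required flow: ~{flow_kg_per_s:.1f} kg/s (~{flow_kg_per_s * 3.6:.1f} m³/hour)")
      # ~2.4 kg/s, i.e. a steady couple of litres of cold water every second
      # just for this one modest cluster.
      ```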