Discussion about this post

Will G.:

Love this!

suman suhag:

Technically, water is not “consumed” (used up), even when people drink it. It makes its way back to nature, eventually to the oceans, where it evaporates to become rain-clouds providing fresh water.

AI (and all massive computer facilities) uses water for cooling, returning it to nature much warmer than when it was obtained. In essence, the water is not being “consumed”; it is conveying the “consumed” concentrated electrical energy away as dispersed heat. Shrinking the old, massive “vacuum tube” computer facilities down to tiny microchips reduced the heat produced per calculation by a factor of roughly a trillion, but we now perform millions of trillions of times as many calculations as in the 1960s.
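The “factor of a trillion” holds up as an order of magnitude. A minimal back-of-envelope sketch, using illustrative figures I am supplying (ENIAC’s published power draw and operation rate, and a rough per-operation energy for a modern accelerator), not numbers from the comment:

```python
# Back-of-envelope check of the "factor of a trillion" claim.
# Illustrative figures (assumptions, not from the comment):
#   ENIAC (1940s): ~150 kW to perform ~5,000 additions per second.
#   Modern accelerator: ~10 picojoules per arithmetic operation.
eniac_joules_per_op = 150_000 / 5_000   # ~30 J per operation
modern_joules_per_op = 10e-12           # ~10 pJ per operation

improvement = eniac_joules_per_op / modern_joules_per_op
print(f"energy per operation improved ~{improvement:.0e}x")  # → ~3e+12
```

Roughly 10^12, i.e. the trillion-fold improvement the comment cites; the exact ratio shifts by an order of magnitude or so depending on which modern chip you pick.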

So we must employ revised architectures, devised specifically for AI-style calculations, in order to reduce the “heat footprint” those calculations impose on the environment. These architectures, down at the microscopic chip level, need to distribute the input data throughout the compute fabric rather than continually shuttle it to and from RAM over data buses (an activity responsible for roughly 95% of the energy expended in typical waves of matrix calculations).
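The ~95% figure is plausible under commonly cited per-operation energy estimates (the specific picojoule values below are my assumptions, drawn from order-of-magnitude figures for older process nodes; real numbers vary by chip). A sketch of the worst case, a matrix multiply whose inner loop has no on-chip operand reuse:

```python
# Rough energy split for a matrix multiply with no on-chip data reuse.
# Per-operation energies are order-of-magnitude assumptions, not measurements:
pj_dram_read_32b = 640.0  # fetch one 32-bit word from off-chip DRAM
pj_fp32_fma = 4.6         # one 32-bit multiply-accumulate on-chip

# Naive inner loop: each multiply-accumulate fetches two operands from RAM.
pj_movement = 2 * pj_dram_read_32b
pj_compute = pj_fp32_fma
movement_share = pj_movement / (pj_movement + pj_compute)
print(f"data movement: {movement_share:.1%} of energy")  # → 99.6%
```

With these numbers data movement dominates even more heavily than the comment’s 95%; caches and on-chip reuse pull the share down, which is exactly the lever the distributed compute-fabric architectures described above are pulling.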

But I’m not worried about the water - it’s not going anywhere. It just needs to be cooled without harming the ecosystem.
