Why doesn’t AI Usage include CO2e from water consumption to cool data centres?
There is also concern about water consumption in relation to AI. One study puts this at roughly 500 ml for each text-based conversation, defined as roughly 20–50 queries with medium-length responses of 150–300 words[1]. This figure includes water evaporated on-site to cool the servers (via cooling towers) and water evaporated at the power plant to generate the electricity that runs the servers. The estimate does not include water used for the initial training of the model, although this is also discussed in the paper.
Although this research exists, its scope does not extend to video and image responses. It does, however, include water used to generate the electricity that powers and cools the servers. So although users can measure electricity used for various activities on a production (e.g. powering a studio or an office), the water consumed in generating that electricity is not captured by those measurements.
Therefore, to apply the research, we would need to understand how much of the 500 ml relates to each individual inference, and apply only that amount. In addition, we would need to make further assumptions about the amount of water required for an image or video conversation.
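To illustrate why per-inference apportionment is problematic, the sketch below naively splits the study's ~500 ml evenly across its assumed 20–50 queries per conversation. The even-split assumption and variable names are ours, not the paper's; real per-query water use would vary with response length, hardware, cooling method, and grid mix.

```python
# Illustrative sketch only: evenly apportioning the study's ~500 ml per
# conversation across the 20-50 queries it assumes. The even split is our
# simplifying assumption, not a figure from the paper.

ML_PER_CONVERSATION = 500            # study's rough estimate (ml)
QUERIES_LOW, QUERIES_HIGH = 20, 50   # queries per conversation, per the study

def water_per_query_ml(queries_per_conversation: int) -> float:
    """Naively divide the conversation total evenly across queries."""
    return ML_PER_CONVERSATION / queries_per_conversation

# Even this crude split yields a 2.5x spread per query:
low = water_per_query_ml(QUERIES_HIGH)   # 500 / 50 = 10.0 ml
high = water_per_query_ml(QUERIES_LOW)   # 500 / 20 = 25.0 ml
print(f"Estimated water per text query: {low:.0f}-{high:.0f} ml")
```

Even under this simplest possible assumption the per-query figure spans 10–25 ml, and extending it to image or video generation would require further assumptions the research does not support.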
It was concluded that too many assumptions would need to be made to estimate the water used per generated response. Therefore, at present the AI calculations do not include the CO2e impact of water consumption related to cooling data centres.
It should be noted, however, that as more data centre operators move towards direct-to-chip liquid cooling or, even better, immersion cooling, we’re hopeful that this water consumption will fall in the coming months and years – especially as regulatory pressure increases.
[1] University of California, Riverside, "Making AI Less 'Thirsty': Uncovering and Addressing the Secret Water Footprint of AI Models", April 2023.