Why doesn’t AI Usage include co2e from the electricity consumed in the initial training phase for the model?
The initial training phase generates a finite amount of CO2e. This would need to be distributed equally across every response generated over the lifetime of the model, and that lifetime total of responses is impossible to quantify.
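To illustrate the problem, here is a minimal sketch using an entirely hypothetical training emissions figure: the per-response share of training CO2e depends on a lifetime response count that cannot be known in advance, so any single number would be arbitrary.

```python
# Hypothetical illustration only: amortising training emissions per response
# requires dividing by a lifetime response count that is unknowable.

TRAINING_CO2E_KG = 500_000  # hypothetical total CO2e from the initial training run

def training_co2e_per_response(lifetime_responses: int) -> float:
    """Share of training emissions attributed to a single response."""
    return TRAINING_CO2E_KG / lifetime_responses

# The per-response share changes by orders of magnitude with the assumed lifetime:
for responses in (10**6, 10**9, 10**12):
    share = training_co2e_per_response(responses)
    print(f"{responses:>15,d} responses -> {share:.9f} kg CO2e per response")
```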
Additionally, Hiili feels that these emissions should not be attributed to the users of AI services: the emissions from training the models belong to the AI services that undertake that training in order to offer the service. This is in line with the scope of the other activities measured in the tool, where the use of an item is covered but not its manufacture. For example, with car travel we measure the fuel burned, not the manufacture of the car itself.
In practice, inference can account for up to 90% of the total energy consumed over a model’s lifecycle [1, 2, 3].
[1] Amazon EC2 Update – Inf1 Instances with AWS Inferentia Chips for High Performance Cost-Effective Inferencing — AWS News Blog. Dec. 3, 2019. URL: https://aws.amazon.com/blogs/aws/amazon-ec2-update-inf1-instances-with-aws-inferentia-chips-for-high-performance-cost-effective-inferencing/
[2] Noelle Walsh. How Microsoft Measures Datacenter Water and Energy Use to Improve Azure Cloud Sustainability. Microsoft Azure Blog. Apr. 22, 2022. URL: https://azure.microsoft.com/en-us/blog/how-microsoft-measures-datacenter-water-and-energy-use-to-improve-azure-cloud-sustainability/
[3] Carole-Jean Wu et al. “Sustainable AI: Environmental Implications, Challenges and Opportunities”. In: Proceedings of Machine Learning and Systems (MLSys 2022).