AI-related energy use arises at multiple stages, including model training, fine-tuning, inference, storage, networking, and supporting infrastructure such as cooling and power conversion. Published estimates of energy use per AI request vary widely across models, tasks, and measurement methods. In one benchmark study, simple classification tasks consumed about 0.002–0.007 Wh per prompt on average (roughly 9% of a smartphone charge per 1,000 prompts), while text generation and text summarisation each used about 0.05 Wh per prompt; image generation averaged 2.91 Wh per prompt, and the least efficient image model in the study used 11.49 Wh per image (roughly half a smartphone charge). Comparisons between AI systems and human labour for specific tasks have produced mixed results and remain sensitive to assumptions about output quality, workload, and system boundaries. A 2024 study in Scientific Reports estimated that, under its assumptions, selected AI systems produced 130 to 2,900 times lower carbon emissions than human writers and illustrators performing the same tasks. A later Scientific Reports paper reported a counterexample for programming tasks, finding that, under its assumptions, the evaluated AI system produced 5 to 19 times higher emissions than human programmers on the benchmark used in that study.
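The per-prompt figures above can be put on a common footing with a short calculation. The sketch below assumes a hypothetical smartphone battery capacity of about 23 Wh, chosen only to be consistent with the "11.49 Wh ≈ half a charge" comparison quoted above; the task names and the conversion helper are illustrative, not from the benchmark itself.

```python
# Convert the per-prompt energy figures quoted above into
# smartphone-charge equivalents. The battery capacity is an
# assumption (~23 Wh), consistent with "11.49 Wh ~ half a charge".

SMARTPHONE_BATTERY_WH = 23.0  # assumed capacity, not a benchmark value

def charge_equivalents(wh_per_prompt: float, prompts: int = 1) -> float:
    """Fraction of smartphone charges consumed by `prompts` requests."""
    return wh_per_prompt * prompts / SMARTPHONE_BATTERY_WH

# Per-prompt figures from the benchmark cited above (Wh):
tasks = {
    "classification (low end)": 0.002,
    "text generation": 0.05,
    "image generation (mean)": 2.91,
    "image generation (worst model)": 11.49,
}

for task, wh in tasks.items():
    print(f"{task}: {charge_equivalents(wh, 1000):.2f} charges per 1,000 prompts")
```

Under this assumed capacity, 1,000 low-end classification prompts come to about 0.09 of a charge, matching the roughly 9% figure quoted above.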
== System-level energy use and efficiency ==
AI electricity intensity depends not only on model architecture but also on hardware and facility efficiency. Data-centre operators commonly report
Power usage effectiveness (PUE), which measures the ratio of total facility energy to
IT equipment energy; a lower PUE indicates less overhead energy for cooling and other supporting infrastructure. The International Energy Agency has also reported that data centres remain a relatively small share of global electricity use overall, but that their local effects can be much more pronounced because demand is geographically concentrated. Accounting methods that include upstream or embodied impacts, such as hardware manufacture and facilities construction, can materially affect estimates of AI-related emissions.
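As defined above, PUE is simply total facility energy divided by IT equipment energy; a minimal sketch, using hypothetical meter readings:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy.

    A value of 1.0 would mean zero overhead; lower values indicate less
    energy spent on cooling, power conversion, and other support systems.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual meter readings (kWh), for illustration only:
it_load = 10_000_000   # servers, storage, networking
overhead = 2_000_000   # cooling, power conversion, lighting
print(pue(it_load + overhead, it_load))  # prints 1.2
```

In this illustrative case, one fifth of the IT load again is spent on overhead, giving a PUE of 1.2.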
== Decisions and strategies by individual companies ==
Large technology companies have reported that the expansion of AI and
cloud infrastructure affects their sustainability targets, electricity demand, and resource use. Google, for example, attributed part of its emissions growth in 2023 to increased data-centre energy consumption and supply-chain emissions in its 2024 environmental report. Cloud and AI companies have also announced measures intended to reduce environmental impacts, including investment in more efficient hardware, low-carbon electricity procurement, alternative cooling systems, and water
stewardship programmes. The extent, comparability, and third-party verification of such disclosures vary between firms and jurisdictions.

== Water usage ==