Energy & Sustainability
The environmental footprint of AI has become impossible to ignore. Training a single large language model can consume as much energy as several households use in a year and emit hundreds of tonnes of CO2. As adoption accelerates, with more models, larger models, and more inference requests, energy demands are growing rapidly. Data centres already account for roughly 1-2% of global electricity consumption, and AI workloads are among their fastest-growing components.

This creates a tension between the push for ever-more-capable AI and the urgent need to cut carbon emissions and manage resource consumption. The industry's response has been mixed: major tech companies have made ambitious renewable energy commitments, yet some have seen their actual emissions rise as AI demand outpaces their sustainability efforts. Water consumption for cooling data centres has emerged as a particular concern in water-stressed regions.

Understanding these environmental costs is essential to making informed decisions about when and how to use AI. Not every problem needs a billion-parameter model, and environmental cost should factor into technology choices alongside performance, price, and other considerations.
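The scale of these emissions can be sanity-checked with a back-of-envelope calculation: energy is GPU count times per-GPU power times wall-clock hours, inflated by the data centre's power usage effectiveness (PUE), then multiplied by the grid's carbon intensity. The sketch below uses purely illustrative figures (the GPU power draw, cluster size, PUE, and grid intensity are assumptions, not measurements of any real training run):

```python
def training_emissions_kg(num_gpus, gpu_power_kw, hours, pue, grid_kg_per_kwh):
    """Rough CO2 estimate (kg) for a training run.

    energy (kWh)    = GPUs * power per GPU (kW) * hours * PUE
    emissions (kg)  = energy * grid carbon intensity (kg CO2 per kWh)
    """
    energy_kwh = num_gpus * gpu_power_kw * hours * pue
    return energy_kwh * grid_kg_per_kwh


# Illustrative scenario: 1,000 GPUs drawing 0.4 kW each for 30 days,
# PUE of 1.2, grid intensity of 0.4 kg CO2/kWh -- all assumed values.
kg = training_emissions_kg(1000, 0.4, 30 * 24, 1.2, 0.4)
print(f"{kg / 1000:.0f} tonnes CO2")  # prints "138 tonnes CO2"
```

Even this crude model lands in the "hundreds of tonnes" range the text describes, and it shows why the same run emits far less on a low-carbon grid: the final multiplier can vary several-fold between regions.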