Unveiling AI’s Massive Energy Consumption and Its Future Impact
AI's energy consumption extends far beyond simple chatbot queries, with video generation demanding orders of magnitude more power. This article explores AI's current energy use and the broader environmental challenges it poses.
Measuring AI's Energy Usage
After months of investigation, our story on AI's energy consumption and emissions burden was published last week. The original goal was straightforward: calculate the energy used every time a chatbot is queried, then scale that figure up to understand why AI companies and government officials are racing to secure unprecedented amounts of electricity to power AI, with consequences that could reshape energy grids.
AI Technology Is Still Emerging
Our research focused on the energy used for chatbot interactions, image generation, and AI video creation. However, these represent only a small fraction of AI's future energy demands. Many companies are developing complex reasoning models that take longer to run and consume more energy. Additionally, hardware devices that run AI continuously in the background, digital assistants, and AI clones are pushing energy consumption higher. This helps explain the enormous investments by companies like OpenAI in energy infrastructure.
At the same time, AI is still in its infancy, meaning improvements in models, chips, and cooling technologies could lead to greater efficiency, altering this trajectory.
The Energy Intensity of AI Video
Our tests revealed that creating even a low-quality, five-second AI-generated video consumes 42,000 times more energy than a chatbot answering a simple question, enough on its own to power a microwave for over an hour. Google recently introduced its new Veo video model, and although exact energy data was not disclosed, it likely demands even more energy than the models we tested. Widespread AI video, especially if it becomes cheap and ubiquitous on social platforms, could significantly increase emissions.
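As a rough back-of-envelope check on that comparison (the roughly 1,000-watt microwave figure below is an assumption for illustration, not a number from the reporting): an hour of microwave use works out to about 1 kilowatt-hour, or 3.6 megajoules, and dividing that by the 42,000-to-1 ratio puts a single chatbot answer somewhere in the tens of joules.

$$
1\ \text{kWh} \approx 3.6 \times 10^{6}\ \text{J}, \qquad \frac{3.6 \times 10^{6}\ \text{J}}{42{,}000} \approx 86\ \text{J} \approx 0.024\ \text{Wh per chatbot reply}
$$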
Beyond Individual Footprints: Larger Environmental Concerns
While individuals might worry about the carbon footprint of their own AI usage, most chatbot interactions have minimal impact; video generation may contribute more. The broader environmental issues loom larger. Data centers in Nevada, drawn there by tax incentives, withdraw significant amounts of water, straining local ecosystems. Meta's large data center in Louisiana relies heavily on natural gas despite the company's clean energy commitments. And nuclear power is not the straightforward solution AI companies often claim it to be.
The real challenge lies in understanding global energy policies, resource access, and the opaque nature of AI companies’ energy consumption disclosures. These factors, rather than individual usage, are critical in assessing AI’s environmental impact.
This article was originally published in The Algorithm newsletter. To receive similar stories promptly, consider subscribing.