Simply put
- A recent study indicates that by the end of 2025, the energy demand of AI is expected to surpass that of Bitcoin mining.
- Unlike Bitcoin, whose energy consumption can be estimated fairly directly, major tech firms like Google and Microsoft disclose few specifics about AI-related energy use, mostly reporting broader emissions figures tied to their AI activities.
- The research estimates that Nvidia alone accounted for 44% to 48% of TSMC's advanced chip-packaging (CoWoS) capacity in 2023 and 2024, with TSMC planning to expand that capacity further in 2025.
Remember back in 2021 when Bitcoin prices plummeted after Tesla announced it would stop accepting the cryptocurrency due to environmental concerns? People were really worried about the ecological footprint of proof-of-work mining. That sentiment hasn’t disappeared.
Right now, xAI is pushing ahead with an ambitious project to build the world's largest AI supercluster, and the effort is generating buzz amid growing governmental interest. Regulations are being drafted to support AI innovation, but the conversation around energy consumption is, well, quite sparse.
A new study published in a peer-reviewed journal suggests that by the end of 2025, AI could account for nearly half of all global data center electricity use.
Alex de Vries-Gao, a PhD candidate at Vrije Universiteit Amsterdam and a longtime critic of Bitcoin's energy use, estimates that AI's power demand could reach around 23 gigawatts by the end of the year, whereas Bitcoin mining currently consumes approximately 176 terawatt-hours of electricity annually.
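To put those two figures on a common footing, here is a quick back-of-the-envelope conversion. It is a minimal sketch that assumes the gigawatt figure represents continuous average draw and ignores utilization patterns; only the 23 GW and 176 TWh numbers come from the article.

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def gw_to_twh_per_year(gigawatts: float) -> float:
    """Annual energy if this power draw ran continuously (GW -> TWh/year)."""
    return gigawatts * HOURS_PER_YEAR / 1_000

def twh_per_year_to_gw(twh_per_year: float) -> float:
    """Average continuous power implied by an annual energy figure (TWh/year -> GW)."""
    return twh_per_year * 1_000 / HOURS_PER_YEAR

ai_demand_gw = 23           # projected AI power demand from the study
bitcoin_twh_per_year = 176  # Bitcoin's estimated annual electricity use

print(f"23 GW of AI demand ~= {gw_to_twh_per_year(ai_demand_gw):.0f} TWh per year")
print(f"176 TWh/year of mining ~= {twh_per_year_to_gw(bitcoin_twh_per_year):.1f} GW on average")
```

By that rough math, 23 gigawatts of continuous demand works out to roughly 200 terawatt-hours a year, comfortably above Bitcoin's estimated 176, which is the comparison behind the headline claim.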
De Vries-Gao shared insights on LinkedIn, noting that major corporations like Google acknowledge a "power capacity crisis" as they try to expand their data center footprints, yet they don't break out detailed figures for the energy their AI workloads consume.
He has remarked that since the advent of ChatGPT, tracking AI's actual energy usage has become increasingly difficult. AI's power demand is not as transparent as Bitcoin's, which can be estimated fairly readily from public network data. Meanwhile, companies like Microsoft and Google are raising alarms about rising electricity use and carbon emissions, but offer little clarity on how much of that is driven specifically by AI.
Despite these concerns, the available data has mostly been reported in aggregate, making it hard to isolate AI's share. To get around this, De Vries-Gao tracked the chip-packaging capacity of Taiwanese semiconductor manufacturers, whose technology is a critical step in producing advanced AI chips.
He explained his methodology with a relatable analogy about business cards: if you know how many cards fit on a sheet and how many sheets the printer can handle, you can work out the total output. He applied the same reasoning to semiconductors, citing statements from TSMC executives acknowledging that their manufacturing capacity is tightly constrained and that they cannot meet all customer demand.
His findings suggest that Nvidia alone accounted for an estimated 44% to 48% of TSMC's CoWoS (advanced packaging) capacity in 2023 and 2024. Combined with AMD's output, the AI hardware produced from that capacity would draw about 3.8 gigawatts of power, and that is before counting other manufacturers.
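To make that business-card logic concrete, here is a minimal sketch of the kind of bottom-up estimate involved. Only the 44% to 48% capacity share comes from the article; the wafer volumes, accelerators per wafer, and per-device power draw below are placeholder assumptions for illustration, not figures from the study.

```python
# Placeholder assumptions for illustration; not figures from the study.
wafers_per_month = 15_000        # assumed CoWoS packaging capacity
accelerators_per_wafer = 25      # assumed usable AI accelerators per packaged wafer
months_of_production = 24        # e.g. 2023 and 2024 combined
nvidia_capacity_share = 0.46     # midpoint of the 44-48% share attributed to Nvidia
power_per_accelerator_kw = 1.0   # assumed draw per device, including server overhead

accelerators = (wafers_per_month * months_of_production
                * accelerators_per_wafer * nvidia_capacity_share)
total_power_gw = accelerators * power_per_accelerator_kw / 1_000_000  # kW -> GW

print(f"~{accelerators / 1e6:.1f} million accelerators")
print(f"~{total_power_gw:.1f} GW if all of them ran at once")
```

The actual analysis plugs in reported CoWoS capacity and real chip specifications rather than placeholders, but the mechanics are the same: packaging capacity goes in, an upper bound on AI power demand comes out.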
Looking ahead, De Vries-Gao projects that AI's power demand could hit 23 gigawatts by the end of 2025, with TSMC planning to expand its CoWoS capacity further. That points to a continued rise in electricity demand, a trend underscored by Nvidia and AMD reporting record earnings and by OpenAI unveiling a $500 billion data center project. Quite frankly, AI is shaping up to be the most lucrative sector in tech right now.
So, as things stand, it looks like the environment might have to take a back seat for a while.



