
Washington, United States (Enmaeya News) — Major tech companies are spending huge amounts of money on artificial intelligence, raising concerns about energy use and environmental impact.
Meta and Microsoft have struck deals for nuclear power to run their data centers. OpenAI, announced with the backing of President Donald Trump, unveiled its “Stargate” project, a plan to spend $500 billion on up to 10 data centers, each of which could use more electricity than the entire state of New Hampshire.
Apple also plans to spend $500 billion on U.S. manufacturing and data centers over the next four years, and Google expects to spend $75 billion on AI infrastructure in 2025 alone. Experts say this level of spending is unlike anything seen before in technology. It also marks a break with the recent past: between 2005 and 2017, data center electricity use stayed mostly flat, even as internet services like Facebook and Netflix grew rapidly, thanks to ever more efficient hardware.
Since 2017, AI has changed that. Modern data centers are filled with more power-hungry hardware, and their electricity consumption had doubled by 2023. Research shows that U.S. data centers now account for about 4.4% of the country’s total electricity use.
Experts warn that this may be just the beginning. By 2028, more than half of all electricity used in data centers could go to AI applications alone—roughly 22% of the electricity used by all U.S. households.
Training AI models takes massive computing power: billions of calculations run on thousands of high-performance GPUs, CPUs, and specialized AI chips such as TPUs. These machines often run nonstop for weeks or even months.
Only a few large companies can afford these costs. Smaller firms, often working with less efficient hardware, take longer to train their models, which can consume more energy overall. For example, training GPT-3 used about 1,287 megawatt-hours of electricity and produced 502 metric tons of CO₂, equal to the yearly emissions of 112 cars.
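As a quick sanity check, the two figures reported for GPT-3 can be combined to see what they imply. This is a back-of-the-envelope sketch: the 1,287 MWh and 502 metric tons come from the reporting above, while the derived grid intensity and per-car figures are our own arithmetic, not official numbers.

```python
# Back-of-the-envelope check of the cited GPT-3 training figures.
energy_mwh = 1_287     # electricity used to train GPT-3 (cited figure)
emissions_tons = 502   # resulting CO2 in metric tons (cited figure)
num_cars = 112         # cars in the yearly-emissions comparison (cited figure)

# Implied carbon intensity of the electricity used (kg CO2 per kWh):
intensity_kg_per_kwh = (emissions_tons * 1_000) / (energy_mwh * 1_000)

# Implied yearly emissions per car in the comparison (metric tons):
per_car_tons = emissions_tons / num_cars

print(f"Implied grid intensity: {intensity_kg_per_kwh:.2f} kg CO2/kWh")
print(f"Implied per-car emissions: {per_car_tons:.1f} t CO2/year")
```

Both derived values land close to commonly used U.S. averages (roughly 0.4 kg CO₂ per kWh of grid electricity and about 4.5 metric tons of CO₂ per passenger car per year), which suggests the cited figures are internally consistent.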
Worldwide, data centers used 460 terawatt-hours of electricity in 2022. By 2026, that could exceed 1,000 terawatt-hours, about one-third of all electricity generated by nuclear reactors globally. China’s data centers could reach 400 terawatt-hours by 2030, while the U.S. Northeast expects rising electricity demand from AI centers.
Cooling adds still more demand: cooling systems can account for up to 40% of a data center’s electricity use, and the water they require can strain local supplies. In The Dalles, Oregon, Google’s data centers use over 25% of the city’s water.
Space is another challenge. Switch Tahoe Reno, one of the largest U.S. data centers, covers 669,000 square meters, larger than the Pentagon. Globally, the number of data centers grew from under 8,000 in January 2021 to almost 11,000 by November 2023, and analysts expect that growth to continue as AI demand rises.


