As artificial intelligence becomes central to everything from social media and search engines to virtual assistants and enterprise tools, tech giants like Google and Meta are making record-breaking investments in the infrastructure needed to power it all: massive data centers dubbed "AI factories."
Google has announced a $25 billion investment to build new data centers and AI infrastructure across the PJM Interconnection — the largest electric grid in the United States, which spans 13 states including Pennsylvania, Ohio, and New Jersey. These facilities will support Google’s AI workloads and cloud services, such as its Gemini models, and are expected to handle everything from video streaming and photo sharing to AI research.
To support the immense energy demands of these centers, Google is also investing $3 billion in hydropower, a renewable energy source, reinforcing its goal of running on carbon-free energy by 2030.
“Data centers are a critical part of the AI production process and its deployment. Think of them as AI factories,” said Ramayya Krishnan, professor of management science at Carnegie Mellon University.
Meanwhile, Meta CEO Mark Zuckerberg has shared even more ambitious plans. He announced that Meta will spend “hundreds of billions of dollars” to develop computing infrastructure for artificial general intelligence (AGI) — AI systems that could one day exceed human capabilities.
Among the facilities in development are “Prometheus,” expected to launch in 2026, and “Hyperion,” which could scale up to 5 gigawatts (GW) of power over time, comparable to the electricity demand of a small country. Zuckerberg even shared a map showing that a single Meta data center would have a footprint large enough to cover a sizable portion of Manhattan.
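For a rough sense of scale, a hedged back-of-envelope calculation helps: the 5 GW figure comes from the announcement, but the "full load" assumption and the country comparison below are illustrative only, not numbers disclosed by Meta.

```python
# Back-of-envelope: annual energy use of a 5 GW facility running continuously.
# The 5 GW figure is the reported headline capacity; everything else is an assumption.

facility_power_gw = 5          # reported peak capacity for "Hyperion"
hours_per_year = 24 * 365      # ~8,760 hours

annual_energy_gwh = facility_power_gw * hours_per_year   # gigawatt-hours
annual_energy_twh = annual_energy_gwh / 1000              # terawatt-hours

print(f"Annual energy at full load: {annual_energy_twh:.1f} TWh")
# ~43.8 TWh per year, in the same range as the yearly electricity
# consumption of a small country (comparison for scale only).
```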
Earlier this year, Meta also unveiled a 2 GW facility under construction in Louisiana, part of its broader strategy to build scalable, AI-optimized supercomputing clusters across North America.
Data centers are the physical backbone of the digital world — housing servers, cooling systems, networking gear, and backup power to support cloud computing, real-time AI operations, storage, and app delivery. With the rise of generative AI and models like ChatGPT, Gemini, and Meta AI, demand for high-performance, energy-intensive data infrastructure is skyrocketing.
Unlike traditional data centers, these new AI-focused facilities are built to move massive volumes of data and pack in far denser, more power-hungry compute, making them essential to training and deploying next-generation models.
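To connect that compute density to the gigawatt figures above, here is a minimal sketch of how facility power scales with accelerator count; the GPU count, per-accelerator draw, and power usage effectiveness (PUE) are assumptions chosen for illustration, not figures from Google or Meta.

```python
# Rough estimate of facility power for a large AI training cluster.
# All inputs are illustrative assumptions for scale, not vendor figures.

num_accelerators = 500_000        # assumed GPU/accelerator count
watts_per_accelerator = 1_000     # assumed draw per accelerator, incl. host share
pue = 1.3                         # assumed power usage effectiveness (cooling, losses)

it_load_mw = num_accelerators * watts_per_accelerator / 1e6   # megawatts of IT load
facility_load_mw = it_load_mw * pue                           # total draw at the meter

print(f"IT load: {it_load_mw:.0f} MW, facility load: {facility_load_mw:.0f} MW")
# ~500 MW of IT load becomes ~650 MW at the meter under these assumptions,
# which is why multi-gigawatt sites come up in frontier-scale training plans.
```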
While these new AI hubs could drive local economic growth and seed tech ecosystems in nearby areas, they also raise environmental and community concerns. A recent survey by Airedale by Modine found that 70% of Americans wouldn’t mind living near a data center, citing job opportunities and infrastructure benefits, while those opposed named energy strain, noise, and falling property values as their top concerns.
“Data centers could raise energy prices for residential customers if the energy supply is limited, and they also use significant amounts of water,” Krishnan said. “But on the positive side, they can create an ecosystem of partners, increasing employment opportunities and driving regional growth.”
As AI tools like Meta AI, Gemini, and ChatGPT become integrated into daily life, the need for powerful and sustainable data centers will only grow. For companies like Google and Meta, controlling the infrastructure means faster innovation, better data security, and the ability to scale AI products globally.
The race to build AI infrastructure isn't just about technology — it’s about shaping the economy, energy landscape, and communities for decades to come.