Dawn Stover, Bulletin of the Atomic Scientists
When Microsoft bought a 407-acre pumpkin farm in Mount Pleasant, Wisconsin, it wasn’t to grow Halloween jack-o’-lanterns. Microsoft is growing data centers—networked computer servers that store, retrieve, and process information. And those data centers have a growing appetite for electricity.
Microsoft paid a whopping $76 million for the pumpkin farm, which was assessed at a value of about $600,000. The company, which has since bought other nearby properties to expand its footprint to two square miles, says it will spend $3.3 billion to build its 2-million-square-foot Wisconsin data center and equip it with the specialized computer processors used for artificial intelligence (AI).
Microsoft and OpenAI, maker of the ChatGPT bot, have talked about building a linked network of five data centers—the Wisconsin facility plus four others in California, Texas, Virginia, and Brazil. Together they would constitute a massive supercomputer, dubbed Stargate, that could ultimately cost more than $100 billion and require five gigawatts of electricity, or the equivalent of the output of five average-size nuclear power plants.
Microsoft, Amazon, Apple, Google, Meta, and other major tech companies are investing heavily in data centers, particularly “hyperscale” data centers that are massive not only in physical size but also in their capacity for data-intensive processing tasks such as generating AI responses. A single hyperscale data center can consume as much electricity as tens or hundreds of thousands of homes, and there are already hundreds of these centers in the United States, plus thousands of smaller data centers.