Google’s quiet move to plant a multi-billion-dollar stake in a rural corner of Hermantown has brought the reality of a global technological revolution to the Head of the Lakes.
Controversy over the development centers on the aesthetics of a massive warehouse-style complex, its impact on the environment and local real estate values, and the fact that city and county staff and some elected officials were forced to sign Non-Disclosure Agreements (NDAs) with the project developer, keeping the public unaware of the planned development. That last point has prompted the Minnesota Legislature this session to draft legislation banning NDAs for public officials. In the woods of Northeastern Minnesota, transparency is usually the rule of the trail. In Hermantown, it hid behind a corporate desire for silence.
SO, WHAT IS AI?
The concept of AI has been around for centuries in myth and fiction, but as a scientific field, it is about 70 years old. You might remember the movie 2001: A Space Odyssey, in which the onboard computer HAL used its artificial intelligence to take over the spaceship.
AI’s transition from science fiction to science fact happened in the mid-20th century. In 1950, Alan Turing, the British mathematician widely considered the godfather of theoretical computer science and artificial intelligence, published a seminal paper asking, “Can machines think?” and proposed the Turing Test to measure machine intelligence.
The term “Artificial Intelligence” was officially coined by John McCarthy at the Dartmouth Summer Research Project in 1956. In 1997, IBM’s Deep Blue used its artificial intelligence to outplay world chess champion Garry Kasparov. In the 2010s, the “Deep Learning” revolution began, fueled by massive amounts of data, powerful new computer chips, and significant capital investment, leading to the generative AI we see today.
If you’ve ever asked Alexa for the weather or told Siri to set a timer, you’ve engaged with a Narrow AI. These systems aren’t just listening; they recognize your speech, convert it into text a computer can read, and use Natural Language Understanding to decipher the intent behind your words. They are designed to improve over time, applying Machine Learning to millions of user interactions to better understand accents and dialects, including your particular Minnesota accent. Generative AI goes a step further, using Large Language Models (LLMs) to hold natural, free-flowing conversations rather than following rigid scripts.
AI is not just a faster search engine. While traditional computers follow a rigid “If X, then Y” rulebook, Generative AI uses Neural Networks, digital architectures inspired by the human brain, to identify patterns. It doesn’t just “search”; it predicts. It’s the difference between a filing clerk and a master chess player.
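For readers who want to see the gears turn, here is a minimal sketch of the difference, written in Python. The rulebook and the tiny “training” text below are invented for illustration; a real assistant or LLM does the same kind of next-word prediction at a vastly larger scale.

    from collections import Counter, defaultdict

    # The old way: a rigid "If X, then Y" rulebook.
    rules = {"weather": "Look out the window.", "timer": "Timer set."}

    def rule_based(query):
        return rules.get(query, "I don't understand.")

    # The new way, in miniature: tally which word follows which in a
    # tiny example text, then PREDICT the likeliest next word.
    text = "the lake is cold . the lake is deep . the woods are quiet ."
    words = text.split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

    def predict_next(word):
        return follows[word].most_common(1)[0][0] if word in follows else "?"

    print(rule_based("weather"))  # fixed lookup: "Look out the window."
    print(predict_next("lake"))   # learned pattern: "is"

An LLM is, at heart, this second approach scaled up to hundreds of billions of learned parameters, which is exactly why it needs the industrial computing power described below.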
THE HERMANTOWN PROJECT
The developer of the Hermantown project, without disclosing the site’s ultimate owner, sought changes to zoning and long-established planning documents to allow the construction of what was called “Project Loon.” The 403-acre site, much of it currently undeveloped, wooded, and marshy, is located near Minnesota Power’s Arrowhead substation. The campus is planned to feature four buildings totaling 1.8 million square feet. For a local perspective, this footprint is roughly the size of three U.S. Bank Stadiums, or two-thirds of the Mall of America. But unlike a stadium or a shopping mall, these buildings won’t be filled with people; they will be packed with rows of humming, heat-generating “digital brains.”
The power requirements for a “hyperscale” data center like this are massive compared to those of typical residential or light industrial applications. Google has an agreement with Minnesota Power to develop and use 700 megawatts (MW) of clean energy (300MW wind and 400MW battery storage) to support the campus. The 700MW capacity of this single facility is a staggering 100 times the peak demand of the entire City of Two Harbors. In essence, Google is building a “city within a city.”
This one data center will draw enough power to represent nearly 35% of Minnesota Power’s current generating capacity, which is estimated at 2,000MW.
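Run the numbers, using the article’s own figures: 700MW ÷ 2,000MW = 0.35, or 35 percent of Minnesota Power’s estimated capacity. And since 700MW is described as 100 times the peak demand of Two Harbors, that implies the entire city peaks at about 7MW (a back-calculation, not a published figure).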
Water usage at the Google center has been one of the most debated aspects of the environmental review. Current estimates suggest the campus will use roughly 50,000 gallons of water per day. Google has indicated it intends to use closed-loop air cooling, or “dry cooling,” rather than traditional evaporative cooling towers, so daily water usage will be relatively low for a facility of this size; the water will go primarily to humidity control and domestic needs rather than being “boiled off” for cooling. Even so, that usage is projected to be comparable to that of about 160 homes or a large apartment complex. The City of Hermantown plans a new elevated water tower to ensure the local supply can handle the 10% increase in total city water demand that the project represents.
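The household comparison pencils out if you assume a typical home uses roughly 300 gallons per day, a common planning figure: 50,000 gallons ÷ 300 gallons per home ≈ 165 homes, in line with the 160-home estimate.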
A TRAIL OF DEVELOPMENT
To understand this recent surge in massive data centers, we should look at the mechanical evolution of the computer itself. What began as a tool for counting has become a machine for “thinking,” and that shift requires a massive increase in physical infrastructure.
In the 1970s, computing was centralized. If you worked for a decent-sized manufacturing company, you knew a few things about the large room where the mainframe computers worked their magic. The room was restricted to a handful of workers, it sat on a raised false floor that hid cabling and channeled cool air to the machines, and it used an independent HVAC system to keep them cool. A high-end mainframe in the 1970s had less processing power than a modern smartphone and was designed mainly for tasks such as accounting, payroll, and database management.
In 1965, Intel co-founder Gordon Moore predicted that the number of transistors on a microchip would double approximately every two years while the cost would halve. The exponential growth that followed moved the computer from the locked room in the factory to the desktop PC and, finally, into our pockets as today’s smartphone.
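To feel the force of that doubling, run the arithmetic forward from Intel’s first microprocessor, the 4004 of 1971, which held about 2,300 transistors. This is an idealized sketch; the real curve has slowed in recent years.

    # Moore's Law as simple arithmetic: double the transistor count
    # every two years, starting from the Intel 4004 (1971, ~2,300).
    count = 2_300
    for year in range(1971, 2026, 2):
        print(year, f"{count:,}")
        count *= 2

By this naive doubling, the count passes a billion around 2009 and reaches the hundreds of billions by the mid-2020s, the right order of magnitude for today’s largest AI chips.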
Once the internet connected these millions of small computers, we needed “Cloud” data centers to store our photos and emails. Not as physically massive as AI data centers, these were still largely “filing cabinet” centers—they stored data and returned it to us when we asked for it.
WHY “MASSIVE” DATA CENTERS NOW?
If the “Cloud” is a digital filing cabinet for storing our photos, an AI center is a refinery. It takes raw, unorganized data and “smelts” it into intelligence. But this refining process is resource-intensive; it requires specialized Graphics Processing Units (GPUs) running at a fever pitch, demanding a concentration of power and cooling that the old mainframe rooms of the 1970s could never have imagined.
Training an AI to speak aloud or to detect a tumor in a medical scan requires processing trillions of pages of data. That means thousands of specialized GPUs running nonstop for months, consuming far more electricity and generating far more heat than regular servers. AI centers need dedicated power and intensive cooling, often large amounts of water. The complex math demands that the computers be physically close together, creating massive, concentrated campuses rather than a spread-out network.
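A back-of-the-envelope illustration of the power math, using typical figures rather than project specifics: a single modern AI training GPU draws roughly 700 watts, so 10,000 of them draw about 7MW before you add cooling, networking, and storage. That is the chips alone consuming roughly the implied peak demand of all of Two Harbors, which is why a 700MW interconnection starts to look less outlandish.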
Critics often compare the current AI “Gold Rush” to the dot-com bubble of the late 1990s. But while the dot-com bubble was inflated by speculative IPOs and venture capital, the AI boom is funded by the massive cash reserves of companies like Google, Microsoft, and Amazon. Most dot-com companies had no clear path to profitability; the companies driving AI are already among the world’s most profitable. They are betting upwards of $700 billion this year alone that the AI data center is the new foundation of the global economy.
THE BOTTOM LINE
The debate in Hermantown is a microcosm of a global struggle. While some tout the economic promises of the Google project, a grassroots movement called Stop the Hermantown Data Center has gained significant traction.
The controversy isn’t just about the buildings; it’s about the perceived “erasure” of local transparency. Critics argue that when public officials are barred from discussing a 400-acre project with their constituents, the democratic process isn’t just bypassed—it’s broken.
On one side stands a technological behemoth whose reach spans every continent, with an insatiable appetite for electricity, cooling water, and human attention to power the world’s next great invention. On the other stand local citizens asking a simple, age-old question: At what cost does progress come to our backyard?
For the first time since the Industrial Revolution, we are building factories not to forge steel or mill timber, but to manufacture intelligence itself. The question for the North Shore is whether we are prepared for the heat that comes with it.