Policy & Regulation

The Policy Backlash Against AI Data Centers Has Arrived


Jared Clark

April 03, 2026


For years, the buildout of AI infrastructure has moved faster than the policy frameworks designed to govern it. Hyperscale data centers have materialized in rural counties, desert communities, and river corridors with relatively little public scrutiny — backed by federal permitting relief, generous tax incentives, and the broadly accepted narrative that AI infrastructure equals economic progress.

That narrative is now being challenged in a serious, organized way.

The AI Now Institute recently published its North Star Data Center Policy Toolkit, a comprehensive guide designed to equip state and local governments with concrete legislative and regulatory tools to slow, restrict, or condition the expansion of AI data centers across the United States. The toolkit is not a moderate reform document. It is an explicit challenge to the assumption that data center development is inherently good for communities — and it arrives at a moment when the political and environmental conditions are ripe for that challenge to gain real traction.

Anyone building, operating, funding, or depending on AI infrastructure needs to understand what this toolkit says, why it matters, and where it is likely to go.


What the North Star Toolkit Actually Proposes

The AI Now Institute frames its toolkit around a central premise: that hyperscale data centers — the kind that power large language models, cloud AI services, and frontier model training — maintain an extractive relationship with the communities that host them. In their framing, these facilities deplete scarce natural resources (particularly water and land), pollute local environments, drive up energy costs, increase reliance on fossil fuels, and deliver far fewer local jobs than their footprint would suggest.

The toolkit, available at ainowinstitute.org/publications/data-center-policy-guide, provides state and local policymakers with a layered menu of interventions, including:

  • Zoning and land-use restrictions that would require data centers to clear higher bars before locating in residential, agricultural, or environmentally sensitive areas
  • Environmental impact assessment mandates requiring disclosure and mitigation of water consumption, heat output, noise pollution, and grid stress
  • Energy and utility reform measures designed to prevent data centers from receiving preferential rate treatment that effectively subsidizes their energy use at ratepayer expense
  • Water usage disclosure and permitting requirements that would force facilities to account for consumptive water use in drought-prone regions
  • Moratorium provisions that would allow communities to pause new development pending assessment of cumulative environmental impacts
  • Community benefit agreements and revenue-sharing frameworks that link data center approval to measurable local investment

The document is deliberately written for practitioners — city council members, state legislators, county planning commissions — not academic audiences. That makes it operationally significant.


Why This Moment, and Why It Matters

The timing of this toolkit is not accidental. It arrives against a backdrop of rapidly escalating AI infrastructure investment that is straining energy grids and water supplies in ways that are increasingly visible at the community level.

The numbers tell a stark story. Data centers currently account for approximately 1–2% of global electricity consumption, but that figure is projected to rise dramatically. Goldman Sachs Research estimated in 2024 that data center power demand in the United States will grow by 160% by 2030. The International Energy Agency projected in its 2024 Electricity report that data centers globally could consume more than 1,000 terawatt-hours of electricity annually by 2026 — roughly equivalent to Japan's entire national electricity consumption.
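As a quick sanity check on what the Goldman Sachs figure implies, note that "grow by 160%" means reaching 2.6 times the baseline, not 1.6 times. A minimal sketch (the baseline value below is arbitrary, chosen only to illustrate the arithmetic):

```python
def projected_demand(baseline_twh: float, growth_pct: float) -> float:
    """Apply a cumulative percentage increase to a baseline demand figure.

    A growth_pct of 160 means the result is 2.6x the baseline.
    """
    return baseline_twh * (100 + growth_pct) / 100

# Illustrative only: an arbitrary 100 TWh baseline growing by 160%
# ends at 260 TWh, i.e. 2.6x the starting point.
print(projected_demand(100.0, 160))  # prints 260.0
```

The distinction matters when reading headline projections: a 160% increase more than doubles demand, which is why grid planners treat these forecasts as structurally different from ordinary load growth.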

Water consumption is equally significant. A single hyperscale data center can consume millions of gallons of water per day for cooling — a figure that carries acute consequences in Western states already experiencing multi-decade drought conditions.

Meanwhile, the economic promise often used to justify these facilities frequently falls short of expectations. Many modern hyperscale data centers are highly automated, employing only 50 to 200 full-time workers per facility despite consuming resources on the scale of a small city. That gap between resource extraction and local job creation is at the core of the AI Now Institute's "extractive relationship" framing — and it resonates politically across the ideological spectrum.


A Comparison of Policy Approaches: Reform vs. Restriction

The North Star toolkit positions itself toward the more restrictive end of a policy spectrum that communities and states are actively navigating. It is worth mapping that spectrum clearly.

| Policy Approach | Mechanism | Primary Goal | Industry Impact |
| --- | --- | --- | --- |
| Tax incentive reform | Eliminate or cap data center tax breaks | Reduce public subsidy of private infrastructure | Moderate — raises cost of expansion |
| Environmental disclosure mandates | Require water, energy, and emissions reporting | Transparency and accountability | Low-to-moderate — operational compliance cost |
| Zoning and siting restrictions | Restrict location near water sources, schools, homes | Protect community health | High in specific geographies |
| Rate structure reform | End preferential utility pricing for data centers | Protect ratepayers from cost-shifting | High — directly affects operating economics |
| Moratoriums | Pause new approvals pending impact assessment | Community protection and deliberation | Very high — halts development pipeline |
| Community benefit agreements | Require investment commitments as approval condition | Local economic equity | Moderate — adds cost and negotiation |

What distinguishes the North Star toolkit is that it does not advocate for one tool — it advocates for deploying multiple tools simultaneously, layering restrictions in ways that make large-scale data center development materially harder and more expensive in participating jurisdictions.


Where Policy Action Is Already Happening

The toolkit is not operating in a vacuum. Across the United States, a wave of state and local actions has already begun to reshape the data center landscape:

  • Virginia, which hosts the world's largest concentration of data centers in Northern Virginia's "Data Center Alley," has seen growing legislative debate over grid reliability, ratepayer costs, and community impact disclosure requirements.
  • Texas legislators have introduced bills targeting the grid impact of cryptocurrency and AI computing facilities following stress events on the ERCOT grid.
  • Arizona and Nevada, both major data center markets with acute water scarcity, have seen local governments impose or consider water-use restrictions that specifically affect data center cooling operations.
  • Georgia has seen community organizing against data center expansion in areas near Atlanta, focused on environmental justice concerns about noise, pollution, and land use.

These are not fringe movements. They are local democratic responses to infrastructure development that communities feel has been imposed on them rather than negotiated with them.


The Deeper Argument the Toolkit Is Making

Beyond the specific policy tools, the North Star document makes a broader ideological argument that deserves serious engagement: that the default assumption of AI infrastructure development — that it is a public good that deserves public support and streamlined approval — should be replaced with a more skeptical default that requires developers to demonstrate community benefit.

This is a meaningful shift in framing. For most of the past decade, the dominant policy conversation around AI and technology infrastructure has been about enabling development: faster permitting, more grid access, expanded tax incentives, federal land grants. The North Star toolkit represents an organized attempt to introduce friction into that process — deliberate, democratically grounded friction in the form of disclosure, assessment, negotiation, and in some cases prohibition.

Whether one agrees with the toolkit's conclusions or not, its intellectual seriousness demands engagement. The AI Now Institute is not a fringe organization. It is one of the most influential AI policy research bodies in the world, with deep connections to policymakers, civil society organizations, and academic institutions. When it publishes a practitioner toolkit of this scope, it intends to see it used.


What This Means for Organizations Building on AI Infrastructure

For leaders of organizations that rely on AI services, build AI-powered products, or make infrastructure investment decisions, the North Star toolkit signals something important: the regulatory environment for AI infrastructure is entering a new phase.

This doesn't mean data centers are going away. Demand is too strong and the economic incentives too powerful for any set of state and local policies to stop hyperscale buildout at the national level. But it does mean several things are likely to shift:

Geographic concentration risk increases. If a meaningful number of jurisdictions adopt elements of the North Star framework, data center development will be pushed toward more permissive geographies — concentrating infrastructure in areas with weaker regulatory protections and potentially greater environmental vulnerability.

Development timelines will lengthen. Even where moratoria or comprehensive impact assessments don't result in denial, they add time and cost to the development pipeline. For AI model training and inference infrastructure that operates on tight timelines, that has real operational consequences.

Operating costs will rise in some markets. Rate structure reform — one of the toolkit's most consequential proposals — could meaningfully increase the energy cost of data center operations in states that adopt it. That cost will flow through to cloud pricing, AI service costs, and ultimately enterprise AI budgets.

Community relations become a strategic variable. Organizations that have treated data center siting as a purely technical and financial exercise will need to develop genuine community engagement capabilities. The political conditions that make the North Star toolkit viable are the same conditions that make community opposition a material development risk.


The Questions the Toolkit Leaves Open

The North Star framework is clear about what it opposes. It is somewhat less developed on questions that will be critical to its real-world implementation:

How do we meet growing AI demand while restricting its infrastructure? The toolkit does not offer a strong answer to where the compute for AI systems should come from if not large-scale data centers. More efficient architectures, nuclear energy, and distributed computing are sometimes gestured toward, but the demand-supply tension is real.

Who bears the transition costs? If restrictive policies in some jurisdictions raise operating costs and drive investment elsewhere, the communities that adopt those policies may forgo tax revenue and economic activity without necessarily improving their environmental outcomes if less regulated markets pick up the slack.

What is the governance mechanism for genuinely regional or national infrastructure? Local control is a powerful democratic value, but AI infrastructure is national and global in its effects. A county zoning board is not well-positioned to weigh the tradeoffs between local water use and national AI capacity — and the toolkit does not fully resolve how that tension should be managed.

These are not arguments against taking the toolkit seriously. They are reasons why the policy conversation it is opening will need to be more complex than either unfettered development or comprehensive restriction.


A Transition Point Worth Watching Closely

The North Star Data Center Policy Toolkit is best understood not as the final word on AI infrastructure governance, but as a signal that a genuine policy transition is underway. The period of largely uncontested AI infrastructure expansion is ending. What replaces it — a negotiated framework that balances development with democratic accountability, or an increasingly contested landscape of local restrictions — depends heavily on whether the AI industry engages seriously with the concerns the toolkit raises.

The most important thing to understand is this: the communities and policymakers reaching for the North Star toolkit are not uniformly opposed to technology or economic development. Many of them are responding to real environmental and economic grievances that the industry has not adequately addressed. The toolkit gives those grievances institutional form and legislative language.

That is a qualitatively different challenge than lobbying against a federal regulation. It is a challenge that plays out in hundreds of county commissions, state legislatures, and utility boards simultaneously — and it will not be resolved by a single policy response or a single industry coalition.

If you want to understand where the governance of AI infrastructure is heading, the North Star toolkit is essential reading. You can access the full document at ainowinstitute.org/publications/data-center-policy-guide.

For broader context on how AI is reshaping institutional and civic life, explore more analysis at prepareforai.org.


Last updated: 2026-04-03

Jared Clark is the Founder of Prepare for AI, a thought leadership platform exploring how AI transforms institutions, work, and society.
