WHY DATA CENTER PROJECTS DIE

82 projects. 68 resolved outcomes. The first structured analysis of what kills data center developments and what saves them.

1. THE SIGNAL

Blocked projects average 36.7 on our risk score.

Survived projects average 30.0.

Gap: 6.7 points

That gap is modest. Earlier versions of our model inflated the separation by accidentally encoding outcome into the score. We fixed it. That 6.7-point gap is the honest signal.

But the risk score is not the story. The strongest discriminator in the dataset is not any opposition characteristic. It is whether the developer responded with structural concessions.

Projects do not die from opposition.
They die when opposition meets developer indifference.

2. WHAT KILLS PROJECTS

Four factors distinguish blocked projects from those that survived. All are DC-specific. None are ported from the transmission literature.

Water Consumption

Cited in 51% of blocked projects

Cited in 35% of survived projects

The number one opposition argument in blocked DC projects. No transmission parallel exists. In arid regions, water approaches veto-level concern. Tucson's city council voted unanimously to reject Amazon's Project Blue. Aquifer depletion was the primary vector.

Noise and Residential Proximity

Noise cited in 38% of blocked vs 15% of survived

Not a standalone kill factor. Noise amplifies residential proximity concerns. People cite it last on their list, but it makes the abstract real: 24/7 industrial humming behind a house. The anticipation may matter more than actual decibel levels. Nobody opposing a project has lived next to one yet.

The Scale Mismatch Ratio

Using Census population data, we measured project footprint relative to community size.

Acres per 1K residents   Outcome ratio   Interpretation
20+                      7:1 blocked     Overwhelming community footprint. Nearly always fails.
1 to 20                  Contested       Counter-strategies determine outcome in this range.
Under 1                  8:5 survived    Project is proportional to its host community.

Scale Mismatch: Two Examples

Asheville Village, OH

680 acres. Population: 359. That is 1,894 acres per 1,000 residents. Rejected.

Maricopa / Buckeye, AZ

2,000 acres. County population: 4.5 million. That is 0.44 acres per 1,000 residents. Approved.

The absolute size of the project did not determine the outcome. The ratio to the host community did.

We do not know the exact threshold. The data suggests a gradient, not a bright line. But if your footprint exceeds 20 acres per 1,000 local residents, history says you need significant structural concessions to survive.
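The ratio itself is simple arithmetic on acreage and Census population. A minimal sketch, using the cut points from the table above (the function and label names are ours, and the thresholds are a gradient, not a bright line):

```python
def scale_mismatch_ratio(project_acres: float, population: int) -> float:
    """Project footprint in acres per 1,000 residents of the host community."""
    return project_acres / (population / 1_000)

def classify(ratio: float) -> str:
    # Labels are illustrative; the dataset shows a gradient, not a hard rule.
    if ratio >= 20:
        return "overwhelming"   # historically, nearly always blocked
    if ratio >= 1:
        return "contested"      # counter-strategies decide the outcome here
    return "proportional"       # 8:5 survived in the dataset

# The two worked examples from above:
asheville = scale_mismatch_ratio(680, 359)          # ~1,894 acres per 1K residents
maricopa = scale_mismatch_ratio(2_000, 4_500_000)   # ~0.44 acres per 1K residents
```

Run against the two case studies, Asheville Village lands deep in the "overwhelming" band and Maricopa/Buckeye in the "proportional" band, which is the point: the same classifier would flag a 680-acre project and clear a 2,000-acre one.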

Developer Opacity

Big Tech identity is not a risk factor. Google, Amazon, Microsoft, and OpenAI each have both blocked and approved projects in the database. The risk comes from secrecy, not from the brand on the building.

3. WHAT SAVES PROJECTS

81% vs 28%
81% of survived projects had documented counter-strategies.
28% of blocked projects did.

Community Benefit Agreements in blocked projects: 0

Community Benefit Agreements in survived projects: 3

Saline MI, West Des Moines IA, Henrico/QTS VA

Not all counter-strategies are equal.

WHAT WORKS

  • CBAs with enforceable terms
  • Scale reduction or design concessions
  • Zoning compromise frameworks
  • Genuine, sustained community engagement

WHAT DOESN'T

  • Generic "economic development" talking points
  • Money without process
  • Secrecy followed by late concessions
  • Reactive CBAs after opposition has formed

"Economic development" was cited by blocked and survived projects alike. Everyone says it. It distinguishes nothing. The PACE of Trust roundtable (ACEG/DNV, January 2025) found that communities see unearned benefits as "trinkets, not investments."

"The pitch difference between what works and what doesn't is specificity and real commitments. But the big thing is trust: how the developer has dealt with the community from the first interaction to the signing of a CBA. That doesn't show up in the data."

Stephen Nittler

The PACE of Trust report, developed through a multi-stakeholder roundtable including Invenergy, Earthjustice, IBEW, and the Nature Conservancy, found the same pattern: the speed of project development is commensurate with the level of trust built with affected communities. Process quality matters more than the dollar figure on the benefit package.

Case Study

Saline, Michigan

Risk score: 51 (HIGH). Counter-strategy score: 12. Survived.

$14M community benefit fund. Closed-loop cooling system. Farmland preservation commitments. 2,500 union construction jobs. The developer did not wait for opposition to form. The CBA was part of the initial proposal, not a reaction to resistance.

This is the only project in the HIGH risk tier with a clean survival in our dataset.

4. THE GEOGRAPHIC PICTURE

Documented events. Not model outputs.

Indiana Wave

Nine counties enacted moratoriums or bans between November 2025 and March 2026. The largest coordinated state-level wave in the database. Agricultural counties watched neighbors get developed and mobilized preemptively. Marshall County is pursuing a permanent ban. Clinton County unanimously denied a 600 MW, $10B+ project.

Northeast Tennessee

BrightRidge, a publicly owned utility serving 85,000 customers, imposed the first utility-level moratorium on serving new data center customers in May 2025. City and county moratoriums stacked on top. No zoning workaround exists when the power company will not connect you.

North Carolina

25 cancellations in 2025. That is four times the 2024 rate. Tarboro, Apex, Gates County, Chatham County, Canton.

Virginia Litigation Belt

King George County officials said they are "ready to go to war" against Amazon's $6B proposal. Warrenton has three parallel lawsuits. PW Digital Gateway: QTS and Compass face litigation from the American Battlefield Trust, now before the VA Court of Appeals.

Texas Preemption Dynamic

Hood County voted to reject its own moratorium because the state might not let counties impose them. Not actual preemption, but the threat of it changed local government behavior.

5. THE CROSSOVER

Data center opposition uses the same playbook as transmission opposition. The arguments transfer. The kill mechanisms do not.

What Transfers

What Does Not Transfer

Dimension                Transmission                             Data Centers
Primary kill mechanism   Route modification (88% success rate)    Water, scale mismatch, zoning denial
Counter-strategy         Route modification, corridor selection   CBAs, scale reduction, zoning compromise
Timeline to kill         8+ years                                 Months
Novel attack vectors     None (established playbook)              Water consumption, utility moratoriums

Susskind 2022, Vajjhala 2007, and Scott 2023 analyze generation and transmission opposition. Data Center Watch and trade press cover DC opposition as news. Nobody has connected the two literatures. This project is the first structured attempt to apply opposition risk analysis across both domains.

The transmission model has 38 projects with 11 years of data. The DC model has 82 projects with roughly 2 years. The DC findings are preliminary.

6. WHAT WE DON'T KNOW

1. The Intensity Coding Problem

Our opposition intensity ratings were researcher-assigned and may be influenced by outcome knowledge. We are recoding all projects using an observable-criteria rubric. Results forthcoming.

2. Counter-Strategy Cause vs. Correlation

Survived projects have more counter-strategies. But we cannot prove CBAs cause survival. It could be that well-resourced developers (who happen to survive for other reasons) are also more likely to implement CBAs. The sample is too small for causal inference.

3. The Dark Matter of Opposition

We only track publicized fights. Projects that quietly die before making the news, or projects that succeed with no opposition, are not in the database. This is selection bias toward documented conflict.

4. Process Quality Is Invisible

The most important variable (according to both our data and the PACE of Trust report) is how the developer engages with the community. We can measure structural outputs (CBA exists or does not exist) but not process quality (was the developer respectful, accessible, responsive?). This is an inherent limitation of any database approach.

5. Small N, Big Claims

82 projects, 68 resolved. Any finding from this dataset should be treated as a hypothesis, not a conclusion.

7. METHODOLOGY

Each site-specific project is scored on 11 factors. A separate counter-strategy response score tracks developer behavior. The two scores are intentionally independent: risk measures the environment, response measures the reaction.

Observable criteria are prioritized over subjective ratings. Population-based scale mismatch replaced the original absolute-size metric. Factor weights are hypothesis-based, not empirically calibrated.
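The independence of the two scores can be sketched as two weighted checklists that never read each other. The factor names and weights below are hypothetical stand-ins (the real model uses 11 risk factors with hypothesis-based weights); the point is the design: the risk score never looks at developer behavior or outcome, and the response score never looks at the opposition environment.

```python
# Hypothetical factors and weights, for illustration only.
RISK_FACTORS = {        # the environment: what the project walks into
    "water_stress": 3,
    "residential_proximity": 2,
    "scale_mismatch": 3,
}
RESPONSE_FACTORS = {    # the reaction: what the developer does about it
    "cba_enforceable": 5,
    "scale_reduction": 3,
    "early_engagement": 4,
}

def score(project: dict, weights: dict) -> int:
    """Weighted sum over whichever factors are flagged True for the project."""
    return sum(w for factor, w in weights.items() if project.get(factor))

project = {"water_stress": True, "scale_mismatch": True, "cba_enforceable": True}
risk = score(project, RISK_FACTORS)          # environment score: 6
response = score(project, RESPONSE_FACTORS)  # response score: 5
```

Because neither checklist references the other, a high-risk project with a strong response (like Saline) scores high on both axes instead of having its response quietly launder its risk score, which is exactly the contamination the earlier model version suffered from.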

The database tracks 82 site-specific projects and 51 legislative/regulatory actions (moratoriums, bills, regulations). These are separate categories. A moratorium is not a project.

8. ABOUT THIS RESEARCH

Three years on a $7B HVDC transmission project across four states. Senior Associate, Transmission Public Affairs at Invenergy. $2M+ public affairs budget. Four years as a registered lobbyist in energy and tech. I watched opposition kill projects that had every technical and economic argument in their favor.

This database exists because I wanted to understand why. When the same patterns showed up in data center development, I started tracking those too.

This is not advocacy. It is pattern recognition applied to infrastructure siting. Developers, policymakers, and communities all benefit from the same honest data.

THE DAILY MINE

Daily intelligence on energy infrastructure, data center politics, and the collision of AI demand with grid reality. 7 AM ET.


This analysis is an independent editorial resource compiled from public sources including news reports, court documents, regulatory filings, and public records. It is not exhaustive and does not constitute legal, investment, or professional advice. Data reflects available information as of March 2026.