The $650 billion AI infrastructure buildout the United States announced for 2026 has a supply chain problem that no amount of GPU orders can solve. It involves copper, steel, and electrical equipment that takes up to five years to deliver — and roughly 60% of global supply comes from China.

Roughly 7 of the 12 gigawatts of U.S. data center capacity announced for 2026 — nearly 60% — have already been canceled or delayed. Only about one-third of announced 2026 capacity is currently under active construction. The reason is not a shortage of ambition or capital. It is a shortage of power transformers.

The Transformer Problem

Power transformers are the unglamorous infrastructure bottleneck of the AI era. Every data center requires large, custom-engineered transformers to step utility voltage down to the levels that server racks can use. These are not commodity items. Lead times that ran 24 to 30 months before 2020 have now stretched to as long as five years for high-capacity units. AI deployment cycles run under 18 months. The math does not work.

The mismatch is structural. Transformer manufacturing is an industrial process that cannot be scaled quickly. It requires specialized steel, engineered copper windings, long-curing insulation processes, and skilled trades that are not easily automated. No hyperscaler can fix this with a software update or a new chip architecture.

U.S. domestic production covers only approximately 20% of the large power transformers the country needs. The remainder is imported — predominantly from China, which controls roughly 60% of global supply. The two largest Chinese suppliers, TBEA and China XD Group, are reportedly booked through at least 2027.

The Tariff Complication

The situation was already constrained before April 2026. The new U.S. tariff regime has made it considerably worse.

Copper, the primary conductor material in transformer windings, now carries a 50% tariff under the April 2026 duties. Unlike semiconductors — which received targeted exemptions intended to protect the chip supply chain — power equipment and its raw material inputs received no such carveout. The tariffs that were designed in part to reduce dependence on Chinese manufacturing are, in the short term, raising the cost of the only supply chain capable of meeting U.S. data center demand.

The irony is precise: America’s effort to decouple from China in strategic industries is temporarily increasing the cost of the Chinese-made equipment that American data centers depend on, while domestic manufacturing capacity remains years from filling the gap.

What Is Actually Getting Built

The scale of the slowdown is significant even by the standards of an industry accustomed to delays. Of the roughly 12 GW of U.S. data center capacity announced for 2026, the canceled and delayed projects skew heavily toward the speculative tier — capacity announced to attract hyperscaler commitments or demonstrate site readiness, but not yet tied to firm construction contracts.

The projects that are actively building tend to be those with committed anchor tenants, existing utility interconnection agreements, and transformer orders placed well in advance. The companies that moved early on infrastructure procurement — placing transformer orders in 2023 and 2024 — are insulated. Those that waited for lease commitments before ordering equipment are now facing years-long queues.

The Strategic Exposure

What the transformer shortage reveals is a specific class of vulnerability that is distinct from the semiconductor debate. Chips are complex, miniaturized, and require specialized fabs to produce. The political and industrial case for domestic chip manufacturing is well-established and has attracted significant federal support through the CHIPS Act.

Large power transformers are not miniaturized. They are heavy industrial products that the United States manufactured domestically for most of the twentieth century. The erosion of that capacity was a function of cost optimization over decades — and it has left the AI infrastructure buildout exposed to a supply chain risk that is simultaneously simpler and harder to fix than the chip shortage.

Simpler, because the technology is not exotic. Harder, because rebuilding industrial manufacturing capacity takes longer than building a fab.

The Inflation Reduction Act has directed some investment toward domestic energy infrastructure, and the transformer shortage has drawn attention from the Department of Energy. But the timeline for meaningful domestic capacity expansion is measured in years — not months — and the data centers that need power interconnections in 2026 and 2027 are running out of room to wait.

What Happens Next

The immediate consequence is that the AI infrastructure spending numbers cited by hyperscalers — Microsoft’s $80 billion, Amazon’s $105 billion, Google’s $75 billion — will not translate into online capacity at the pace the press releases imply. Some of that capital will be deployed. Much of it will be queued against infrastructure that doesn’t yet exist.

The medium-term risk is competitive. China’s AI infrastructure buildout does not face the same transformer bottleneck; domestic supply is integrated into the buildout in ways that U.S. programs cannot replicate quickly. If the U.S. data center pipeline remains constrained through 2027 and 2028, the compute capacity gap between the two countries narrows in ways that benchmark comparisons don’t capture.

Big Tech can keep announcing. The transformers will arrive when they arrive.

Lois Vance

Contributing writer at Clarqo, covering technology, AI, and the digital economy.