Watt-Bit Spread: The Term Behind The Economics of the AI Frenzy
The “Watt-Bit Spread”: When AI’s Power Hunger Meets the Electric Grid
Every new era of energy has its metrics. During the natural gas boom of the 2000s, analysts watched the spark spread: the difference between the cost of fuel (an MMBtu of gas) and the value of the electricity it produced ($/MWh). Today, in the age of artificial intelligence, a new metric has emerged: the watt-bit spread. Coined by former Microsoft energy chief Brian Janous, the watt-bit spread captures a “fundamental disconnect between the cost of a watt and the value of the bits created by those watts.” In simple terms, the watt-bit spread is the gap between the cost of electricity and the value of the computing power it enables. And that gap is widening fast.
A Seismic Surge in AI Power Demand
Over the past two weeks, we have explored AI’s insatiable appetite for electricity, from the surge in electricity demand itself to how it might be met with SMRs. Data centers powering AI models are expanding in both size and number.
This growth dwarfs past energy expansions. For perspective, natural gas’s share of US power generation increased from 17% to 40% between 2000 and 2020, a 2.3-fold jump over two decades. By contrast, AI-related load is rising so rapidly that utilities are being inundated with large interconnection requests all at once. “It’s like nothing we’ve ever seen, not since the early 1900s,” said AES energy executive Chris Shelton, noting his company alone has ~8 GW of data center projects in its queue and expects that to double by 2030. Nationwide, a JLL survey found 22 GW of new data centers already planned in the US, with most slated to come online between 2027 and 2030. The bottleneck is securing enough power, fast enough, to feed these digital behemoths.
From Spark Spread to Watt-Bit Spread: What’s The Difference?
So what exactly is the watt-bit spread? Analogous to the spark spread for a gas power plant (the delta between fuel cost and electricity value), the watt-bit spread measures the delta between the price of electricity (the “watt”) and the economic value of the computation (the “bit”) that can be generated from that electricity. In an AI data center, that conversion of watts to bits yields tremendous value. “I don’t know that there’s any energy conversion that creates a greater return than turning an electron [watt] into a bit,” Janous told Latitude Media. A given quantity of electricity can produce AI services or insights worth many times its cost, meaning AI companies’ willingness to pay for power far exceeds the utility’s asking price. In other words, the watt-bit spread is the hefty profit margin created by buying electricity at the prevailing rate (regulated or market-based) and selling AI services at market value.
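The analogy can be made concrete with a toy calculation. The sketch below computes a conventional spark spread alongside a stylized watt-bit spread; all prices, the 7 MMBtu/MWh heat rate, and the compute-revenue figure are illustrative assumptions, not data from this article:

```python
# Toy comparison of the spark spread and a stylized watt-bit spread.
# Every number here is an illustrative assumption, not market data.

def spark_spread(power_price_mwh: float, gas_price_mmbtu: float,
                 heat_rate: float = 7.0) -> float:
    """Spark spread ($/MWh): electricity value minus the fuel cost of
    producing it, given a plant heat rate in MMBtu per MWh."""
    return power_price_mwh - gas_price_mmbtu * heat_rate

def watt_bit_spread(compute_value_mwh: float, power_price_mwh: float) -> float:
    """Watt-bit spread ($/MWh): value of the computation produced per
    MWh of electricity, minus the cost of that MWh."""
    return compute_value_mwh - power_price_mwh

# Hypothetical figures: $50/MWh power, $3.50/MMBtu gas,
# and $500 of AI service value per MWh consumed.
print(spark_spread(50.0, 3.5))       # 50 - 24.5 = 25.5 $/MWh
print(watt_bit_spread(500.0, 50.0))  # 450.0 $/MWh
```

The point of the sketch is the order-of-magnitude contrast: a generator competes for a spread of tens of dollars per MWh, while an AI operator may see hundreds, which is why the willingness to pay diverges so sharply.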
Right now, that spread is very high. The value of an extra megawatt to an AI firm (enabling more GPU clusters, larger model training runs, and faster AI products) greatly outweighs its cost. Every hyperscaler is now chasing more power capacity: more electrons mean more powerful AI models, and a sharper competitive edge in speed.
Time to Power as the New Currency
A crucial insight of the watt-bit spread theory is that timing is everything. In today’s market, a megawatt delivered in 2027 is worth far more than one delivered in 2032. Why? Because in the fast-moving AI race, being able to plug in GPUs now (not years from now) can make or break a company’s advantage.
A simple way to understand this concept is to revisit Finance 101 and the time value of money: a dollar today is worth more than a dollar tomorrow because it can be invested, earn interest, or be deployed immediately. Replace money with energy for compute, and the same principle applies. Energy available now is more valuable than energy later because it can be used to run time-sensitive workloads, capture high-value compute demand, or respond to real-time inference needs (all of which ultimately drive cash flow). Delayed access to energy means missed opportunities in cash flow, performance, and market capture.
As Janous puts it, the value of capacity is highly time-sensitive: power today enables AI services and market capture that a delay of a few years could forfeit.
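The time-value analogy can be sketched numerically. The example below discounts a delayed megawatt exactly as finance discounts a future cash flow; the $1M capacity value and the deliberately high 15% discount rate are invented assumptions meant to reflect the pace of the AI race, not figures from the article:

```python
# Sketch of the "time value of power": discount capacity delivered later,
# just as finance discounts future cash flows. Numbers are hypothetical.

def present_value(value_per_mw: float, years_delayed: float,
                  discount_rate: float = 0.15) -> float:
    """Present value of capacity arriving `years_delayed` from now,
    at a high discount rate reflecting fast-moving AI competition."""
    return value_per_mw / (1 + discount_rate) ** years_delayed

# A megawatt "worth" $1M if energized today vs. the same MW five years out:
now = present_value(1_000_000, 0)
later = present_value(1_000_000, 5)
print(f"{later / now:.2f}")  # ~0.50: the delayed MW is worth about half
```

Under these assumptions a 2032 megawatt is worth roughly half a 2027 one, before even counting the market share a competitor captures in the interim.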
This highlights a stark mismatch in perspective between the tech world and the utility world. AI companies operate on a “theory of constraints” model: they will invest aggressively to eliminate bottlenecks and maximize throughput, even if it means holding excess capacity. Currently, electricity is the constraint, so they are willing to pay a premium to get it sooner. Utilities, by contrast, traditionally follow a “lean,” just-in-time model, minimizing unused capacity and expanding slowly and cost-efficiently under strict regulatory oversight. There is no built-in reward for rushing infrastructure to market ahead of schedule. This clash between AI’s time-sensitive demand and the grid’s slower cadence is a core driver of the watt-bit spread imbalance.
Consequences of a Growing Spread: Tension at the Grid Edge
The watt-bit spread’s high-value signal has set off an arms race for electrons, and with it a host of consequences, both intended and unintended:
Building Frenzy (and Uncertainty): Hyperscalers (such as Google, Microsoft, and Meta) and cloud providers are racing to develop new data centers, but face a classic dilemma: How much is too much? On paper, if you sum up all announced projects, the global industry could be spending on the order of $600 billion in this build-out wave. This is what Sequoia Capital’s David Cahn dubbed “AI’s $600B question” – will the AI revenue materialize to justify these investments, or are we in a hype-driven overshoot? Some industry analysts see a growing “hole” between CapEx and actual AI revenue – estimated at approximately $125B per year and widening to $500B – which implies that many data center assets could be underutilized if demand doesn’t meet lofty expectations. In the near term, however, every major cloud player appears convinced that not building enough would be the greater mistake (a “land grab” mentality). This has even led to new entrants and speculative developers queuing up projects – contributing to bloated interconnection queues at utilities, many of which are aware that not all of these projects will be built.
Grid Constraints and Delays: Power companies, for their part, are sounding alarms that the traditional pace of grid expansion can’t keep up. “The biggest bottleneck is the interconnection process,” explains AES’s Shelton. Simply put, connecting a large new load or power plant to the grid requires studies, upgrades, and often new transmission lines or substations – processes that currently take years to complete. Regional queues for new generation are backlogged, with projects waiting 1-4+ years for permission to connect. AI data centers, paradoxically, require both ultra-reliable power and clean energy, which often means new generation sources (such as solar farms or wind-plus-storage) dedicated to them. That only adds to the delay. EPRI (Electric Power Research Institute) warns that 1-2 year connection lead times and demands for carbon-free power are already creating “local and regional electric supply challenges” in hotspots. Some utilities are raising the spectre of potential blackouts or reliability risks if large loads come online faster than the infrastructure can be built.
Bridging the Gap: How to Align “Watts” and “Bits”
Is there a way to resolve this tension – to supply AI’s voracious power needs without blowing up the grid or unfairly burdening others? Experts are coalescing around a multi-pronged approach:
Grid-Enhancing Technologies and Flexible Solutions: Not every solution involves pouring concrete for new power plants. Many experts stress that we should first squeeze more capacity out of existing infrastructure – especially given that data centers, while large consumers, don’t draw their maximum capacity 24/7. “It’s often a capacity constraint, not an energy constraint,” notes Peter Freed, former director of strategy at Meta. The peaks tend to occur on the hottest afternoons or cold snaps when both data centers and other loads are high simultaneously. To manage these spikes, we can deploy Grid-Enhancing Technologies (GETs), such as dynamic line rating sensors that allow transmission lines to carry more power when conditions permit, or modular FACTS devices that route power efficiently. Even strategically placed batteries on the grid can act as “shock absorbers” – supplying bursts of power to cover peak periods or new loads until longer-term upgrades catch up. “This is where grid-enhancing technology shines,” says Freed, pointing to simple fixes like reconductoring lines with higher-capacity wires before defaulting to building brand new lines. The US Department of Energy is heavily promoting virtual power plants (VPPs) and demand flexibility for precisely this reason.
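The battery “shock absorber” idea above can be illustrated with a toy peak-shaving routine. The hourly load profile and the interconnection limit below are invented for illustration; a real dispatch model would also track the battery’s state of charge and recharge windows:

```python
# Toy peak-shaving sketch: a grid-side battery clips data center peaks
# to an interconnection limit. Load profile and limit are invented.

def shave_peaks(load_mw, limit_mw):
    """Return (grid_draw, battery_discharge) per interval, discharging
    the battery whenever load exceeds the interconnection limit."""
    grid, battery = [], []
    for load in load_mw:
        discharge = max(0.0, load - limit_mw)  # battery covers the excess
        battery.append(discharge)
        grid.append(load - discharge)          # grid never exceeds limit
    return grid, battery

# Hourly load (MW) on a hypothetical hot afternoon; 90 MW grid limit.
load = [70.0, 80.0, 95.0, 100.0, 92.0, 85.0]
grid, battery = shave_peaks(load, 90.0)
print(grid)     # [70.0, 80.0, 90.0, 90.0, 90.0, 85.0]
print(battery)  # [0.0, 0.0, 5.0, 10.0, 2.0, 0.0]
```

Note how little battery energy is actually needed (17 MWh across the afternoon in this sketch) to keep the grid draw within the limit, which is the sense in which the constraint is capacity, not energy.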
Massive Investment in Clean Energy: Ultimately, meeting AI’s electricity demand sustainably will require significantly more carbon-free generation – and this needs to happen quickly. The window is tight: as Freed notes, even optimistic timelines for scaling up advanced nuclear, geothermal, and long-duration storage put meaningful capacity online only in the early 2030s. In the meantime, Big Tech is directly financing new clean energy projects at an unprecedented scale. Tech firms accounted for over half of US corporate renewable procurement in recent years. A prime example is Google’s $3 billion framework agreement with Brookfield Renewable, announced in July 2025, to secure up to 3,000 MW of dispatchable hydropower capacity across North America. The deal is structured to match Google’s data center demand with 24/7 carbon-free energy on a regional basis, starting in Quebec, where a portion of this power is already online and delivering electricity. Unlike traditional procurement, this agreement ties hourly electricity usage to clean, local generation, setting a new standard for AI infrastructure. By keeping the assets grid-facing rather than behind-the-meter, these types of deals support data center reliability, broader grid stability, and community benefit.
Policy and Regulatory Evolution: None of this, however, happens without effective policy. Regulators are now re-evaluating decades-old rules in light of the AI surge. Many utility commissions are asking: do we have the tools (like integrated resource planning that accounts for large new loads, or flexible load interconnection rules) to handle this? Some are considering special contract structures or “energy-as-a-service” models, where a utility might partner with a data center developer to build needed infrastructure with a guaranteed offtake. There’s also an emphasis on forecasting realism – scrutinizing whether all the AI megawatts in the queue will truly materialize, so we neither underbuild nor massively overbuild. States like Virginia are exploring new business models (one co-op proposed a separate tariff class for large data centers), and we may see requirements for higher financial collateral from speculators holding queue positions. The regulatory compact is clear that existing customers should be protected – but also that beneficial load growth (like data centers) can reduce rates for everyone if done right. The trick is ensuring the newcomers pay their fair share for reliability and upgrades, while also leveraging their presence to modernize the grid for the future.
The Bigger Picture: Preparing for a High-Load Future
Beneath the turbulence, there’s a silver lining: the AI data center boom is forcing us to confront challenges the grid would have faced eventually. Even aside from AI, electrification of transportation, heating, and industrial processes is driving load growth after decades of stagnation. Data centers may be the “first mover” of a new era of high electricity demand, stress-testing our permitting processes, infrastructure, and regulatory frameworks. “Data centers are part of a larger load growth picture,” notes Freed, “but even under the most aggressive scenarios, data centers are a low double-digit percentage of the whole.” In other words, they alone won’t break the grid – but they highlight where the cracks are.
If we rise to the occasion, today’s investments and innovations to handle AI loads could pave the way for a more robust, flexible grid for all. As one Rocky Mountain Institute analysis argued, data centers could actually help utilities justify grid upgrades that will be needed for EVs and electrification broadly. The watt-bit spread has made electricity suddenly “cool” (or existential) in boardrooms that previously ignored it, turbocharging corporate focus on energy procurement, innovation, and sustainability. If we capture some of the economic value that AI is willing to pay for power, we can use it to build a cleaner, stronger grid that benefits everyone.
Will the watt-bit spread stay sky-high forever? Unlikely. Market equilibrium will eventually reassert itself, either through massive supply build-out by the 2030s or a tapering of demand growth (or both). AI compute efficiency is already improving (e.g. Nvidia’s next-gen chips promise 2.5x the performance for only 25% more power), and top AI engineers are now intensely focused on optimization rather than just scale. “I think we will blow algorithmic efficiency through the roof,” predicts Freed, suggesting AI’s energy curve may bend down later in the decade rather than grow exponentially forever. If and when the watt-bit spread narrows, i.e. power is no longer the scarce linchpin, the industry will shift its focus to the next constraint.
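Worked through, the cited chip numbers imply a doubling of performance per watt, i.e. a halving of the energy needed per unit of compute:

```python
# The cited efficiency claim, worked through: 2.5x performance at
# 25% more power means performance per watt doubles.

perf_gain = 2.5    # relative throughput of the next-gen chip
power_gain = 1.25  # relative power draw (25% more)

perf_per_watt = perf_gain / power_gain      # 2.0x
energy_per_unit_compute = 1 / perf_per_watt  # 0.5x, i.e. halved

print(perf_per_watt)            # 2.0
print(energy_per_unit_compute)  # 0.5
```

Compounding a few such generations is exactly how the demand curve could bend down even as AI usage keeps growing.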
But in the here and now, the watt-bit spread is a useful lens for understanding why data centers are scrambling for power, why utilities are overwhelmed, and what’s driving each side’s behaviour. It’s a reminder that energy and information are deeply intertwined. To keep the AI revolution going, we must align the economics of watts and bits. That means rethinking how we value electricity: not as a cheap commodity to be taken for granted, but as a strategic resource worthy of investment, innovation, and yes, higher prices where it delivers higher value. The AI era, for all its challenges, offers an opportunity to modernize the grid faster than any clean-energy advocate could have hoped. The genie (or “bottle,” as one regulator quipped) is well and truly out – and our collective wish should be to harness this moment to build an electricity system that can power whatever the future holds, AI-driven or otherwise, reliably and sustainably for all.