The AI boom has a power problem. It is not a minor operational inconvenience. It is a structural constraint on the pace of AI development that threatens to become the primary bottleneck of the entire industry within five years. Understanding this constraint — its scale, its causes, and the technical approaches that could solve it — is essential context for any serious investor in deep technology in 2024 and beyond. It is also the lens through which Estes Capital views its investments in next-generation energy infrastructure.
This essay makes three arguments. First, the power demand from AI training and inference is growing at a rate that the existing electricity infrastructure cannot support, and the timeline for resolution of this constraint is longer than the mainstream technology press acknowledges. Second, the solutions to this constraint are not primarily in the hands of the existing electricity grid or the incumbent utilities — they are in the hands of companies developing fundamentally new approaches to power generation and clean energy supply. Third, the investment opportunity created by this convergence is one of the most compelling in deep technology today, and the companies that capture it will be among the most important infrastructure businesses of the next twenty years.
The Scale of AI's Energy Appetite
The numbers that describe AI's energy consumption are genuinely extraordinary, and they are growing faster than most people appreciate. Training a single large language model at GPT-4's scale consumed an estimated 50 gigawatt-hours of electricity, roughly the annual electricity consumption of 4,500 average US households, compressed into a period of weeks. That figure represents the training cost for a single version of a single model. The major AI labs run dozens of training runs in parallel, and the models being trained in 2024 are substantially larger than GPT-4.
Goldman Sachs published research in mid-2024 estimating that data center electricity consumption would grow from approximately 200 terawatt-hours per year in 2023 to over 1,000 terawatt-hours per year by 2030, a fivefold increase in seven years driven primarily by AI workloads. To put that in context, Germany's current annual electricity consumption is approximately 490 terawatt-hours. The projected growth in data center power demand alone is therefore equivalent to adding more than one and a half Germanys to global electricity consumption over seven years.
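Both comparisons reduce to unit arithmetic. A quick sanity check, using the figures above plus an assumed average US household consumption of about 10,700 kWh per year (that household figure is an assumption, not from this essay):

```python
# Sanity-check the two scale comparisons above.

# 1) A GPT-4-scale training run expressed in household-years of electricity.
TRAINING_GWH = 50
HOUSEHOLD_KWH_PER_YEAR = 10_700  # assumed US average
household_years = TRAINING_GWH * 1e6 / HOUSEHOLD_KWH_PER_YEAR  # ~4,700

# 2) Projected data center demand growth against Germany's consumption.
DC_2023_TWH, DC_2030_TWH = 200, 1_000
GERMANY_TWH = 490
growth_multiple = DC_2030_TWH / DC_2023_TWH                       # 5x
germany_equivalents = (DC_2030_TWH - DC_2023_TWH) / GERMANY_TWH   # ~1.6

print(f"{household_years:,.0f} household-years; "
      f"{growth_multiple:.0f}x growth, ~{germany_equivalents:.1f} Germanys of new demand")
```

The second calculation is why the increment is better described as one and a half Germanys than one: the comparison applies to the 800 TWh of new demand, not the 2030 total.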
The inference workload, the electricity consumed by running AI models to serve user queries rather than by training them, is growing even faster. ChatGPT was estimated to handle well over 10 million queries per day in 2024, each of which requires far more computation, and roughly an order of magnitude more energy, than a traditional Google search. As AI-powered applications proliferate across healthcare, legal, financial services, software development, and customer service, inference compute will scale with the number of active users and the frequency of their interactions, a far larger growth curve than training alone.
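The inference growth curve is multiplicative in adoption. A toy model makes the point; the per-query figures are assumptions drawn from commonly cited third-party estimates (roughly 3 Wh for an LLM query versus 0.3 Wh for a web search), not from this essay:

```python
# Toy model: inference electricity scales as users x queries x energy/query.

WH_PER_LLM_QUERY = 3.0    # assumed, per third-party estimates
WH_PER_WEB_SEARCH = 0.3   # assumed

per_query_ratio = WH_PER_LLM_QUERY / WH_PER_WEB_SEARCH  # ~10x per query

def annual_inference_gwh(users: int, queries_per_user_per_day: float) -> float:
    """Annual inference electricity in GWh under the assumed per-query cost."""
    wh_per_year = users * queries_per_user_per_day * WH_PER_LLM_QUERY * 365
    return wh_per_year / 1e9  # Wh -> GWh

# Holding per-query energy fixed, demand scales linearly with adoption:
small = annual_inference_gwh(10_000_000, 5)      # ~55 GWh/yr
mass = annual_inference_gwh(1_000_000_000, 5)    # ~5,500 GWh/yr
print(f"{small:,.0f} GWh/yr -> {mass:,.0f} GWh/yr at 100x users")
```

The point of the sketch is the shape, not the absolute numbers: unlike training, inference demand compounds with every new user and every new AI-powered workflow.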
The physical manifestation of this demand growth is visible in real estate and utility markets. Hyperscaler data center campus projects in northern Virginia, Phoenix, and the Midwest have encountered electricity interconnection queues that extend 5–7 years. Dominion Energy, the primary utility serving the "Data Center Alley" corridor in northern Virginia, has publicly stated that it cannot connect new large loads in its service territory without new transmission infrastructure investment. Microsoft, Google, and Amazon have all disclosed that energy availability, rather than land, capital, or hardware supply, is the primary constraint on their data center expansion plans.
"The electricity grid was not designed for the data center load profile. Conventional power procurement — offtake agreements, utility connections, grid-tied generation — cannot scale fast enough to match the AI industry's growth rate. The companies that solve the power problem are not solving an operational issue; they are solving the infrastructure bottleneck of the AI economy."
Why the Existing Grid Cannot Simply Scale
The instinctive response to a supply constraint is to suggest that the existing infrastructure should simply build more capacity. In the case of electricity infrastructure, this response fundamentally misunderstands the nature of the constraint. The barriers to rapid grid expansion are not primarily financial — they are regulatory, physical, and temporal.
Transmission infrastructure — the high-voltage lines that move electricity from where it is generated to where it is consumed — has a lead time of 10–15 years from planning to operation, including permitting, environmental review, and construction. The United States has approximately 700,000 miles of transmission lines, the majority of which were designed and built before the digital economy existed. Upgrading this infrastructure to handle the concentrated, continuous loads that large data centers require is not a matter of incremental investment — it requires fundamental redesign of regional grid architecture.
Solar and wind generation, the primary additions to US grid capacity over the last decade, are intermittent by nature. Data center loads are constant. A hyperscaler data center running at 100 megawatts cannot modulate its power consumption based on whether the sun is shining or the wind is blowing — the AI training runs and inference workloads do not pause for weather. Connecting large AI compute facilities to renewable generation therefore requires either substantial battery storage infrastructure (which is expensive and has its own supply chain constraints) or dispatchable backup generation (which typically means natural gas, with the associated carbon emissions).
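The storage burden implied by the paragraph above can be made concrete with a rough sizing sketch: firming a solar-only supply against a constant 100 MW load. All figures are illustrative assumptions (a 25% solar capacity factor, a 16-hour daily non-producing stretch), and the sketch ignores losses, multi-day weather streaks, and reserve margins:

```python
# Rough sizing of solar + storage to serve a constant data center load.
# Illustrative assumptions only; real firming studies model weather series.

LOAD_MW = 100                  # constant data center load
SOLAR_CAPACITY_FACTOR = 0.25   # assumed annual average for a good site
NON_SOLAR_HOURS = 16           # assumed hours/day with ~zero solar output

# Nameplate solar needed so *average* output matches the constant load.
solar_nameplate_mw = LOAD_MW / SOLAR_CAPACITY_FACTOR  # 400 MW

# Storage needed to ride through one non-producing stretch.
storage_mwh = LOAD_MW * NON_SOLAR_HOURS               # 1,600 MWh

print(f"{solar_nameplate_mw:.0f} MW of solar and {storage_mwh:.0f} MWh "
      f"of storage per {LOAD_MW} MW of constant load")
```

Even under these generous assumptions, a single 100 MW facility implies a 4x-overbuilt solar plant plus a battery larger than most grid-scale storage projects operating today, which is why dispatchable backup generation so often fills the gap instead.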
The net result is that the growth trajectory of AI compute — which is exponential — is structurally incompatible with the growth trajectory of conventional electricity infrastructure — which is constrained by regulatory, physical, and financial factors to low single-digit percentage increases per year. Something has to give. Either the AI industry slows down, which no one expects or wants, or fundamentally new approaches to power generation emerge that are not subject to the same constraints as conventional grid infrastructure.
Electric Hydrogen and the Green Hydrogen Manufacturing Opportunity
Electric Hydrogen, founded in 2020 by David Eaglesham and Raffi Garabedian, represents one of the most interesting infrastructure-level approaches to the clean energy challenge that AI's power demand makes urgent. The company is not building technology for AI data centers directly; it is building electrolyzer technology for industrial-scale clean hydrogen production. The link to the AI energy problem runs through the broader challenge of industrial decarbonisation, which surging electricity demand has made commercially urgent.
The thesis is straightforward: the global hydrogen economy currently produces approximately 70 million tonnes of hydrogen per year, virtually all of it via steam methane reforming of natural gas — a process that emits approximately 830 million tonnes of CO₂ per year, roughly equivalent to the entire annual emissions of Germany. This hydrogen is used primarily for ammonia production (fertiliser), petroleum refining, and methanol synthesis. Replacing grey hydrogen with green hydrogen — produced via electrolysis powered by renewable electricity — requires electrolyzer technology that is dramatically cheaper and more reliable than current commercial offerings.
Electric Hydrogen has raised roughly $600 million to date, including a $380 million Series C in 2023, to build electrolyzers that operate at 95% current efficiency, a record for the technology, with a design philosophy that prioritises manufacturing scalability over bespoke engineering. The company is targeting a levelised cost of green hydrogen of under $1 per kilogram by 2030, compared to current green hydrogen production costs of $3–6 per kilogram. If that target is achieved, green hydrogen becomes cost-competitive with grey hydrogen at current natural gas prices, and the economic case for decarbonising the existing industrial hydrogen market becomes straightforward.
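The $1 per kilogram target can be bounded from below by electricity cost alone. The sketch below assumes roughly 50 kWh of electricity per kilogram of hydrogen at typical system-level electrolyzer efficiencies; that consumption figure is an industry rule of thumb, not Electric Hydrogen's published specification:

```python
# Electricity-cost floor for green hydrogen, ignoring capex, stack
# replacement, and balance-of-plant costs. Assumed consumption figure.

KWH_PER_KG_H2 = 50  # assumed system-level electrolyzer consumption

def electricity_cost_per_kg(power_price_usd_per_kwh: float) -> float:
    """Electricity contribution to levelised cost, in $/kg of hydrogen."""
    return KWH_PER_KG_H2 * power_price_usd_per_kwh

for price in (0.05, 0.03, 0.02, 0.01):
    print(f"${price:.2f}/kWh -> ${electricity_cost_per_kg(price):.2f}/kg")
```

At $0.02/kWh the electricity alone costs $1.00/kg, which is why sub-$1 hydrogen requires both very cheap renewable power and an electrolyzer cheap and reliable enough that everything other than electricity adds almost nothing, the manufacturing-scalability problem Electric Hydrogen is attacking.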
The connection to AI energy demand is indirect but real. As AI data centers push electricity demand upward, the imperative to decarbonise electricity generation intensifies, both for corporate sustainability commitments and increasingly for regulatory reasons. Clean hydrogen, produced from excess renewable electricity during periods of low grid demand and stored for later use, is one of the most viable pathways for time-shifting renewable energy supply to meet constant industrial loads. Electric Hydrogen's technology sits at the intersection of the renewable energy buildout and this industrial decarbonisation imperative.
Oklo and the Nuclear Option for AI-Adjacent Power
No discussion of AI's energy challenge would be complete without addressing nuclear power — specifically, the emerging category of advanced fission reactors that are designed from the ground up to address the operational and economic characteristics that have made conventional nuclear power commercially uncompetitive with natural gas and renewables over the last twenty years.
Oklo, founded in 2013 by Jacob DeWitte and Caroline Cochran and backed by Sam Altman (who serves as executive chairman), went public via SPAC in 2024. The company is developing the Aurora, a compact liquid metal fast reactor designed to operate continuously for years without refuelling, to be deployed in modular configurations at the 1–10 megawatt scale, and to use nuclear waste from legacy reactors as fuel. This fuel strategy, consuming waste that currently costs the US federal government hundreds of millions of dollars per year to store, inverts the conventional economics of nuclear power: instead of requiring expensive enriched uranium, Oklo's reactor designs are intended to run on material that currently has negative value.
The strategic logic of Oklo's technology for AI data center power is compelling. A data center requires constant, high-density power delivered at the site itself, not transmitted from a distant renewable generation facility across a constrained grid. Small modular reactors, deployed at or near data center campuses, could provide exactly this: 24/7 carbon-free power at the load, without dependence on grid infrastructure or weather patterns.
Sam Altman's involvement in Oklo is not coincidental. OpenAI's compute infrastructure is one of the most energy-intensive operations in the technology industry, and Altman has been explicit in public statements that nuclear energy is the only visible technology that could realistically scale to meet AI's long-term power demand without either massive grid infrastructure investment or continued dependence on fossil generation. Microsoft reached a similar conclusion, signing a 20-year power purchase agreement with Constellation Energy in 2024 to restart a unit of Three Mile Island specifically to power its data center operations, the first nuclear plant restart driven explicitly by AI compute demand.
Oklo's path to commercial deployment has required navigating a regulatory environment — the Nuclear Regulatory Commission — that has not approved a new reactor design in decades. The company received a combined licence application rejection from the NRC in 2022, citing insufficient safety analysis, and has been working through a resubmission process since. This regulatory complexity is a barrier to entry, not just a risk — companies with the patience and expertise to navigate it will have a durable competitive advantage over future entrants. Oklo's public market status gives it access to capital at a scale that makes the extended regulatory timeline manageable, while its technical and regulatory team is arguably the most experienced in the advanced fission sector.
The Investment Thesis: Energy Infrastructure as AI Infrastructure
The AI energy convergence creates an investment opportunity that is both large in absolute terms and substantially underappreciated by the mainstream technology investment community. Most AI-focused investors are thinking about the software layer — model developers, application builders, infrastructure tooling. The hardware layer — GPUs, networking, storage — is well-covered by the hyperscalers and by the secondary market for AI compute. What is systematically undercovered is the energy infrastructure layer: the companies building the power generation, distribution, and storage technologies that will actually enable the AI compute buildout to continue.
This underinvestment is partly a matter of investor category. Energy infrastructure companies have historically been financed by infrastructure funds, utilities, and project finance vehicles, not by technology venture funds, and technology venture funds have not historically had the domain expertise to evaluate advanced fission reactors or industrial electrolyzer technology. The result is a category that is strategically critical to the technology industry but financed by capital that does not recognise that importance.
At Estes Capital, we view this as one of the most attractive structural mispricings in deep technology investing today. We have the scientific and engineering depth to evaluate advanced energy technologies at the level of technical rigor they require. We have the technology industry relationships to understand the demand dynamics — the actual power consumption profiles of hyperscale AI training, the realistic grid constraints, the procurement decision processes at Microsoft, Google, and Amazon. And we have the investment framework to hold the long-duration positions that energy infrastructure companies require, because we understand that the timeline from seed investment to market leadership in physical infrastructure is measured in decades, not quarters.
The Turntide Thesis: Efficiency Before Generation
Not all of the opportunity in the AI energy convergence sits on the generation side of the supply-demand equation. Turntide Technologies, which has raised $80 million to develop switched reluctance motor technology that reduces the energy consumption of industrial motors by 30–70%, represents the efficiency side of the same convergence. Electric motor-driven systems account for approximately 45% of global electricity consumption. HVAC systems, the heating, ventilation, and air conditioning plant, are the largest non-compute energy load in data centers, typically consuming 30–40% of a facility's total power budget.
Turntide's motor technology, which uses software-defined magnetic fields rather than fixed-geometry windings to achieve continuously variable efficiency across speeds and loads, is particularly relevant for data center cooling systems. A data center that reduces its HVAC load by 30% through more efficient motor technology effectively increases its available compute capacity, or equivalently reduces its power procurement requirement, by a material amount. At the scale of a hyperscale facility running 100+ megawatts, even a 10% facility-level efficiency improvement represents millions of dollars per year in energy cost savings and tens of thousands of tonnes of CO₂ per year in emissions reduction.
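The facility-level arithmetic is worth making explicit. The power price, HVAC share, and grid carbon intensity below are illustrative assumptions, not figures from Turntide or this essay:

```python
# Annual impact of a motor-efficiency upgrade on a data center's cooling load.
# All rates are illustrative assumptions.

FACILITY_MW = 100                # hyperscale facility running flat out
HVAC_SHARE = 0.35                # assumed HVAC share of total facility power
HVAC_REDUCTION = 0.30            # claimed HVAC load reduction from better motors
POWER_PRICE_USD_PER_KWH = 0.06   # assumed industrial electricity rate
GRID_KG_CO2_PER_KWH = 0.4        # assumed grid carbon intensity

facility_kwh_per_year = FACILITY_MW * 1_000 * 8_760   # MW -> kW, x hours/year
saved_share = HVAC_SHARE * HVAC_REDUCTION             # ~10.5% of facility load
saved_kwh = facility_kwh_per_year * saved_share
savings_usd = saved_kwh * POWER_PRICE_USD_PER_KWH
avoided_t_co2 = saved_kwh * GRID_KG_CO2_PER_KWH / 1_000

print(f"{saved_share:.1%} of facility load saved: "
      f"${savings_usd/1e6:.1f}M/yr, {avoided_t_co2:,.0f} t CO2/yr")
```

Under these assumptions a 30% HVAC reduction at a 100 MW facility saves roughly $5–6 million and avoids on the order of 35,000 tonnes of CO₂ per year; at multi-hundred-megawatt campuses the same percentages scale proportionally.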
The Turntide investment thesis is not as dramatic as the Oklo or Electric Hydrogen theses — it is incremental improvement in a technology that already exists rather than development of a fundamentally new generation approach. But it represents exactly the kind of "correctly early" investment that delivers predictable returns: the technology works, the market is enormous, the economics are compelling, and the only barriers are adoption and distribution. In a sector where even small efficiency improvements translate to very large absolute savings, the total addressable market is effectively the installed base of all electric motors worldwide — one of the largest markets in industrial technology.
The Next Five Years: What the Energy-AI Convergence Looks Like at Scale
Looking forward from 2024, the trajectory of the AI energy convergence is reasonably clear in its broad outlines, even if the specific companies that will capture the opportunity remain uncertain. The demand side — AI compute growth — will continue to accelerate, driven by expanding model capabilities, expanding application deployment, and the competitive dynamics among hyperscalers that make unilateral slowdown strategically untenable for any individual participant. The supply side — energy infrastructure — will continue to be constrained by the physical and regulatory barriers described above.
The resolution of this mismatch will likely come through a combination of three mechanisms. First, on-site generation: hyperscalers and large-scale AI operators will increasingly procure dedicated power generation capacity — advanced fission, dedicated renewables with co-located storage, or some combination — deployed at or near their compute facilities, bypassing the constrained grid. Second, efficiency improvement: the energy intensity per unit of compute will decrease as hardware, cooling, and software technologies improve. Third, geographic arbitrage: some AI compute workloads will migrate to locations where electricity is abundant and cheap, even at the cost of increased latency — countries with excess hydroelectric or geothermal capacity (Norway, Iceland, Canada) may become significant AI compute hubs.
Each of these resolution mechanisms creates investable opportunities. On-site generation benefits companies like Oklo (nuclear), Electric Hydrogen (hydrogen storage for time-shifted renewables), and the thermal management companies building cooling infrastructure for high-density compute. Efficiency improvement benefits companies like Turntide (motor efficiency), and the broader ecosystem of power electronics, thermal management, and compute architecture companies. Geographic arbitrage benefits data center real estate developers and the infrastructure companies that can deploy reliably in resource-constrained international markets.
The common thread across all of these opportunities is that the AI energy convergence is a structural, multi-decade driver of demand for advanced energy infrastructure — not a cyclical trend, not a speculative theme, but a physical consequence of the decisions that the technology industry has already made and cannot reverse. The companies that capture this opportunity will do so by understanding both the technology layer (AI compute requirements) and the energy layer (advanced generation and efficiency technology) with equal depth.
Estes Capital is positioned at this intersection. If you are building in advanced energy infrastructure, industrial efficiency, or the enabling technologies for the AI power stack, we want to hear about what you are working on. Reach out.