If AI is a GPT, Are We Missing a Complement?
Examining Current AI Discourse Through a History‑Informed Lens
Abstract
Artificial intelligence is plausibly a general‑purpose technology (GPT), and the current AI discourse overlooks at least one potentially decisive complement: the ability to scale continuously available, low‑carbon electricity. Drawing on nineteenth‑century mis‑predictions—Qing vs. Tokugawa and Ottoman/Turkish vs. Meiji—this paper argues that rapidly deployable nuclear‑fission baseload, especially via small‑modular reactors (SMRs), may determine which economies can translate frontier AI into broad productivity gains. Empirical power‑curve data show frontier‑model training demand doubling roughly every year and inference demand doubling every three years, implying a plausible multi‑hundred‑terawatt‑hour baseload gap by the early 2030s—far beyond the scaling potential of renewables‑plus‑storage. Mainland China's 150 GW nuclear pipeline illustrates how an overlooked complement can invert the rankings suggested by GPU or talent metrics alone: on energy readiness, China is outpacing the United States despite the latter's semiconductor lead.
The analysis concludes that states should hedge complement uncertainty by fast‑tracking SMR licensing and finance alongside other policy initiatives. Otherwise, today’s AI‑readiness dashboards may prove as myopic as nineteenth‑century trade‑balance tables that failed to foresee Japan’s industrial leapfrogging.
1. Introduction
A general‑purpose technology (GPT) is a breakthrough that, like the steam engine or electrification, finds applications across many sectors, improves rapidly, and demands complementary investments—skills, infrastructure, business models—before its full economic impact is felt. Artificial intelligence—especially its prospective culmination in artificial general intelligence (AGI)—fits this definition: it is already diffusing into code generation, molecular design, and logistics; its capabilities improve on an annual cadence; and it demands costly complements of its own.
Current AI‑readiness conversations typically underline a familiar constellation of complements: frontier‑class GPU clusters and specialised accelerators; vast, well‑curated proprietary data sets; deep pools of machine‑learning talent (often buttressed by liberal immigration regimes); clear regulatory frameworks for data privacy, liability, and safety evaluation; and ample venture or sovereign capital capable of underwriting billion‑dollar training runs. These are crucial—but they share an unstated assumption: electricity supply can scale up as necessary and does not form a bottleneck. This paper argues that nuclear fission capacity, especially via small‑modular reactors, may prove a more decisive complement—yet it is largely absent from current readiness indices and investment road‑maps.
2. General‑Purpose Technologies
General‑purpose technologies (GPTs) are foundational innovations whose impact radiates far beyond any one sector. Economists Bresnahan and Trajtenberg (1995) define a GPT by three traits:
Pervasiveness—the technology can be applied across many industries (e.g. steam engines powered mines, factories, and railways).
Continuous technical improvement—the core technology keeps getting cheaper and more capable, making new uses viable (electric motors from Edison dynamos to micro‑drives).
Wide‑ranging complementarities—real benefits emerge only when firms and governments invest in co‑inventions such as new skills, business models, or infrastructure.
Classic GPTs include the Watt steam engine, electrification, and digital computing. Each triggered growth waves only after decades of complementary investment: factory re‑layout for electric motors, interstate highways for automobiles, organisational IT restructuring for computers.
Artificial intelligence—especially its potential culmination in artificial general intelligence (AGI)—exhibits the same signature. Large language models already play roles in code generation, drug design, and logistics; capabilities improve on an annual cadence; and they demand costly complements ranging from specialised chips to data‑governance regimes. Whether AGI becomes transformative therefore hinges less on raw algorithmic progress than on which societies assemble the right hidden complements. The sections that follow show how past observers repeatedly mis‑identified those complements—and why energy‑system readiness may prove the overlooked GPT enabler today.
3. Historical Mis‑Specification
3.1 Tokugawa Japan vs. High‑Qing Mainland China (c. 1800)
The transformative GPT on the horizon in 1800 was the steam‑powered factory system—initially textile spindles and iron rolling mills. European merchants assumed the Qing Empire, with its gigantic domestic market and entrenched silver flows, would adopt industrial steam first. Yet two unseen complements were brewing in Tokugawa Japan:
Mass literacy and numeracy. By 1800 as many as 40 % of commoners attended terakoya schools; printed kanbun primers circulated through commercial lending libraries. That literacy pool became essential for running and maintaining imported steam machinery and for double‑entry cloth‑mill accounting.
Proto‑financial depth. Osaka’s rice‑futures exchange—arguably the world’s first standardised commodities market—provided credit instruments, forward contracts and risk‑transfer mechanisms that lowered capital costs for mechanised mills.
Qing China, despite larger factor endowments, lacked both complements: broad‑based literacy remained confined to the exam‑elite; merchant remittance networks were powerful but regionally siloed. When Western iron steamers arrived, Japan could mobilise talent and finance in a decade; the Qing could not. By 1890 Meiji cotton‑spindle capacity exceeded China’s threefold, and Japan fielded steam‑powered warships that defeated the Beiyang Fleet.
3.2 Meiji Japan vs. Ottoman/Turkish Reforms (1868–1923)
By the 1880s the next GPT wave—late‑steam railways, telegraphy and early electrification—required both engineering talent and centralised fiscal capacity. European economists wagered the Ottoman Empire would industrialise alongside Japan: it enjoyed access to London capital markets and was already importing British locomotives. Two mis‑specified complements upended that forecast:
Universal elementary schooling. Japan’s 1872 Gakusei law pushed enrolment above 90 % by 1905, creating a broad technician and clerk class that could absorb telegraph codes, blueprint reading and dynamo maintenance. Ottoman primary enrolment remained under 20 % until the Young Turk era, limiting the skilled workforce.
Cohesive fiscal centralisation. The Meiji state overhauled tax collection, created the Bank of Japan (1882), and channelled surplus into zaibatsu industrial combines. Ottoman finance remained fragmented across provinces and subject to European debt control, choking railway extensions and delaying electrified streetcar concessions.
By 1913 Japan’s factory output was growing ~9 % annually, while Ottoman industrial share stagnated. Agrarian exports, not electrified factories, dominated Turkish GDP through the 1920s.
Lesson. Observers have repeatedly failed to identify the actual complements a GPT would require. That recurring blind spot warns us that today's overlooked complement may be something as unglamorous, yet indispensable, as Osaka merchants trading rice futures.
4. Complement Uncertainty Today
“If you look at Chinese power production versus U.S. power production, the Chinese graph is straight up and to the right … AI capabilities ultimately depend on incredible amounts of energy.” — Alexandr Wang (CSIS 2025)
4.1 Why electricity moves from cost variable to hard constraint
The transformative nature of AI makes decade‑long forecasts especially treacherous. Language‑model inference demand depends on how deeply AGI capabilities become woven into the wider economy, an adoption curve that is itself hard to forecast ex ante, so its decade‑scale growth is intrinsically uncertain. Training demand for frontier models, by contrast, follows a clearer hardware‑scaling trajectory that analysts have tracked for half a decade. That distinction shapes the stress‑test below, which treats AI workloads as two very different energy trajectories: training power demand doubling roughly every twelve months (Epoch AI 2025), and inference demand doubling roughly every three years as adoption broadens.
Mainstream forecasts—such as the IEA's 2024 Electricity outlook, which projects data‑centre use merely doubling by 2030—rely on linear extrapolations of efficiency gains and assume limited adoption breadth. Our stress‑test suggests those models may severely underestimate long‑run electricity demand because they omit two compounding drivers: (i) scaling laws that push each new frontier model's power draw up by roughly an order of magnitude and (ii) second‑order economic diffusion once AGI becomes a genuinely general‑purpose technology. If either factor materialises at even half the historical rate observed in early LLM deployments, aggregate load would already exceed the IEA's 'high' scenario by the early 2030s.
By 2035 AI could plausibly require a continuous electricity flow equivalent to 30 % of all 2024 global generation—most of it 24 / 7 baseload for multi‑week training runs.
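The doubling arithmetic behind this stress test can be sketched in a few lines. The 2024 baseline loads below are illustrative placeholders, not sourced estimates; only the doubling times come from the text.

```python
# Illustrative stress test: extrapolate AI electricity demand under fixed
# doubling times. Baseline TWh figures are assumed placeholders.

def projected_demand(base_twh: float, doubling_years: float, horizon_years: float) -> float:
    """Annual demand (TWh) after horizon_years, given a fixed doubling time."""
    return base_twh * 2 ** (horizon_years / doubling_years)

horizon = 7  # 2024 -> 2031, the "early 2030s" window

# Full-speed scenario: training doubles yearly, inference every 3 years.
training_full = projected_demand(20.0, 1.0, horizon)    # ~2,560 TWh
inference_full = projected_demand(100.0, 3.0, horizon)  # ~504 TWh

# Half-speed scenario: doubling times stretched to 2 and 6 years.
training_half = projected_demand(20.0, 2.0, horizon)    # ~226 TWh
inference_half = projected_demand(100.0, 6.0, horizon)  # ~224 TWh

print(f"Full speed: {training_full + inference_full:,.0f} TWh/yr")
print(f"Half speed: {training_half + inference_half:,.0f} TWh/yr")
```

Even the half-speed case lands in the multi-hundred-terawatt-hour range the abstract cites, while the full-speed case dwarfs today's entire data-centre sector; this is why the stress test treats mainstream "merely doubling" projections as lower bounds.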
4.2 Why only nuclear fission can close the 24 / 7 gap in the 2030s window
Intermittent renewables require massive over‑build plus long‑duration storage to deliver round‑the‑clock power. Delivering a steady 1 GW load from solar‑wind hybrids typically demands 3‑5 GW of nameplate capacity plus 8‑10 hours of battery or pumped‑hydro storage, pushing levelised costs above $120 MWh⁻¹ (IEA 2024). Scaling that to multi‑gigawatt AI campuses compounds mineral bottlenecks—lithium, nickel, rare‑earth magnets—already stressed by EV demand.
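A quick energy-balance check shows where the 3–5 GW overbuild figure comes from; the capacity factors below are assumed illustrative values, not sourced data.

```python
# Back-of-envelope overbuild check for firm power from a solar-wind hybrid.
# Capacity factors are assumed illustrative values.

firm_load_gw = 1.0                 # steady load to be served
solar_cf, wind_cf = 0.25, 0.35     # assumed annual capacity factors

# 50/50 hybrid capacity factor
hybrid_cf = 0.5 * solar_cf + 0.5 * wind_cf   # 0.30

# Energy-balance minimum nameplate; real systems need more to ride
# through correlated lulls, which is why quoted figures reach 4-5 GW.
nameplate_gw = firm_load_gw / hybrid_cf      # ~3.3 GW

# Storage sized for 10 hours of firm output
storage_gwh = firm_load_gw * 10              # 10 GWh

print(f"Minimum nameplate: {nameplate_gw:.1f} GW")
print(f"Storage (10 h):    {storage_gwh:.0f} GWh")
```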
Fossil generation can deliver 24 / 7 power today, but doing so at scale clashes with tightening carbon‑pricing regimes and statutory net‑zero targets. At a CO₂ price of US $60–100 t⁻¹ (already legislated in the EU ETS and proposed in several U.S. states), unabated gas‑turbine electricity exceeds US $110 MWh⁻¹—on par with or costlier than SMRs, while adding > 400 g CO₂ kWh⁻¹. Carbon‑capture retrofits can cut emissions by 90 % but add another $25‑40 MWh⁻¹ and consume ~15 % of plant output, further eroding economic viability. Consequently, most G20 decarbonisation road‑maps leave little headroom for new unabated fossil capacity in the 2030s, making it a politically and financially risky bet for long‑term AI baseload.
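The gas-plus-carbon-price claim is simple arithmetic. The unabated-plant base cost below is an assumed figure chosen for illustration; only the carbon-price range, emissions intensity, and CCS penalties come from the text.

```python
# Carbon-price arithmetic for unabated gas, per the range cited above.
# base_cost is an assumed fuel + O&M figure, not a sourced LCOE.

base_cost = 90.0        # assumed US$/MWh for an unabated gas turbine
emissions = 0.4         # tCO2 per MWh (~400 g CO2/kWh)

for co2_price in (60, 100):          # US$/tCO2 range from the text
    total = base_cost + co2_price * emissions
    print(f"CO2 @ ${co2_price}/t -> ~${total:.0f}/MWh")  # ~$114 and ~$130

# CCS retrofit: ~90% capture, but $25-40/MWh extra cost and ~15% of
# plant output consumed; mid-range illustration below.
ccs_total = (base_cost + 30.0) / (1 - 0.15)
print(f"With CCS (mid-range): ~${ccs_total:.0f}/MWh")
```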
Hydro & geothermal can offer true baseload in a handful of locations—think Iceland’s geothermal datacentres or Norway’s surplus hydro—but their global expansion potential is severely limited. The best hydro sites are already dammed; new large dams face ecological‑social pushback, and prospective capacity adds only ~70 GW worldwide by 2035 (IEA 2024). Geothermal is constrained by favourable geology; even generous DOE estimates put scalable flash‑steam potential below 200 GW globally—< 3 % of the stress‑test AI load. Crucially, most of the regions where large‑scale AI compute centres are clustering—Pacific Northwest cloud zones, Northern Virginia’s "Data‑Center Alley," Phoenix’s solar‑fed campuses, Dublin and Frankfurt in the EU, and the Seoul‑Incheon and Shenzhen‑Guangzhou megaregions in Asia—possess little untapped hydro headroom and minimal high‑enthalpy geothermal resources. The compute hubs cannot count on local renewable baseload, sharpening the case for deploy‑anywhere fission.
Small‑modular reactors (SMRs) offer a fundamentally different nuclear‑build paradigm:
Factory‑fabricated modules. Pressure vessels, steam generators, and containment systems are built on dedicated production lines, shipped by rail or barge, and installed “plug‑and‑play” on site—cutting quality‑control risks that plagued gigawatt‑scale stick‑built reactors.
Shorter schedules. Standardised modules plus reduced on‑site civil works target 36–48 months from first concrete to grid‑sync for first‑of‑a‑kind (FOAK) units; subsequent clone copies could drop to < 30 months.
High capacity factor (≥ 90 %) and load‑following. Passive safety systems (natural‑circulation core cooling, gravity‑fed ECCS) allow flexible output without costly fossil peakers, making SMRs ideal for variable AI‑training loads that nonetheless require 24 / 7 availability.
10–300 MW electric blocks. A single 77 MW NuScale VOYGR or 100 MW GE‑Hitachi BWRX‑300 module can be co‑located with a hyperscale datacentre; sites can scale by adding more pods as demand grows—mirroring cloud “availability‑zone” doctrine.
Minimal exclusion zones. Integrated containment and passive decay‑heat removal shrink emergency‑planning zones to the plant fence line (< 1 km), unlocking brownfield siting at retired coal plants or even industrial parks.
Competitive cost trajectory. DOE Cost Study (2024) projects US $75–90 MWh⁻¹ LCOE for nth‑of‑a‑kind SMRs by early 2030s—20–40 % below dispatchable renewables + storage and ~30 % below unabated gas at $80 t‑CO₂.
Environmental profile. Life‑cycle analyses show SMRs emit < 15 g CO₂‑eq kWh⁻¹—on par with wind and an order of magnitude below gas with CCS (UNSCEAR 2016; IPCC 2022). Land footprint is tiny: a 300 MW SMR campus fits in ~12 ha versus > 5 000 ha of solar‑plus‑storage for the same firm output. Cooling demand can be air‑based or use recycled municipal water, mitigating freshwater constraints. High‑level waste volume per kWh is comparable to large reactors, but smaller core inventories enable on‑site dry‑cask storage for decades, buying time for eventual deep‑geologic repositories.
These traits map cleanly onto AI‑compute needs: modularity matches datacentre scaling, high uptime suits multi‑week training runs, small footprint eases siting near load, and low life‑cycle carbon aligns with net‑zero mandates.
4.3 Fission as a Hidden GPT Complement
The examples above imply an unsettling parallel with the 19th‑century cases: observers in 1800 tracked Qing trade volumes yet missed Tokugawa literacy; planners in 1880 prized Ottoman access to European capital yet overlooked universal schooling. In the same vein, rapidly deployable fission capacity may constitute the overlooked complement that determines which states can transform AI from laboratory showcase to economy‑wide GPT. If the stress‑test electricity gap materialises and only a subset of nations can bring SMRs online at scale—due to supply‑chain depth, licensing agility, or public‑finance tools—those nations could enjoy a multi‑year window of compute abundance that accelerates domestic productivity, defence applications, and AI research itself. The complement is “hidden” not because energy is unfamiliar, but because today’s readiness frameworks treat energy as a price variable, not a binding technical pre‑condition.
Mainstream projections—IEA’s “data‑centre demand doubles” case, Goldman Sachs’ 3× scenario—anchor policy debates, yet they almost certainly low‑ball plausible electricity demand because they assume linear efficiency gains and limited GPT diffusion. When forecasters under‑shoot the requirement by an order of magnitude, energy is framed as a cost variable, not a capacity constraint.
That framing means the ability to rapidly scale fission baseload—particularly via SMRs—plausibly remains a complement whose very status as a complement goes under‑recognized ex ante. If nuclear agility turns out to be decisive, today's readiness dashboards will look as myopic in hindsight as the nineteenth‑century benchmarks discussed above.
5. Illustrative 2025 Contrast: Mainland China vs. United States
Mainland China and the United States appear neck‑and‑neck on most headline AI‑readiness metrics—GPU clusters, talent pipelines, venture funding—but they diverge sharply on their ability to deliver continuous, low‑carbon baseload in the crucial 2025–2035 window: Beijing's build pipeline approaches 150 GW of new nuclear capacity, while comparable U.S. additions depend on SMR designs still moving through licensing.
Implications. If fission baseload turns out to be the hidden complement that gates AGI scale, Beijing could unlock tens of GW of new 24 / 7 power by the early 2030s, while Washington might still be waiting for FOAK SMRs to clear regulatory and funding hurdles. That gap mirrors the 19th‑century mis‑specification: outside observers bet on the Ottoman Empire's European capital inflows, yet Japan's overlooked literacy and fiscal cohesion let it industrialise first. Today's analysts may overweight U.S. semiconductor design dominance while under‑weighting Mainland China's energy‑infrastructure surge.
6. Conclusion
History teaches a sober lesson: the complements that unlock a general‑purpose technology often remain invisible until after diffusion begins. Early‑nineteenth‑century merchants prized Qing silver flows and Ottoman access to London capital, yet mass literacy, fiscal cohesion, and proto‑finance turned out to be the catalytic enablers in Meiji Japan. Today’s AI‑readiness dashboards risk the same error. By treating electricity as a priced‑in commodity rather than a hard technical prerequisite, they under‑estimate a looming constraint that even optimistic energy outlooks do not capture.
If the empirically observed 12‑month doubling in training power and three‑year doubling in inference load persist—even at half speed—AI workloads alone could require hundreds of terawatt‑hours of continuous, low‑carbon baseload within a decade. Intermittent renewables with storage, geographically capped hydro, and politically constrained fossil generation cannot scale quickly or cleanly enough. Small‑modular fission reactors are, for now, the only technology offering grid‑agnostic, 90 %‑uptime power at a levelised cost below that of dispatchable renewables plus storage. Yet not one major readiness index tracks SMR licensing agility, supply‑chain depth, or nuclear‑finance instruments.
The policy implication is straightforward. States that wish to lead—or simply remain competitive—in an AGI world must adopt portfolio hedging against complement uncertainty. Rapid fission build‑out should sit alongside chip subsidies, data‑governance reforms, and talent visas. Governments that begin permitting, financing, and standardising SMRs today will create an option value that those who wait cannot replicate on short notice.
In the nineteenth century, literacy and proto‑finance were the hidden complements that let late‑arriving Japan overtake larger polities. In the twenty‑first, scalable nuclear baseload may play the same quiet, decisive role.
References
Bresnahan, Timothy F., and Manuel Trajtenberg. 1995. “General Purpose Technologies: ‘Engines of Growth’?” Journal of Econometrics 65 (1): 83–108.
Datawrapper. 2025. “Nuclear-Build Readiness: Mainland China vs. United States.”
DOE (U.S. Department of Energy). 2024. Advanced Small Modular Reactor Cost Study: nth-of-a-Kind Projections for the Early 2030s. Washington, DC: Office of Nuclear Energy.
Epoch AI. 2025. Compute and Power Trends Across Large AI Models, 2019-2025. Epoch Policy Brief 11.
Goldman Sachs Research. 2024. “The Power Behind the Cloud: AI Electricity Demand Could Triple by 2030.” Global Energy Note, October 2024.
IEA. 2024. Electricity 2024 – Annex: Artificial Intelligence and Data-Centre Outlook. Paris: International Energy Agency.
IPCC. 2022. Climate Change 2022: Mitigation of Climate Change. Contribution of Working Group III to the Sixth Assessment Report. Geneva: Intergovernmental Panel on Climate Change.
OECD. 2024. OECD AI Policy Observatory – Going Digital Indicators. Paris: Organisation for Economic Co-operation and Development.
Oxford Insights. 2024. Government AI Readiness Index 2024. London: Oxford Insights.
Stanford Institute for Human-Centered AI. 2025. AI Index Report 2025. Stanford, CA: Stanford University.
UNSCEAR. 2016. Sources, Effects and Risks of Ionizing Radiation: 2016 Report to the General Assembly. New York: United Nations Scientific Committee on the Effects of Atomic Radiation.
Wang, Alexandr. 2025. “Scale AI’s Alexandr Wang on Securing U.S. AI Leadership.” Remarks at the Center for Strategic and International Studies, May 1, 2025. Transcript lines 121–124.
Wells Fargo Securities. 2024. “AI Compute Energy Demand: A First Look at the Power Curve.” Investor Note, March 2024.