What does economics tell us about AGI? – Phil Trammell (epoch.ai)

🤖 AI Summary
In an episode of Epoch After Hours, economist Phil Trammell reviewed what economic theory actually implies about AGI, arguing that the current literature leans too heavily on a single growth family, semi-endogenous (Jones) models, and that this biases predictions about explosive growth. He highlights three problems: too few researchers engage the question; overreliance on models that treat research headcount as the binding constraint, which yields unrealistic "infinite" gains once R&D is automated; and a failure to consider alternative frameworks, such as Schumpeterian models, in which innovation is driven by scarce monopoly rents and automating R&D may therefore have limited effect.

Trammell stresses that economic theory is useful mainly for overturning naive intuitions, and that empirical grounding is essential. He advocates richer task-based approaches that infer the fraction of automated tasks from labor statistics, trends in task length, and staged "human-ability" benchmarks, rather than speculative compute-to-AGI mappings.

On empirical detection and policy-relevant prediction, Trammell revisits Nordhaus's "no singularity yet" claim, updating the data through 2021 and finding no macroeconomic signal of an impending explosion. He proposes a leading indicator, the network-adjusted capital share (how much revenue ultimately accrues to capital once supply chains are traced through), and finds semiconductor shares roughly flat, which he argues weakens the case for an imminent, industrially driven singularity.

A crucial caveat: if production exhibits increasing returns to scale at large AI-driven scales (fast replication of agents, shared knowledge, "Jupiter brains"), explosive growth becomes much more plausible. The takeaway: broadening the set of models, tracking concrete task and capital-share data, and testing for scale effects are vital to turning AGI economic speculation into grounded forecasts.
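The "infinite gains" worry about Jones-style models can be made concrete with a toy simulation (my illustration, not from the episode; all parameter values are invented). It integrates the semi-endogenous law dA/dt = δ·L^λ·A^φ with φ < 1: with a fixed research workforce the growth rate of ideas declines over time, but if effective researchers scale with A itself, as when R&D is automated, the effective exponent λ + φ exceeds one and growth accelerates toward a finite-time singularity.

```python
# Toy Jones-style semi-endogenous growth model (illustrative only):
#   dA/dt = delta * researchers**lam * A**phi,  with phi < 1.
# "Automated" R&D replaces the fixed workforce L with A itself, so the
# effective exponent on A becomes lam + phi = 1.5 > 1 -> acceleration.

def simulate(automated: bool, steps: int = 200, dt: float = 0.01,
             delta: float = 1.0, lam: float = 1.0, phi: float = 0.5):
    """Euler-integrate the technology level A; return its path."""
    A, L = 1.0, 1.0
    path = [A]
    for _ in range(steps):
        researchers = A if automated else L  # automation: A does the R&D
        A += dt * delta * researchers**lam * A**phi
        path.append(A)
    return path

def growth_rates(path):
    """Per-step proportional growth rates along a path."""
    return [(b - a) / a for a, b in zip(path, path[1:])]

manual = growth_rates(simulate(automated=False))
auto = growth_rates(simulate(automated=True))
# With fixed labor, the growth rate of A falls over time (phi < 1);
# with automated R&D, it rises without bound as A feeds back into itself.
```

The contrast is the point of the sketch: the same law that delivers steadily slowing idea growth under a fixed headcount delivers explosive growth once the headcount constraint is removed, which is why Trammell cautions against leaning on this one model family.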
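The network-adjusted capital share can be illustrated with a toy input-output calculation (the two industries and all numbers are invented for illustration): trace each dollar of final demand through the Leontief inverse and add up the capital payments generated along the whole supply chain, not just in the final industry.

```python
# Hypothetical 2-industry economy: 0 = "semiconductors", 1 = "other".
# a[i][j] = dollars of industry-i input used per dollar of industry-j output.
a = [[0.10, 0.05],
     [0.30, 0.40]]
# Direct share of each industry's value added that goes to capital.
kappa = [0.60, 0.30]

# Value added per dollar of gross output: 1 minus the column sums of a.
v = [1 - (a[0][j] + a[1][j]) for j in range(2)]

# Leontief inverse L = (I - A)^-1, via the 2x2 cofactor formula.
m = [[1 - a[0][0], -a[0][1]],
     [-a[1][0], 1 - a[1][1]]]
det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
L = [[ m[1][1] / det, -m[0][1] / det],
     [-m[1][0] / det,  m[0][0] / det]]

# Network-adjusted capital share of final good j: capital payments
# generated anywhere in the supply chain per dollar of final demand.
nacs = [sum(kappa[i] * v[i] * L[i][j] for i in range(2)) for j in range(2)]

# Sanity check: each dollar of final demand generates exactly one dollar
# of value added in total, so each share lies between min and max kappa.
total_va = [sum(v[i] * L[i][j] for i in range(2)) for j in range(2)]
```

Tracking a series like `nacs` for AI-relevant goods over time is the spirit of Trammell's proposed leading indicator: a sustained rise would signal revenue increasingly accruing to capital through supply chains, while the roughly flat semiconductor shares he finds cut against an imminent industrially driven singularity.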