OpenAI cofounder says scaling compute is not enough to advance AI: 'It's back to the age of research again' (www.businessinsider.com)

πŸ€– AI Summary
OpenAI cofounder Ilya Sutskever told the Dwarkesh Podcast that the era of simply scaling up compute and hoovering up data, the dominant strategy of the past half-decade for improving LLMs and image models, is hitting diminishing returns. He argued that companies have already amassed large GPU farms and datasets, and that throwing 10x–100x more compute at current architectures won't by itself produce transformative gains. "So it's back to the age of research again, just with big computers," he said, emphasizing that compute remains important but must be paired with new scientific breakthroughs to be used productively.

The significance for the AI community is practical and strategic: expect a pivot from primarily engineering and infrastructure investment toward fundamental research on algorithms, learning theory, and sample efficiency. The key technical priorities Sutskever highlighted are improving generalization, so that models learn from small amounts of data more like humans do, and developing methods that extract more value from existing compute and data. That implies renewed emphasis on architectures, training objectives, inductive biases, and evaluation metrics, and it could reshape funding, talent allocation, and safety work as firms seek higher-leverage advances rather than brute-force scaling.