🤖 AI Summary
Google announced "Project Suncatcher," a research moonshot to put its custom AI chips (specifically Trillium-generation TPUs) on satellites and eventually scale machine learning compute in orbit. The company plans two prototype satellites for early 2027, built with Planet Labs, and says early radiation tests in a particle accelerator indicate TPUs can survive low-Earth-orbit conditions. The vision is fleets of solar-powered satellites in near-constant sunlight, connected by high-bandwidth inter-satellite laser links, approaching the compute density of terrestrial data centers.
The plan is significant because it reframes data-center scaling around abundant space solar power and reduced terrestrial resource use (land, water), and it joins similar moves by SpaceX, Amazon interests, and startups like Starcloud. The key technical hurdles: thermal management in vacuum is nontrivial (Google suggests heat pipes and radiators but offers few details), and on-orbit reliability and high-bandwidth links back to the ground remain open problems. Google also notes launch-cost sensitivity, arguing that with reusable rockets and projected price drops to under ~$200/kg by the mid-2030s, space-based data centers could approach cost parity with the energy costs of terrestrial ones. If those problems are solved, the effort could change how large-scale AI compute is provisioned, but it hinges on breakthroughs in cooling, networking, reliability, and launch economics.
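The parity claim is essentially an amortization comparison: launch dollars per kilowatt of delivered power, spread over the hardware's lifetime, versus what a kilowatt of continuous terrestrial load costs in electricity per year. The sketch below shows the shape of that calculation. Only the ~$200/kg launch price comes from the summary above; the specific power, lifetime, and electricity price are illustrative assumptions, not figures from Google's announcement.

```python
# Back-of-envelope comparison: amortized launch cost vs. terrestrial energy cost.
# Only the ~$200/kg launch price is taken from the article; every other figure
# below is an illustrative assumption, not a number from Google's announcement.

LAUNCH_COST_PER_KG = 200.0        # USD/kg, the ~$200/kg mid-2030s projection cited above
MASS_PER_KW = 10.0                # assumed satellite mass (kg) needed per kW of delivered power
LIFETIME_YEARS = 5.0              # assumed operational lifetime of the on-orbit hardware

ELECTRICITY_PRICE_PER_KWH = 0.08  # assumed terrestrial electricity price, USD/kWh
HOURS_PER_YEAR = 8760


def launch_cost_per_kw_year() -> float:
    """Launch cost per kW of delivered power, amortized over the assumed lifetime."""
    cost_per_kw = LAUNCH_COST_PER_KG * MASS_PER_KW
    return cost_per_kw / LIFETIME_YEARS


def terrestrial_energy_cost_per_kw_year() -> float:
    """Cost of powering 1 kW of terrestrial load continuously for one year."""
    return ELECTRICITY_PRICE_PER_KWH * HOURS_PER_YEAR


if __name__ == "__main__":
    print(f"Amortized launch cost:   ${launch_cost_per_kw_year():.0f} per kW-year")
    print(f"Terrestrial energy cost: ${terrestrial_energy_cost_per_kw_year():.0f} per kW-year")
```

Under these assumptions the two figures land in the same order of magnitude, which is the sense in which "parity" is meant; the outcome swings heavily on the assumed mass-per-kilowatt and lifetime, which is why launch economics and on-orbit reliability dominate the argument.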