🤖 AI Summary
MIT announced CRESt (Copilot for Real-world Experimental Scientists), a multimodal AI platform and lab “copilot” described in Nature that integrates literature, databases, images, human feedback, robotic synthesis and characterization, and experimental data to plan, run, and iteratively refine real experiments. Built on active learning but extending beyond narrow Bayesian optimization, CRESt uses vision-language models and cameras to monitor runs and flag small errors that hurt reproducibility, and it accepts natural-language prompts so researchers can ask it to review microscope images, propose material recipes, and execute automated tests.
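The monitoring idea can be illustrated with a short sketch. This is not CRESt’s implementation: the `query_vlm` helper, its prompt, and its canned reply are hypothetical stand-ins for a vision-language model call that inspects a photo of a run and returns a pass/flag verdict.

```python
# Hypothetical sketch of vision-language monitoring for reproducibility.
# `query_vlm` stands in for a real vision-language model call; it is stubbed
# with a canned answer so the example runs without any external service.

def query_vlm(image_path: str, prompt: str) -> str:
    """Hypothetical VLM call: return a short natural-language verdict
    about the run photo. Stubbed for illustration."""
    return "FLAG: electrolyte level in the cell looks lower than the reference image."

def check_run(image_path: str) -> bool:
    """Ask the (stubbed) VLM whether the setup looks nominal.
    Returns True to proceed, False to pause the run for human review."""
    prompt = (
        "Compare this photo of the electrochemical cell against the expected "
        "setup. Reply 'OK' if nothing looks off, or 'FLAG: <reason>' otherwise."
    )
    verdict = query_vlm(image_path, prompt)
    if verdict.startswith("FLAG"):
        print(f"Run paused for review: {verdict}")
        return False
    return True

if __name__ == "__main__":
    check_run("run_042_cell_photo.jpg")
```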
Technically, CRESt fuses multimodal sources to build richer priors and explore broader hypothesis spaces than conventional optimization, then closes the loop by running experiments with robotics and folding the results (plus human critique) back into its models. In a fuel-cell demonstration it evaluated >900 chemistries and ran ~3,500 electrochemical trials in three months, discovering a multielement catalyst with reduced palladium content and record performance. The platform’s significance lies in accelerating materials discovery, improving reproducibility through active monitoring, and framing AI as an interactive experimental partner rather than a black-box optimizer, paving the way for more flexible self-driving labs across scientific domains.
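The closed loop itself resembles a standard active-learning recipe: fit a surrogate model on the trials run so far, score untested candidate chemistries with an acquisition function, run the most promising one, and fold the result back in. The sketch below uses a Gaussian-process surrogate with expected improvement; the `run_electrochemical_trial` stub, the four-element composition space, and the synthetic objective are illustrative assumptions, not details from the paper.

```python
# Minimal closed-loop active-learning sketch (not CRESt itself): a Gaussian
# process surrogate plus expected-improvement acquisition over candidate
# catalyst compositions. The "experiment" is a synthetic stand-in.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
ELEMENTS = ["Pd", "Pt", "Cu", "Ni"]  # hypothetical composition space

def run_electrochemical_trial(x: np.ndarray) -> float:
    """Hypothetical experiment: returns a synthetic 'performance' score.
    A real loop would dispatch robotic synthesis and electrochemical testing."""
    target = np.array([0.15, 0.25, 0.35, 0.25])  # arbitrary optimum with low Pd
    return float(np.exp(-20 * np.sum((x - target) ** 2)) + 0.01 * rng.normal())

def expected_improvement(mu, sigma, best, xi=0.01):
    """Standard EI acquisition: favours points that are promising and uncertain."""
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best - xi) / sigma
    return (mu - best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Seed the loop with a few random compositions (element fractions summing to 1).
X = rng.dirichlet(np.ones(len(ELEMENTS)), size=5)
y = np.array([run_electrochemical_trial(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(20):
    gp.fit(X, y)
    candidates = rng.dirichlet(np.ones(len(ELEMENTS)), size=500)
    mu, sigma = gp.predict(candidates, return_std=True)
    pick = candidates[np.argmax(expected_improvement(mu, sigma, y.max()))]
    result = run_electrochemical_trial(pick)            # "run" the chosen recipe
    X, y = np.vstack([X, pick]), np.append(y, result)   # fold the result back in

best = X[np.argmax(y)]
print("Best composition found:", dict(zip(ELEMENTS, best.round(3))))
```

Per the summary, CRESt’s advantage over a bare loop like this is that its priors come from literature, databases, images, and human feedback rather than only from the trials themselves, which lets it explore a broader hypothesis space.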