🤖 AI Summary
A team built ProteoGPT, a 124M-parameter protein large language model pre-trained on 609,216 high-quality Swiss-Prot sequences, then used transfer learning to create a pipeline of specialized submodels—AMPSorter (an AMP classifier), BioToxiPept (a toxicity classifier), and AMPGenix (a sequence generator)—to mine and generate antimicrobial peptides (AMPs) at scale. Using this ensemble, they screened hundreds of millions of peptide candidates, yielding both mined and de novo-generated AMPs with potent in vitro activity against ICU-derived carbapenem-resistant Acinetobacter baumannii (CRAB) and methicillin-resistant Staphylococcus aureus (MRSA) and a reduced propensity for resistance development. In mouse thigh-infection models, the peptides matched or outperformed clinical antibiotics without detectable organ damage or gut-microbiome disruption. Mechanistic assays point to membrane disruption and depolarization as the primary killing modes, and the peptides also exhibited anti-inflammatory effects.
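To make the screening ensemble concrete, here is a minimal Python sketch of how generated candidates might be filtered by an AMP classifier and a toxicity classifier in sequence. The model names come from the paper, but the class layout, scoring stubs, and thresholds below are illustrative assumptions, not the authors' code.

```python
import random
from dataclasses import dataclass

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

@dataclass
class Candidate:
    sequence: str
    amp_score: float = 0.0       # hypothetical P(sequence is an AMP)
    toxicity_score: float = 0.0  # hypothetical P(sequence is toxic)

def score_amp(sequence: str) -> float:
    """Stand-in for AMPSorter: a real pipeline would run the
    fine-tuned classifier here; this stub returns a random score."""
    return random.random()

def score_toxicity(sequence: str) -> float:
    """Stand-in for BioToxiPept: same caveat as above."""
    return random.random()

def screen(candidates, amp_threshold=0.9, tox_threshold=0.1):
    """Keep sequences the AMP classifier accepts and the toxicity
    classifier rejects; both cutoffs are illustrative assumptions."""
    kept = []
    for seq in candidates:
        c = Candidate(seq, score_amp(seq), score_toxicity(seq))
        if c.amp_score >= amp_threshold and c.toxicity_score <= tox_threshold:
            kept.append(c)
    return kept

if __name__ == "__main__":
    # Toy candidate pool; the study screened hundreds of millions of sequences.
    pool = ["".join(random.choices(AMINO_ACIDS, k=random.randint(8, 15)))
            for _ in range(1000)]
    hits = screen(pool)
    print(f"{len(hits)} of {len(pool)} candidates passed both filters")
```

The two-stage filter mirrors the described workflow: a high-recall AMP screen followed by a toxicity screen tuned to minimize false negatives.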
Key technical points: ProteoGPT’s curated Swiss-Prot foundation aims for biologically faithful pretraining. AMPSorter achieved an AUC of 0.97 and an AUPRC of 0.96 on a stringent, low-similarity benchmark set and maintained 96.4% precision on sequences containing unnatural amino acids. BioToxiPept reached an AUPRC of ≈0.92 for toxicity detection, lowering false negatives in validation. AMPGenix generates 8–15-residue peptides in an unconstrained manner, using temperature-controlled sampling to balance diversity and novelty. Together, these results demonstrate a practical generative-AI workflow for high-throughput AMP discovery that accelerates exploration of peptide space and yields therapeutically promising candidates against multidrug-resistant pathogens.
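As a small illustration of the temperature-controlled sampling mentioned above, the sketch below generates 8–15-residue peptides autoregressively over the 20 standard amino acids. The dummy logit function and helper names are assumptions for illustration only and do not reflect AMPGenix’s actual architecture or weights.

```python
import numpy as np

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")
rng = np.random.default_rng(0)

def next_token_logits(prefix: str) -> np.ndarray:
    """Stand-in for a trained autoregressive model's forward pass.
    Returns one logit per amino acid; here they are just random."""
    return rng.normal(size=len(AMINO_ACIDS))

def sample_peptide(length: int, temperature: float = 1.0) -> str:
    """Sample a peptide token by token. Lower temperature sharpens the
    distribution (less diverse output); higher temperature flattens it."""
    seq = ""
    for _ in range(length):
        logits = next_token_logits(seq) / temperature
        probs = np.exp(logits - logits.max())   # numerically stable softmax
        probs /= probs.sum()
        seq += rng.choice(AMINO_ACIDS, p=probs)
    return seq

if __name__ == "__main__":
    for t in (0.5, 1.0, 1.5):
        length = int(rng.integers(8, 16))  # 8-15 residues, as in the summary
        print(f"T={t}: {sample_peptide(length, temperature=t)}")
```

Dividing the logits by a temperature before the softmax is the standard knob for trading off diversity against fidelity in such generators.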