🤖 AI Summary
Microsoft’s Gaming Copilot was found to be collecting users’ screenshots for LLM training by default: a ResetEra user (“RedbullCola”) uncovered network traffic showing that the Copilot app performs OCR on game screenshots and transmits the extracted text to Microsoft servers. The outlet independently verified the behavior and found that the model-training toggle for screenshot text collection is enabled out of the box (voice-chat training remains off by default). Users can disable it via Game Bar → Gaming Copilot → Settings → Privacy by turning off the training toggle.
This is significant because automatic screenshot harvesting can leak sensitive or proprietary text (e.g., NDA-protected game content) and may expose personal data without clear notice or consent. From a technical and compliance standpoint, sending OCR’d text to train LLMs raises data-minimization and transparency issues under regimes like GDPR, potentially creating legal exposure for Microsoft and privacy risks for users. Immediate practical takeaway: users should audit the Gaming Copilot privacy setting, and developers/operators should treat automated screenshot collection as high-risk telemetry that requires explicit opt-in, clear disclosure, and careful filtering before inclusion in training pipelines.
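To make the "careful filtering" point concrete, here is a minimal, hypothetical sketch of the kind of pre-ingestion redaction a training pipeline could apply to OCR'd screenshot text. The pattern list, function names, and drop policy are illustrative assumptions, not Microsoft's actual pipeline:

```python
import re

# Hypothetical illustration: treat OCR'd screenshot text as high-risk
# telemetry and redact obvious personal data before it can enter a
# training corpus. Patterns here are examples, not an exhaustive list.
SENSITIVE_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),   # email addresses
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),    # card-like digit runs
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),     # US SSN format
]

def redact(text: str, placeholder: str = "[REDACTED]") -> str:
    """Replace matches of known sensitive patterns with a placeholder."""
    for pattern in SENSITIVE_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

def filter_for_training(samples: list[str]) -> list[str]:
    """Redact each OCR sample; drop any sample that needed redaction,
    on the conservative assumption that it may contain more leakage."""
    cleaned = [redact(s) for s in samples]
    return [s for s in cleaned if "[REDACTED]" not in s]
```

A conservative policy like this (drop rather than keep redacted samples) trades corpus size for lower leakage risk; real pipelines would also need human review and per-jurisdiction consent checks.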