Microsoft says bug causes Copilot to summarize confidential emails (www.bleepingcomputer.com)

🤖 AI Summary
Microsoft has confirmed a significant bug in Microsoft 365 Copilot that causes the AI assistant to summarize confidential emails, bypassing data loss prevention (DLP) safeguards. The issue, discovered on January 21, specifically affects the Copilot "work tab" chat feature, which incorrectly accesses and processes emails carrying sensitivity labels that are meant to block automated tools from handling them. As a result, Copilot could summarize messages in users' Sent Items and Drafts folders, directly violating organizational privacy policies.

Microsoft attributes the problem to a code error and began rolling out a fix in early February. However, the company has not disclosed how many users are affected or provided a full timeline for resolution, indicating that the investigation is ongoing. The incident carries serious implications for the AI and machine learning community, particularly around data security and compliance in AI applications: as reliance on AI tools grows, it underscores the importance of maintaining robust safeguards around sensitive data and of verifying that AI features actually honor existing privacy controls in professional settings.