
Microsoft's AI Read Executives' Confidential Emails for a Month. Microsoft's Security Tools Were Supposed to Stop It.
A bug tracked as CW1226324 allowed Microsoft 365 Copilot to bypass Data Loss Prevention policies and summarize emails marked "Confidential" in users' Sent Items and Drafts folders. The flaw was active from at least January 21, 2026. Microsoft disclosed it publicly in mid-February and began rolling out a patch, nearly a month after the breach started.

The setup is almost too on-the-nose. Enterprise customers pay Microsoft for two products that are supposed to work together. The first is Microsoft Information Protection, which applies sensitivity labels ("Confidential," "Highly Confidential," "Internal Only") to documents and emails. The second is Copilot, the AI assistant embedded across Microsoft 365 that reads, summarizes, and acts on enterprise data. The entire selling proposition of DLP is that it governs what Copilot can see. When a CISO tags a board compensation memo as "Confidential," the expectation is that Copilot can't index it, summarize it, or surface it in a colleague's…
Continue reading on Dev.to