Microsoft Copilot ignored sensitivity labels twice in eight months — and no DLP stack caught either one
For four weeks starting January 21, Microsoft's Copilot read and summarized confidential emails despite every sensitivity label and DLP policy telling it not to. The enforcement points broke inside Microsoft's own pipeline, and no security tool in the stack flagged it. Among the affected organizations was the U.K.'s National Health Service, which logged the issue as INC46740412, a sign of how far the failure reached into regulated healthcare environments. Microsoft tracked it as CW1226324.

The advisory, first reported by BleepingComputer on February 18, marks the second time in eight months that Copilot's retrieval pipeline violated its own trust boundary: a failure in which an AI system accesses or transmits data it was explicitly restricted from touching. The first was worse. In June 2025, M
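To make the failure mode concrete, here is a minimal sketch of where a label-aware enforcement point is supposed to sit in a retrieval pipeline: documents get filtered against their sensitivity labels before anything reaches the model. The label names, the Document type, and the filter_for_copilot function below are hypothetical illustrations, not Microsoft's implementation; the incident described above is exactly the scenario where an equivalent check stopped being applied.

```python
from dataclasses import dataclass

# Hypothetical label ordering, loosely modeled on Purview-style sensitivity tiers.
# This is not Microsoft's enforcement code; it only shows where a retrieval-time
# check belongs relative to the model.
LABEL_RANK = {"Public": 0, "General": 1, "Confidential": 2, "Highly Confidential": 3}


@dataclass
class Document:
    doc_id: str
    body: str
    sensitivity_label: str  # e.g. "Confidential"


def filter_for_copilot(docs: list[Document], max_label: str = "General") -> list[Document]:
    """Drop any document whose label exceeds the ceiling the assistant may read.

    Unknown labels are treated as most restrictive (fail closed) rather than
    silently passed through -- the opposite of what the advisory describes.
    """
    ceiling = LABEL_RANK[max_label]
    allowed = []
    for doc in docs:
        rank = LABEL_RANK.get(doc.sensitivity_label, max(LABEL_RANK.values()))
        if rank <= ceiling:
            allowed.append(doc)
        # else: the document never reaches retrieval, summarization, or the prompt
    return allowed


if __name__ == "__main__":
    inbox = [
        Document("msg-001", "Quarterly all-hands notes", "General"),
        Document("msg-002", "Patient transfer details", "Highly Confidential"),
        Document("msg-003", "Unlabeled legacy email", "Unknown"),
    ]
    readable = filter_for_copilot(inbox)
    print([d.doc_id for d in readable])  # -> ['msg-001']
```

The point of the sketch is placement, not policy: the check has to run before retrieval hands content to the model, because once restricted text lands in a prompt, no downstream DLP scanner in the stack is positioned to see it, which is consistent with why nothing flagged the failure here.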