Copilot has been reading your emails for weeks without your consent - what now?

Do you really think your confidential emails are actually confidential?

Microsoft 365 Copilot Chat logo. | Image by Microsoft

Companies are aggressively integrating AI tools into their products these days, but privacy too often takes a back seat, and that erodes users' trust. The latest example: Microsoft 365 Copilot Chat reportedly compromised users' privacy by reading and summarizing emails marked as confidential.

Privacy? What privacy?


In September last year, Microsoft products like Word, Excel, PowerPoint, Outlook, and OneNote gained access to Copilot Chat, an AI-powered, content-aware chat available to paid Microsoft 365 subscribers. The tool can genuinely be helpful in many situations, but a recent glitch in it has turned into a privacy nightmare.

Microsoft Copilot Chat has reportedly been reading and summarizing emails stored in users' Sent Items and Drafts folders, including emails explicitly marked as confidential. More surprising still, this happened even with DLP (data loss prevention) policies enabled. DLP policies exist precisely to keep confidential data from being processed by AI tools like Copilot.
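If you want to see how much of your own mail is sitting in the affected folders, they are easy to inventory. Below is a minimal sketch that lists messages marked confidential in Drafts and Sent Items via the Microsoft Graph API. It assumes you already have an OAuth access token with the Mail.Read scope (obtained out of band, for example via MSAL); that token handling is our assumption for illustration, not anything from Microsoft's advisory.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
# Assumption: ACCESS_TOKEN holds a valid OAuth token with the Mail.Read
# scope, acquired separately (e.g., via MSAL's device-code flow).
ACCESS_TOKEN = "<your-token-here>"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def confidential_messages(folder: str) -> list[dict]:
    """Return messages marked 'confidential' in a well-known mail
    folder ('drafts' or 'sentitems')."""
    url = f"{GRAPH}/me/mailFolders/{folder}/messages"
    params = {"$select": "subject,sensitivity", "$top": "50"}
    found = []
    while url:
        resp = requests.get(url, headers=HEADERS, params=params)
        resp.raise_for_status()
        data = resp.json()
        # The Outlook 'sensitivity' property is one of: normal,
        # personal, private, confidential.
        found += [m for m in data["value"]
                  if m.get("sensitivity") == "confidential"]
        url = data.get("@odata.nextLink")  # follow pagination, if any
        params = None  # the nextLink already encodes the query
    return found

for folder in ("drafts", "sentitems"):
    hits = confidential_messages(folder)
    print(f"{folder}: {len(hits)} message(s) marked confidential")
```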


The AI assistant's access to confidential emails indicates that it bypassed the DLP policy, violating user privacy and security. The problem was first reported almost a month ago and is being tracked under reference CW1226324.

Microsoft has acknowledged the issue, attributed it to an unspecified code error, and confirmed that its investigation is ongoing. A fix has been developed and is being rolled out to a small set of users, some of whom may be contacted to confirm that it works as intended.

The company has not said how many organizations were impacted by the bug, and there is no confirmation of when the patch will reach all affected customers. For now, the only recourse is for administrators to watch the Microsoft 365 admin center for updates under reference CW1226324 and to closely monitor Copilot for any unwanted access to private and confidential content, as sketched below.
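One way to do that monitoring programmatically is through the unified audit log, where Copilot activity is recorded as CopilotInteraction operations under the Audit.General content type of the Office 365 Management Activity API. The sketch below is a minimal, hypothetical example; it assumes an Azure AD app registration with the ActivityFeed.Read application permission, and that an Audit.General subscription has already been started for the tenant.

```python
import requests

TENANT_ID = "<tenant-guid>"      # assumption: your Azure AD tenant ID
CLIENT_ID = "<app-client-id>"    # assumption: app registration granted
CLIENT_SECRET = "<app-secret>"   # ActivityFeed.Read on the Management APIs
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"

def get_token() -> str:
    """Client-credentials token for the Office 365 Management APIs."""
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "scope": "https://manage.office.com/.default",
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def copilot_events() -> list[dict]:
    """Pull available Audit.General content blobs and keep only
    records for Copilot interactions."""
    headers = {"Authorization": f"Bearer {get_token()}"}
    # List content blobs; Copilot events land in Audit.General.
    blobs = requests.get(
        f"{BASE}/subscriptions/content",
        headers=headers,
        params={"contentType": "Audit.General"},
    )
    blobs.raise_for_status()
    events = []
    for blob in blobs.json():
        records = requests.get(blob["contentUri"], headers=headers)
        records.raise_for_status()
        events += [r for r in records.json()
                   if r.get("Operation") == "CopilotInteraction"]
    return events

for e in copilot_events():
    print(e["CreationTime"], e.get("UserId"), e.get("Operation"))
```

This won't tell you whether a specific summary should have been blocked by DLP, but a spike of CopilotInteraction events tied to mailboxes holding confidential drafts is exactly the kind of signal worth escalating while the fix rolls out.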


AI and privacy are two sides of the same coin



No matter how often AI companies say, "We value your privacy, blah blah blah…", there's often a loophole that gets exploited to feed your data to AI. And when there isn't one, incidents like the one above hand AI access to even your confidential data.

Imagine how devastating it would be if hackers gained unauthorized access to AI tools that know all your personal information and confidential data. In fact, a very similar incident occurred last month, when hackers reportedly attacked Copilot to steal users' data.

As more and more AI tools are integrated into the workplace, it has become more important than ever for companies to implement strong security measures so breaches like this don't happen. On your end, make sure you're not sharing personal information with AI chatbots like Gemini or ChatGPT.

