Hailed as the ultimate productivity companion of the AI era, Microsoft Copilot is a formidable asset for today’s businesses. But, as the adage goes, with great power comes great responsibility.
In the realm of data security, particularly within organizations with limited visibility into their security posture, Copilot and similar next-gen AI tools pose a significant risk. They have the potential to inadvertently divulge sensitive information to unauthorized personnel or even malicious actors.
So, how does Microsoft Copilot operate?
Integrated seamlessly into each of your Microsoft 365 applications—Word, Excel, PowerPoint, Teams, Outlook, and beyond—Copilot serves as an AI assistant. Leveraging a user’s existing Microsoft permissions, Copilot assists users in various tasks, from summarizing meeting notes to locating sales assets and identifying action items, thereby saving invaluable time.
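To make that permission model concrete, here is a minimal sketch that uses the Microsoft Graph API to list what a signed-in user can already reach—the same permission boundary Copilot inherits. This is not Copilot’s own interface, just an approximation of the user’s reach; the token value is a placeholder, and acquiring one (for example, via MSAL with the Files.Read.All delegated scope) is omitted for brevity.

```python
# Sketch: approximate what a user's Copilot can reach by listing what the
# user can already reach. Copilot inherits the signed-in user's permissions.
# Assumes a delegated Microsoft Graph token with Files.Read.All scope.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
token = "<delegated-access-token>"  # placeholder; obtain via MSAL in practice
headers = {"Authorization": f"Bearer {token}"}

# Items other people have shared with the signed-in user -- all of this is
# fair game for Copilot, because Copilot uses the user's existing permissions.
resp = requests.get(f"{GRAPH}/me/drive/sharedWithMe", headers=headers)
resp.raise_for_status()

for item in resp.json().get("value", []):
    shared = item.get("remoteItem", {})
    print(shared.get("name"), "-", shared.get("webUrl"))
```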
However, when organizational permissions are not configured correctly and Copilot is active, the risk of inadvertently exposing sensitive data rises sharply.
Why does this pose a problem?
The issue lies in how much data individuals are granted access to. On their first day of work, the average employee can already reach a staggering 17 million files. Without visibility and control over who can access sensitive data, the potential for damage escalates. Worse, many of those permissions sit unused yet remain high-risk, exposing sensitive data to people who have no business need for it.
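A quick way to surface that kind of overexposure is to audit sharing links directly. The sketch below walks a user’s drive via Microsoft Graph and flags items exposed through organization-wide or anonymous links. Pagination and folder recursion are omitted to keep it short, and the token is again a placeholder.

```python
# Sketch: flag drive items exposed through broad sharing links.
# The permission "link.scope" field comes from Microsoft Graph v1.0.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <access-token>"}  # placeholder token

items = requests.get(f"{GRAPH}/me/drive/root/children", headers=headers)
items.raise_for_status()

for item in items.json().get("value", []):
    perms = requests.get(
        f"{GRAPH}/me/drive/items/{item['id']}/permissions", headers=headers
    )
    perms.raise_for_status()
    for perm in perms.json().get("value", []):
        scope = perm.get("link", {}).get("scope")
        if scope in ("organization", "anonymous"):
            # Anyone in the tenant (or anyone at all) can reach this file --
            # and so can their Copilot prompts.
            print(f"{item['name']}: shared via '{scope}' link")
```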
One research group has vividly demonstrated how seemingly innocuous prompts can expose an organization’s sensitive data through Copilot. The prompts are below.
Prompt #1: A prompt to show new employee data
Employee data often comprises highly sensitive information, including social security numbers, addresses, salary details, and more. Without adequate protection, this data is at risk of falling into the wrong hands, potentially leading to severe consequences.
Prompt #2: A prompt to reveal recent bonuses and awards
Copilot operates without the ability to discern whether users should have access to specific files—it focuses solely on enhancing productivity based on the user’s existing permissions. Consequently, if a user queries about sensitive topics such as bonuses, salaries, or performance reviews, and your organization’s permission settings are not adequately restricted, there’s a risk of unauthorized access to this information.
Prompt #3: A prompt requesting any files with credentials in them
Users can delve deeper into authentication-related questions by asking Copilot to summarize authentication parameters and compile them into a list. The prompter may end up with a table of logins and passwords, potentially spanning various cloud services, thereby escalating the user’s privileges even further.
Prompt #4: A prompt to identify any documents containing API keys or access keys and compile them into a list
Copilot can also draw on data stored in cloud applications linked to your Microsoft 365 ecosystem, where it can readily uncover digital secrets that grant access to other applications and their data.
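Defenders can run this hunt before an attacker’s Copilot prompt does. Below is an illustrative sketch that scans locally synced or exported documents for common credential patterns. The regexes are simplified examples and the export directory is hypothetical; a real deployment would pair a purpose-built secret scanner with your DLP tooling.

```python
# Sketch: scan plain-text documents for credential-like strings so secrets
# can be moved or rotated before Copilot surfaces them for someone else.
import re
from pathlib import Path

SECRET_PATTERNS = {
    "AWS access key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "Password assignment": re.compile(r"(?i)\b(password|passwd|pwd)\s*[:=]\s*\S+"),
    "Bearer token": re.compile(r"(?i)\bbearer\s+[A-Za-z0-9\-._~+/]{20,}"),
}

def scan(root: str) -> None:
    # Extend to .docx/.xlsx with a document parser; .txt keeps the sketch short.
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        for label, pattern in SECRET_PATTERNS.items():
            if pattern.search(text):
                print(f"{path}: possible {label}")

scan("./synced-sharepoint-export")  # hypothetical local export directory
```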
Prompt #5: A prompt to provide details regarding the acquisition of a particular company
Users can ask Copilot about mergers, acquisitions, or particular deals and capitalize on the data supplied. A simple request can yield details such as purchase prices, specific file names, and additional insights.
Prompt #6: A prompt to show all files containing sensitive data
Arguably the most concerning prompt of all is when end users explicitly request files containing sensitive data. When such confidential information resides in locations where it shouldn’t, it becomes readily accessible to all individuals within the company and the next-generation AI tools they utilize.
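One way to preview what such a prompt could surface is to query the same permission-trimmed search index that Copilot draws on. The sketch below uses the Microsoft Graph search API to run, as a given user, the kind of query an end user might type into Copilot. The query string is illustrative and the token is a placeholder delegated token.

```python
# Sketch: run a Copilot-style query against the permission-trimmed Microsoft
# 365 search index to see what a given user could surface.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <delegated-access-token>"}

body = {
    "requests": [
        {
            "entityTypes": ["driveItem"],
            "query": {"queryString": "salary OR bonus OR \"performance review\""},
        }
    ]
}

resp = requests.post(f"{GRAPH}/search/query", headers=headers, json=body)
resp.raise_for_status()

# Every hit below is a file this user -- and therefore this user's Copilot --
# can already open.
for container in resp.json()["value"][0].get("hitsContainers", []):
    for hit in container.get("hits", []):
        res = hit.get("resource", {})
        print(res.get("name"), "-", res.get("webUrl"))
```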
How can you prevent Copilot prompt-hacking?
To safeguard against Copilot prompt-hacking, it’s crucial to establish robust data security measures before enabling Copilot. Even once safeguards are in place, it’s essential to keep your organization’s exposure from growing and to ensure data continues to be used safely.
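As one concrete remediation step, risky sharing links found during an audit can be revoked before Copilot goes live. The sketch below builds on the audit sketch above (same placeholder token) and deletes anonymous “anyone with the link” permissions via Microsoft Graph; it defaults to a dry run, since removing links is disruptive and should be reviewed first.

```python
# Sketch: revoke anonymous sharing links on a drive item before enabling
# Copilot. Defaults to report-only mode; review output before flipping DRY_RUN.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <access-token>"}  # placeholder token
DRY_RUN = True  # flip to False only after reviewing the report

def revoke_anonymous_links(item_id: str) -> None:
    perms = requests.get(
        f"{GRAPH}/me/drive/items/{item_id}/permissions", headers=headers
    )
    perms.raise_for_status()
    for perm in perms.json().get("value", []):
        if perm.get("link", {}).get("scope") == "anonymous":
            if DRY_RUN:
                print(f"Would revoke anonymous link {perm['id']} on {item_id}")
            else:
                requests.delete(
                    f"{GRAPH}/me/drive/items/{item_id}/permissions/{perm['id']}",
                    headers=headers,
                ).raise_for_status()
```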
Take proactive steps to mitigate risks without compromising productivity. Planning a secure rollout of Microsoft Copilot in your organization? Consult a competent data security specialist. You can start by visiting the Azure Marketplace.