Microsoft Copilot Reads Your Office Documents.
What That Means for GDPR.
Microsoft 365 Copilot is deployed in an estimated 70% of Fortune 500 companies. When a user invokes Copilot in Word, Excel, or Outlook, the entire document or email thread is sent as context to Microsoft's Azure OpenAI Service. If that document contains personal data — client names, medical records, financial details, HR files — GDPR's data processor obligations apply. Here's what legal and development teams need to know.
How Copilot processes your document data
When you ask Copilot to summarise a Word document, draft a reply in Outlook, or generate a chart from Excel data, the AI assistant receives the relevant document content as part of its prompt context. This context is sent to Azure OpenAI Service — Microsoft's hosted large language model infrastructure — for processing.
Microsoft's data processing documentation confirms that document content used with Copilot is processed by Azure OpenAI Service under the Microsoft Products and Services Data Protection Addendum (DPA). This means Copilot is acting as a data processor under GDPR Art. 4(8) — an entity that processes personal data on behalf of the data controller (your organisation).
"Processing by a processor shall be governed by a contract or other legal act under Union or Member State law... The contract shall stipulate... that the processor processes the personal data only on documented instructions from the controller."
For most organisations with active Microsoft 365 enterprise agreements, the DPA is already in place — it's part of the standard licensing terms. But the existence of a DPA doesn't automatically make Copilot usage GDPR-compliant. The controller (your organisation) remains responsible for ensuring:
- A lawful basis exists for the processing (GDPR Art. 6)
- Special category data (health, biometric, political opinion — Art. 9) is handled with additional safeguards
- Data minimisation applies — only necessary data reaches Copilot (Art. 5(1)(c))
- International transfer safeguards are in place if Azure processing occurs outside the EEA (Art. 44)
Where Copilot creates compliance risk: five scenarios
1. HR files in Word
An HR manager uses Copilot to draft an employee review, summarise disciplinary proceedings, or generate a report from a spreadsheet containing salary data. The document contains names, national identification numbers, health information, or union membership status — all special category data under GDPR Art. 9.
Special category data processing requires explicit consent or one of the Art. 9(2) exceptions. Using Copilot to process it requires that the DPA explicitly covers special category data handling — which standard Microsoft DPAs do, but which many organisations have not specifically reviewed for Copilot workflows.
2. Client contracts in Word
A legal team uses Copilot to redline a client contract, extract key terms, or generate a summary. The contract contains the client's personal data: name, address, email, potentially financial information. Forwarding this to Azure OpenAI for processing constitutes data processing on behalf of the controller — with the client as data subject.
If your privacy policy does not disclose that client data may be processed by AI services, and your client contracts don't contemplate sub-processing, this creates a transparency violation under Art. 13/14.
3. Patient records in Excel
A healthcare organisation uses Copilot to analyse patient outcome data in Excel, generate compliance reports, or create charts for clinical presentations. Health data is special category under Art. 9. Processing it with Copilot requires both the standard DPA and compliance with applicable health data processing laws (e.g., UK DSPT, German BDSG).
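The mitigation principle for this scenario can be illustrated with a short sketch: pseudonymize the identifying columns of tabular patient data before any AI-assisted analysis, keeping outcome metrics intact. This is an illustrative example (not the Add-in's implementation); the column names and placeholder format are assumptions for the demo.

```python
# Illustrative sketch (not the Add-in's implementation): pseudonymize
# identifying columns in tabular patient data before AI-assisted analysis.
# Outcome metrics stay intact, so aggregate analysis is unaffected.

def pseudonymize_rows(rows, id_columns):
    """Replace values in id_columns with stable placeholders.

    Returns the pseudonymized rows plus the mapping needed to
    re-identify results afterwards (store this mapping securely).
    """
    mapping = {}   # placeholder -> original value
    seen = {}      # (column, original value) -> placeholder
    out = []
    for row in rows:
        clean = dict(row)
        for col in id_columns:
            original = clean.get(col)
            if original is None:
                continue
            key = (col, original)
            if key not in seen:
                placeholder = f"[{col.upper()}_{len(seen) + 1}]"
                seen[key] = placeholder
                mapping[placeholder] = original
            clean[col] = seen[key]
        out.append(clean)
    return out, mapping

rows = [
    {"patient_name": "Anna Schmidt", "outcome_score": 82},
    {"patient_name": "Jon Doe", "outcome_score": 74},
    {"patient_name": "Anna Schmidt", "outcome_score": 88},
]
clean_rows, mapping = pseudonymize_rows(rows, ["patient_name"])
# The same patient gets the same placeholder, so cohort-level
# analysis still works on the pseudonymized rows.
```

Because placeholders are stable per value, Copilot-generated charts or summaries can still group repeat patients correctly, and results can be re-identified afterwards via the mapping.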
4. Financial data across jurisdictions
A finance team uses Copilot to process spreadsheets containing account numbers, transaction histories, or tax identifiers. Azure OpenAI Service may process this data in US-based data centres depending on the tenant configuration. Under GDPR Chapter V, international transfers require either an adequacy decision or standard contractual clauses (SCCs).
Microsoft's DPA includes SCCs for international transfers, but organisations must verify that their specific Azure region configuration keeps EU data within the EEA — or that the applicable SCC framework covers their use case.
5. Email threads in Outlook Copilot
Copilot in Outlook can summarise email threads, draft replies, and analyse correspondence. Email threads often contain unstructured PII — names, contact details, negotiation terms, health information shared informally. This PII enters Copilot's context without any structured data handling review.
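A quick sketch shows why unstructured channels deserve their own review: regex patterns can flag structured identifiers like email addresses and phone numbers, but free-text names and informal health mentions slip through and need NER-based detection. The thread text below is invented for illustration.

```python
import re

# Illustrative sketch: scan an email thread for *structured* PII.
# Regexes catch emails and phone-like strings; free-text names and
# informal health mentions need NER-based detection.

PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s()-]{7,}\d"),
}

def scan_for_pii(text):
    """Return a list of (kind, match) pairs found in the text."""
    hits = []
    for kind, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((kind, match))
    return hits

thread = (
    "Hi team, Maria mentioned her surgery is on Friday. "
    "Reach her at maria.k@example.com or +49 30 1234567."
)
hits = scan_for_pii(thread)
# Catches the email address and phone number; "Maria" and the
# surgery reference slip through, showing the limits of patterns.
```

This gap is why a structured data handling review for Outlook Copilot cannot rely on pattern matching alone.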
The core compliance problem: Most organisations deployed Microsoft 365 Copilot without conducting a Data Protection Impact Assessment (DPIA) specific to AI-assisted document processing. A DPIA is mandatory under GDPR Art. 35 when processing "uses new technologies" and is "likely to result in a high risk" — conditions that enterprise AI clearly meets.
Copilot's built-in privacy controls: what they cover (and don't)
Microsoft provides several data governance controls for Copilot:
- Zero Data Retention (ZDR): Prompts and responses not retained beyond the session. Available for enterprise tenants with commercial data protection commitments.
- Sensitivity labels: Microsoft Information Protection labels can restrict which documents Copilot is allowed to access.
- Restricted SharePoint access: Copilot respects existing SharePoint permissions — it cannot access documents the user cannot access.
- EU Data Boundary: Available for some Microsoft 365 services — keeps processing and storage within the EU for qualifying tenants.
These controls address storage and access scope. They do not address the content layer: even with ZDR enabled, the document content (including all PII) is still sent to Azure OpenAI for processing during the session. The question is not whether Copilot retains your data — it's whether sending PII-rich documents to a cloud AI for processing is appropriate for your data classification and legal basis.
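One way teams operationalise this is to translate sensitivity labels into an explicit Copilot policy. The sketch below uses hypothetical label names and is plain application logic, not a Microsoft API; the point is that the content layer is handled by requiring anonymization, not by retention settings alone.

```python
# Illustrative policy sketch (hypothetical label names, not a
# Microsoft API): map sensitivity labels to a Copilot decision.

POLICY = {
    "Public": "allow",
    "Internal": "allow",
    "Confidential": "anonymize-first",   # replace PII before AI use
    "Highly Confidential": "block",      # never send to cloud AI
}

def copilot_decision(label):
    """Return the action for a document with the given label.

    Unknown or missing labels fail closed: unlabelled documents are
    the ones most likely to contain unreviewed PII.
    """
    return POLICY.get(label, "block")
```

Failing closed on unlabelled documents is the key design choice here: labelling gaps are where ungoverned PII usually hides.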
The data minimisation approach: anonymize before prompting
The most direct way to bring Copilot usage into GDPR compliance — without restricting its utility — is to apply data minimisation at the document level before invoking Copilot. This is exactly what the anonymize.dev Office Add-in is designed for.
Without anonymization:
- Document sent to Azure OpenAI with real names, IDs, and sensitive data
- GDPR Art. 5(1)(c) data minimisation not satisfied
- DPIA required — often skipped
- International transfer risk if non-EEA processing
- Special category data (health, HR) reaches cloud AI

With document-level anonymization:
- PII replaced with structured placeholders before Copilot invocation
- Art. 5(1)(c) data minimisation satisfied structurally
- DPIA scope reduced — no identifiable data in AI processing
- No international transfer risk — no personal data sent
- Copilot works on anonymized text, full output quality retained
The workflow is: (1) open your document, (2) use the Add-in to anonymize PII — names become [PERSON_1], emails become [EMAIL_1], account numbers become [IBAN_1], (3) invoke Copilot on the anonymized document, (4) restore real data with one click using the Add-in's encrypted mapping store.
Copilot operates with full context and generates equally useful output — it just processes clean, anonymized text rather than identifiable personal data. The only thing Copilot doesn't see is the personal data it didn't need to see in the first place.
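The anonymize-then-restore round trip can be sketched in a few lines. This is illustrative only: the actual Add-in uses proper PII detection and an encrypted mapping store, whereas here the caller supplies the entity spans directly.

```python
# Minimal sketch of the anonymize-then-restore round trip
# (illustrative only; the real Add-in detects PII automatically
# and encrypts the mapping). The caller supplies the entities.

def anonymize(text, entities):
    """Replace each (kind, value) entity with a numbered placeholder."""
    mapping = {}
    counters = {}
    for kind, value in entities:
        counters[kind] = counters.get(kind, 0) + 1
        placeholder = f"[{kind}_{counters[kind]}]"
        mapping[placeholder] = value
        text = text.replace(value, placeholder)
    return text, mapping

def restore(text, mapping):
    """Reinsert the original values into Copilot's output."""
    for placeholder, value in mapping.items():
        text = text.replace(placeholder, value)
    return text

doc = "Invoice for Anna Schmidt, IBAN DE89370400440532013000, anna@example.com"
entities = [
    ("PERSON", "Anna Schmidt"),
    ("IBAN", "DE89370400440532013000"),
    ("EMAIL", "anna@example.com"),
]
safe, mapping = anonymize(doc, entities)
# safe == "Invoice for [PERSON_1], IBAN [IBAN_1], [EMAIL_1]"
restored = restore(safe, mapping)
# restored == doc: the round trip is lossless
```

Because placeholders preserve entity type and ordering, Copilot's output remains coherent and the restore step maps every placeholder back deterministically.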
What this means for your DPIA
If your organisation has not conducted a DPIA for Microsoft 365 Copilot usage, the following processing activities require review:
- Document types in scope: Inventory which document categories are routinely opened while Copilot is active. HR files, client contracts, and medical records require different DPIA treatment than internal process documentation.
- Data subjects affected: Map which categories of data subjects' data reach Copilot — employees, clients, patients, prospective customers. Each category may have different lawful bases and transparency obligations.
- Sub-processor chain: Document that Azure OpenAI Service is a sub-processor under your Microsoft DPA, and that sub-processor notification obligations to data subjects are met (or covered by the processing notice).
- Residual risk treatment: For high-risk processing categories (Art. 9 special category data, large-scale profiling), document the technical controls applied — including document-level anonymization — as mitigating measures that reduce residual risk to acceptable levels.
EU AI Act intersection: From August 2, 2026, AI systems used in HR decision-making, credit assessment, healthcare, or law enforcement are classified as high-risk under Annex III of the EU AI Act. Microsoft 365 Copilot used in these contexts will require conformity assessments and technical documentation, including evidence of data quality controls. Document-level anonymization is a concrete technical control that can be cited in conformity documentation.
Practical steps for GDPR-compliant Copilot usage
- Conduct a Copilot-specific DPIA — using Microsoft's AI-readiness assessment templates as a starting point, extended to cover your specific document categories and data subjects.
- Enable EU Data Boundary for tenants with EEA data subjects, if available under your licensing tier.
- Review sensitivity labels — ensure documents containing Art. 9 special category data have labels that either restrict Copilot access or flag for mandatory anonymization before AI use.
- Deploy the anonymize.dev Office Add-in for team members who regularly work with high-sensitivity documents — HR, legal, finance, healthcare. Train them on the anonymize-then-prompt workflow.
- Update your privacy notice to disclose AI-assisted document processing as a processing activity, name Azure OpenAI Service (Microsoft) as a sub-processor, and describe the controls applied.
- Train staff on which document types require anonymization before Copilot use. Classification-based policies ("never use Copilot on documents labelled Confidential without anonymizing first") are more practical than blanket restrictions.
Sources
GDPR — Regulation (EU) 2016/679. Art. 5 (data minimisation), Art. 6 (lawful basis), Art. 9 (special categories), Art. 13/14 (transparency), Art. 25 (privacy by design), Art. 28 (processors), Art. 32 (security), Art. 35 (DPIA), Art. 44 (international transfers). gdpr-info.eu
Microsoft Products and Services Data Protection Addendum (DPA) — Microsoft Corporation. Available at microsoft.com
EU AI Act — Regulation (EU) 2024/1689. Annex III high-risk AI systems. eur-lex.europa.eu
88% of organisations use AI tools — McKinsey Global AI Survey 2024. McKinsey & Company.
Microsoft EU Data Boundary — Microsoft documentation. microsoft.com
GDPR-compliant Copilot usage starts with the Add-in
Anonymize PII in your Office documents before Copilot processes them. Full output quality. No personal data in Azure OpenAI context.