The paid tier connects to everything your firm has ever stored. That is its power — and its governance challenge. Here is the complete control framework before you deploy a single licensed seat.
April 1, 2026

The first two parts of this series covered the licensing landscape and the controls for Copilot Chat Basic. This post is about what happens when you deploy M365 Copilot Premium — the paid tier at $30 per user per month (or $21 for firms under 300 users). The governance stakes are categorically higher. Where Copilot Chat Basic is web-grounded and sees only what a user manually provides, M365 Copilot Premium connects to your entire Microsoft 365 tenant via the Microsoft Graph. It can reason over every email, every Teams conversation, every document in SharePoint and OneDrive that the licensed user has permission to access. That sentence should stop any CIO in their tracks before sign-off.
This is not an argument against deploying M365 Copilot Premium. The productivity case is real and the data protection framework is robust — but only if it is configured. The default state of most law firm Microsoft 365 tenants, built over years of organic SharePoint growth and loose sharing settings, is not ready for an AI that can traverse the entire corpus simultaneously. This guide walks through every control layer you need to address before a licensed seat goes live.
The core governance risk
M365 Copilot Premium respects existing permissions — it does not create new ones. But that is precisely the problem. Industry assessments consistently find that 60–80% of enterprise SharePoint sites have at least one oversharing vulnerability. In a law firm context, that means M365 Copilot Premium could surface Client A’s documents in a response while an attorney is working on Client B’s matter — not because Copilot is insecure, but because the permissions were configured before AI could search across everything simultaneously.
When a licensed user submits a prompt in M365 Copilot Premium, the request is sent to the Microsoft Graph — the API layer that indexes the user’s Microsoft 365 data including SharePoint files, Exchange email, Teams conversations, OneNote notebooks, and calendar events. The Graph returns results that the licensed user has at least view permission to access. Copilot then synthesizes those results into a response.
Three things are critical to understand about Microsoft Graph data access in this context:
What the Microsoft Graph data access boundary means for your DPA
First, all processing occurs within the Microsoft 365 service boundary. Second, prompts and responses are not used to train foundation LLMs. Third, Microsoft acts as a data processor under your commercial agreement. As of January 7, 2026, Anthropic is a Microsoft subprocessor for Copilot — meaning Claude-powered interactions within Copilot operate under the same contractual data protection framework as OpenAI-powered interactions. Your existing Microsoft 365 Data Protection Addendum covers both.
The most important pre-deployment step for any law firm is a comprehensive audit of SharePoint Copilot permissions — specifically identifying sites where content is accessible to more people than it should be. This is not a new governance problem, but M365 Copilot Premium makes its consequences immediate and visible.
Microsoft provides built-in oversharing visibility through SharePoint Advanced Management (SAM), included with M365 Copilot licenses. Start here:
SharePoint Admin Center → Reports → Data access governance → Sites shared with Everyone or EEEU
Run two specific reports: the Site Permissions for the Organization report (baseline of who has permissions across all sites) and the Sharing Links report (identifies sites with “Anyone” or organization-wide links that bypass membership controls). Export both and sort by permission breadth before you license a single Premium seat.
The “Everyone Except External Users” trap
The single most common oversharing pattern in law firm tenants is sites or document libraries granted to the Everyone Except External Users (EEEU) group. This grants access to every person in the organization. When M365 Copilot Premium traverses the Graph for a licensed user, every EEEU-accessible document becomes part of their AI-accessible corpus — including documents from other practice groups, other client matters, and administrative functions the attorney has no business reading. Identify and remediate every EEEU grant before deploying Premium licenses.
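As a rough triage aid, the exported Data Access Governance report can be filtered in a few lines. This is an illustrative sketch, not a Microsoft tool: the column names below ("Site URL", "Granted To") are hypothetical, so match them to the actual headers in your tenant's export before using it.

```python
import csv
import io

# Org-wide grants that make a site's content part of every licensed
# user's AI-accessible corpus. "Everyone except external users" is EEEU.
BROAD_GRANTS = {"Everyone", "Everyone except external users"}

def flag_overshared(csv_text):
    """Return site URLs from a DAG export that carry an org-wide grant."""
    flagged = set()
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row.get("Granted To", "").strip() in BROAD_GRANTS:
            flagged.add(row.get("Site URL", ""))
    return sorted(flagged)

# Hypothetical two-row export: one EEEU-accessible site, one scoped site.
sample = """Site URL,Granted To,Permission
https://firm.sharepoint.com/sites/hr,Everyone except external users,Read
https://firm.sharepoint.com/sites/matter-7823,M&A Team 7823,Edit
"""
```

Sites the script flags are the ones to remediate first; everything they contain is already in scope for every Premium seat you license.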
Once overshared sites are identified, three mechanisms are available to restrict SharePoint Copilot permissions without disrupting existing workflows.

Restricted content discoverability (RCD) hides a site's content from Copilot and organization-wide search without changing who can open the site directly:
SharePoint Admin Center → Active sites → [Site] → Settings → Restricted content discoverability

Restricted access control (RAC) limits access to a site to members of a designated security group, overriding broader permission grants:
SharePoint Admin Center → Active sites → [Site] → Settings → Restricted access control

Restricted SharePoint Search (RSS) is a tenant-wide fallback that limits organization-wide search and Copilot to a curated allowed list of sites:
SharePoint Admin Center → Settings → Restricted SharePoint Search
Oversharing accumulates over time because of ownerless sites — SharePoint sites where the original creator has left the firm and no one has maintained permissions since. Before deploying M365 Copilot Premium, run a Site Ownership Policy in SAM to identify any site without at least two active owners:
SharePoint Admin Center → Policies → Site lifecycle management → Site ownership policy → Run in Simulation mode
Run in simulation mode first to identify scope, then switch to active mode to notify ownership candidates. Ownerless sites that have been inactive for 12+ months should be archived or deleted — stale content degrades Copilot response quality as well as creating compliance risk.
Copilot sensitivity labels from Microsoft Purview are the deepest and most precise governance control available for M365 Copilot Premium. They work at the file level, travel with the document regardless of where it is stored, and directly govern whether Copilot can extract content from a document — even for a user who has view permissions.
When a sensitivity label applies encryption via Microsoft Purview Information Protection, Copilot checks two specific usage rights before processing the document: VIEW and EXTRACT. A user must have both VIEW and EXTRACT rights for Copilot to interact with encrypted content. If the EXTRACT right is absent, Copilot cannot read or summarize the document — even if the user can open it manually in Word.
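The rule can be stated compactly. The sketch below is an illustration of the decision, not a Purview API; only the right names VIEW and EXTRACT come from the real usage-rights model described above.

```python
def copilot_can_process(usage_rights):
    """Copilot needs BOTH the VIEW and EXTRACT usage rights to reason
    over an encrypted document retrieved via the Graph."""
    return {"VIEW", "EXTRACT"} <= set(usage_rights)

# A user whose label grants VIEW (and even EDIT) but not EXTRACT can
# still open the file in Word, yet Copilot cannot read or summarize it:
assert copilot_can_process({"VIEW", "EDIT"}) is False
assert copilot_can_process({"VIEW", "EXTRACT", "EDIT"}) is True
```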
This is the mechanism that gives you matter-level Copilot control through labeling. The built-in permission roles and their Copilot implications are:
| Permission role | Includes EXTRACT? | Can Copilot process the content? |
|---|---|---|
| Viewer | No | No — the user can read the file, but Copilot cannot summarize or reference it |
| Reviewer | No | No — the user can edit and reply, but Copilot still cannot extract content |
| Co-Author | Yes | Yes |
| Co-Owner | Yes | Yes |
One of the most important behaviors of Copilot sensitivity labels is inheritance. When M365 Copilot Premium generates new content — a Word draft, a PowerPoint presentation, a Copilot Page — it inherits the highest-priority sensitivity label from the source documents used to create it. If an attorney uses Copilot in Word to draft a letter based on a file labeled “Client Confidential,” the resulting draft document is automatically labeled “Client Confidential” with all of that label’s protection settings applied.
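The inheritance rule amounts to taking a maximum over label priorities. The label names and priority numbers below are illustrative placeholders for your firm's published labels; in Purview, a higher order number corresponds to a more restrictive label.

```python
# Illustrative label ordering; substitute your firm's actual labels.
LABEL_PRIORITY = {"Public": 0, "Internal": 1, "Client Confidential": 2}

def inherited_label(source_labels):
    """Return the highest-priority label among the source documents,
    which is the label Copilot-generated content inherits."""
    if not source_labels:
        return None
    return max(source_labels, key=LABEL_PRIORITY.get)

# A draft grounded on one "Client Confidential" source inherits that
# label even if the other sources are less restrictive:
sources = ["Internal", "Client Confidential", "Public"]
```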
For inheritance to work correctly, sensitivity labels must be enabled for Office files in SharePoint and OneDrive:
Microsoft Purview portal → Information Protection → Sensitivity labels → Turn on sensitivity labels for Office files in SharePoint and OneDrive.
For a law firm deploying M365 Copilot Premium across an established SharePoint environment, manual labeling of existing documents is not viable at scale. Microsoft Purview auto-labeling policies use trainable classifiers and sensitive information types to detect and label content automatically:
Microsoft Purview portal → Information Protection → Auto-labeling policies → Create policy
Law firm-relevant sensitive information types to configure as auto-labeling triggers include: Social Security Numbers, Client Matter Numbers (custom SIT), attorney-client privilege markers, financial account numbers, and any firm-specific document headers or footers that indicate confidential status. Run policies in simulation mode first to assess volume and accuracy before switching to enforcement.
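At its core, a custom SIT is a regular expression plus supporting evidence and confidence levels. The matter-number format below (five-digit client number, four-digit matter number) is purely hypothetical; substitute your firm's actual numbering scheme before building the SIT in Purview.

```python
import re

# Hypothetical client-matter-number format, e.g. 10482-0031.
MATTER_NUMBER = re.compile(r"\b\d{5}-\d{4}\b")

def find_matter_numbers(text):
    """Preview which strings the custom SIT's regex would match,
    useful for tuning the pattern before simulation mode."""
    return MATTER_NUMBER.findall(text)

sample = "Re: engagement letter for 10482-0031 (see also 10482-0044)."
```

Testing the pattern offline against a sample of real documents helps estimate false-positive volume before you run the auto-labeling policy in simulation mode.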
Important: “data in use” exception
The EXTRACT right controls Copilot access to documents at rest. If a document is already open in Word or another Office app on the user’s device, Copilot can interact with it as “data in use” regardless of the EXTRACT right on the label. This is by design. The practical implication: EXTRACT-blocked labels prevent Copilot from retrieving documents from SharePoint in the background during a Graph-grounded prompt, but do not prevent a user from manually opening a document and asking Copilot to help them with it in the active app.
Deploying M365 Copilot Premium without configuring Microsoft Purview Copilot audit capabilities is the equivalent of installing email without retention policies. Every prompt an attorney sends, every document Copilot references, and every response generated is subject to your firm’s eDiscovery obligations, legal hold requirements, and bar association data handling rules.
Understanding what each layer captures — and what it does not — prevents gaps in your compliance posture:
| Purview tool | What it captures | What it does not capture | Admin path |
|---|---|---|---|
| Unified Audit Log | Metadata: who used Copilot, which app, when, which files were referenced, sensitivity labels on those files | Actual prompt text or response text — metadata only | Purview portal → Audit Operation: CopilotInteraction |
| eDiscovery (Content Search / Premium) | Full prompt and response text, stored in user mailbox hidden folder alongside Teams chat data | Admin config changes to Copilot settings; device identity information | Purview portal → eDiscovery Search type: Copilot interactions |
| Communication Compliance | Policy-based monitoring of prompt and response content — can flag sensitive data disclosure, inappropriate content, or policy violations | Does not apply DLP-style blocking; detection and review only | Purview portal → Communication compliance |
The Unified Audit Log is the first visibility tool most IT teams will use. Navigate to:
Microsoft Purview portal → Audit → Search tab → Activities – operation names → Enter: CopilotInteraction
Filter by user, date range, and optionally by the app where the interaction occurred (Word, Excel, Teams, etc.). The results show which files were referenced during each interaction and whether those files carried sensitivity labels. This data is your operational baseline for answering the question: “what is Copilot actually being used for, and what data is it touching?”
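Exported audit records can be summarized offline to build that baseline. This is a sketch over a plain list of dicts; the field names ("Operation", "AppHost") reflect the CopilotInteraction record schema, but verify them against your own export before relying on the output.

```python
from collections import Counter

def summarize_by_app(records):
    """Count CopilotInteraction audit records per host app."""
    copilot = [r for r in records if r.get("Operation") == "CopilotInteraction"]
    return Counter(r.get("AppHost", "Unknown") for r in copilot)

# Hypothetical export: three Copilot interactions plus one unrelated event.
records = [
    {"Operation": "CopilotInteraction", "AppHost": "Word"},
    {"Operation": "CopilotInteraction", "AppHost": "Teams"},
    {"Operation": "CopilotInteraction", "AppHost": "Word"},
    {"Operation": "FileAccessed", "AppHost": "Word"},
]
```

The same pattern extends to counting interactions per user or per referenced sensitivity label, which is usually the first question compliance asks.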
The Audit Log records that an interaction occurred but not its content. For full text — prompts and responses — you need eDiscovery. Copilot interactions are stored in a hidden folder in the user’s Exchange mailbox, indexed identically to Teams chat messages. To retrieve them:
Purview portal → eDiscovery → Content Search → Add condition: Type = Copilot interactions
Scope the search to specific user mailboxes and date ranges. For legal hold scenarios, add the user's mailbox to an eDiscovery hold before the search to preserve interaction data in place. Note that an eDiscovery Premium case is required if you need to permanently delete specific Copilot interactions, for example in response to a subject access request under GDPR or a court order.
Attorney-client privilege and Copilot prompts
Copilot prompts can reveal attorney mental impressions and case strategy — content that would ordinarily be protected as work product under Federal Rule of Civil Procedure 26(b)(3). Because prompts are stored in the user's mailbox and are discoverable via eDiscovery, your firm should address this in its AI usage policy: when must attorneys sanitize prompts before submission? How should Copilot-generated draft materials be documented for privilege logs? These are policy questions, not technical ones — but the technical infrastructure for discovery is already active the moment you deploy a Premium license.
For firms deploying M365 Copilot Premium at scale, DSPM for AI in Microsoft Purview provides a consolidated dashboard for AI-specific security posture. It surfaces policy recommendations tailored to your tenant’s current label coverage, identifies risky AI usage patterns, and provides a single view of sensitivity label inheritance across Copilot-generated content:
Microsoft Purview portal → Data Security Posture Management → AI overview
Run the DSPM assessment before your first Premium deployment and revisit it quarterly. It will surface the specific oversharing risks and labeling gaps most relevant to your tenant — more actionable than a generic checklist.
For law firms with ethical wall requirements — matters where attorneys on one side of a transaction cannot access information held by the firm on the other side — Microsoft Purview Information Barriers extend those walls into M365 Copilot Premium’s Graph-grounded responses.
Information Barriers (IB) create policy-based segments that prevent communication and data access between defined groups. When IB policies are active, a user in Segment A cannot have Segment B’s content surfaced by Copilot, even if they have technically been granted SharePoint permissions to that content. This is the technical enforcement mechanism for ethical walls that previously relied on human compliance.
Microsoft Purview portal → Information barriers → Policies → Create policy
Information Barriers require Microsoft 365 E5 Compliance or equivalent. They are configured at the Entra security group level — you define segments (e.g., “M&A Team — Matter 7823”) and then create IB policies that block or allow interactions between segments. The policies apply across Teams, SharePoint, OneDrive, Exchange — and by extension, across all Copilot Graph-grounded responses.
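The enforcement logic can be sketched as a symmetric block between segments: a block policy suppresses cross-segment results even when SharePoint permissions would otherwise allow them. The segment names and the single policy below are illustrative only.

```python
# One hypothetical block policy between the two sides of a matter.
BLOCK_POLICIES = {frozenset({"MA-BuySide-7823", "MA-SellSide-7823"})}

def copilot_may_surface(user_segment, content_segment, has_permission):
    """SharePoint permission is necessary but not sufficient: an IB
    block between the two segments suppresses the result regardless."""
    if not has_permission:
        return False
    return frozenset({user_segment, content_segment}) not in BLOCK_POLICIES
```

This is the key property for ethical walls: a stale permission grant left over from before the conflict arose no longer leaks content through Copilot once the IB policy is active.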
Recommended configuration for law firms
Start with IB policies for your highest-conflict-risk matter types: M&A transactions, bankruptcy representations with multiple creditor classes, and government investigations where access to information needs to be demonstrably segmented. Do not attempt to implement IB policies for every matter simultaneously — the initial configuration overhead is significant. A phased approach by matter type, starting with active conflicts, is more sustainable than a comprehensive rollout that stalls before completion.
The controls described in this guide should be implemented in sequence, not all at once: first the SharePoint oversharing audit and remediation (DAG reports, EEEU cleanup, ownerless sites), then sensitivity labels and auto-labeling, then audit and eDiscovery configuration, and finally Information Barriers for the highest-conflict-risk matters. License deployment should track that governance readiness phase by phase.
The controls in this guide are not theoretical precautions. They are the difference between deploying an AI that surfaces the right documents to the right attorneys, and deploying one that collapses years of carefully maintained matter confidentiality in a single prompt response. The Microsoft Graph is powerful precisely because it knows everything a user can access — and M365 Copilot Premium uses that power in every interaction.
The good news is that every control described here — SharePoint Copilot permissions through DAG reports and RAC, Copilot sensitivity labels with EXTRACT right management, Microsoft Purview Copilot audit and eDiscovery, and Information Barriers for ethical walls — is available within your existing Microsoft 365 licensing if you are on E3 or E5. The investment is not in new tools. It is in configuring the tools you already have before the AI starts traversing the corpus.
That expertise is where Cocha Technology can help: we configure these "Fortress" settings correctly so your firm can scale with AI without compromising confidentiality. Contact our advisory team to get started.