M365 Copilot Premium: Access Control and Data Protection for Law Firms

The paid tier connects to everything your firm has ever stored. That is its power — and its governance challenge. Here is the complete control framework before you deploy a single licensed seat.

Modern graphic showing legal professionals in a Houston conference room using "M365 Copilot Premium." A large display highlights "Access Control" with MFA and permissions, and "Data Protection" with encryption and DLP, featuring the Cocha Technology logo.

The first two parts of this series covered the licensing landscape and the controls for Copilot Chat Basic. This post is about what happens when you deploy M365 Copilot Premium — the paid tier at $30 per user per month (or $21 for firms under 300 users). The governance stakes are categorically higher. Where Copilot Chat Basic is web-grounded and sees only what a user manually provides, M365 Copilot Premium connects to your entire Microsoft 365 tenant via the Microsoft Graph. It can reason over every email, every Teams conversation, and every document in SharePoint and OneDrive that the licensed user has permission to access. That sentence should stop any CIO in their tracks before sign-off.

This is not an argument against deploying M365 Copilot Premium. The productivity case is real and the data protection framework is robust — but only if it is configured. The default state of most law firm Microsoft 365 tenants, built over years of organic SharePoint growth and loose sharing settings, is not ready for an AI that can traverse the entire corpus simultaneously. This guide walks through every control layer you need to address before a licensed seat goes live.

The core governance risk

M365 Copilot Premium respects existing permissions — it does not create new ones. But that is precisely the problem. Industry assessments consistently find that 60–80% of enterprise SharePoint sites have at least one oversharing vulnerability. In a law firm context, that means M365 Copilot Premium could surface Client A’s documents in a response while an attorney is working on Client B’s matter — not because Copilot is insecure, but because the permissions were configured before AI could search across everything simultaneously.

How Microsoft Graph data access works in M365 Copilot Premium

When a licensed user submits a prompt in M365 Copilot Premium, the request is sent to the Microsoft Graph — the API layer that indexes the user’s Microsoft 365 data including SharePoint files, Exchange email, Teams conversations, OneNote notebooks, and calendar events. The Graph returns results that the licensed user has at least view permission to access. Copilot then synthesizes those results into a response.
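The permission-trimming behavior described above can be modeled in a few lines. This is a hedged sketch, not the real Graph API: the corpus, the `viewers` field, and the `graph_search` function are all hypothetical, and the only point is that results are limited to documents the requesting user could already open.

```python
# Minimal model of permission-trimmed retrieval (hypothetical data shapes,
# not the actual Microsoft Graph API).

def graph_search(query: str, user: str, documents: list[dict]) -> list[dict]:
    """Return only matching documents the user holds at least VIEW access to."""
    return [
        doc for doc in documents
        if user in doc["viewers"] and query.lower() in doc["text"].lower()
    ]

corpus = [
    {"name": "matter-7823-term-sheet.docx", "viewers": {"alice"},
     "text": "Acquisition term sheet"},
    {"name": "hr-salaries.xlsx", "viewers": {"alice", "bob"},
     "text": "Acquisition bonus schedule"},
]

# Bob's prompt surfaces only content he could already open manually.
results = graph_search("acquisition", "bob", corpus)
```

The model also shows why oversharing matters: widen `viewers` on a document and it silently enters every licensed user's AI-accessible corpus.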

Three things are critical to understand about Microsoft Graph data access in this context:

  • All processing occurs within the Microsoft 365 service boundary.
  • Prompts and responses are not used to train foundation LLMs.
  • Microsoft acts as a data processor under your commercial agreement.

What the Microsoft Graph data access boundary means for your DPA

As of January 7, 2026, Anthropic is a Microsoft subprocessor for Copilot — meaning Claude-powered interactions within Copilot operate under the same contractual data protection framework as OpenAI-powered interactions. Your existing Microsoft 365 Data Protection Addendum covers both.

SharePoint Copilot permissions: the oversharing audit you must run first

The most important pre-deployment step for any law firm is a comprehensive audit of SharePoint Copilot permissions — specifically identifying sites where content is accessible to more people than it should be. This is not a new governance problem, but M365 Copilot Premium makes its consequences immediate and visible.

Step 1 — Run Data Access Governance reports in SharePoint Admin Center

Microsoft provides built-in oversharing visibility through SharePoint Advanced Management (SAM), included with M365 Copilot licenses. Start here:

 

SharePoint Admin Center → Reports → Data access governance → Sites shared with Everyone or EEEU

Run two specific reports: the Site Permissions for the Organization report (baseline of who has permissions across all sites) and the Sharing Links report (identifies sites with “Anyone” or organization-wide links that bypass membership controls). Export both and sort by permission breadth before you license a single Premium seat.

The “Everyone Except External Users” trap

The single most common oversharing pattern in law firm tenants is sites or document libraries granted to the Everyone Except External Users (EEEU) group. This grants access to every person in the organization. When M365 Copilot Premium traverses the Graph for a licensed user, every EEEU-accessible document becomes part of their AI-accessible corpus — including documents from other practice groups, other client matters, and administrative functions the attorney has no business reading. Identify and remediate every EEEU grant before deploying Premium licenses.
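To illustrate the remediation workflow, here is a sketch of flagging EEEU grants in an exported permissions report. The record layout is hypothetical (real Data Access Governance exports use different column names); the filtering logic is the point.

```python
# Hypothetical shape of an exported site-permissions report.
EEEU = "Everyone except external users"

def flag_eeeu_sites(report: list[dict]) -> list[str]:
    """Return site URLs where the EEEU group holds any permission."""
    return sorted({
        row["site_url"] for row in report
        if row["grantee"].strip().lower() == EEEU.lower()
    })

report = [
    {"site_url": "/sites/ma-matter-7823", "grantee": "M&A Team 7823", "role": "Member"},
    {"site_url": "/sites/firm-templates", "grantee": "Everyone except external users", "role": "Visitor"},
    {"site_url": "/sites/hr", "grantee": "Everyone except external users", "role": "Member"},
]

flagged = flag_eeeu_sites(report)  # sites to remediate before licensing
```

Every URL in the flagged list is content that would join each licensed attorney's AI-accessible corpus on day one.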

Step 2 — Choose the right SharePoint access restriction mechanism

Once overshared sites are identified, three mechanisms are available to restrict SharePoint Copilot permissions without disrupting existing workflows:

 
A.  Restricted Content Discoverability (RCD) — fastest deployment
 
SharePoint Admin Center → Active sites → [Site] → Settings → Restricted content discoverability
 
Prevents a site’s content from appearing in Copilot responses and organization-wide search without changing user permissions. The site owner can still access the content directly; it simply cannot be surfaced by Copilot or Search. This is the fastest remediation for high-risk sites while a full permission review is underway. Treat it as a temporary control — the long-term goal is correct permissions, not permanent RCD.
 
B.  Restricted Access Control (RAC) — matter-level isolation
 
SharePoint Admin Center → Active sites → [Site] → Settings → Restricted access control
 
Limits site access to a defined Microsoft 365 Group or Entra security group. Even if a user has been granted broader permissions elsewhere in the tenant, RAC ensures only explicit group members can access the site — and therefore only they can have it surfaced by Copilot. This is the right control for matter-specific SharePoint sites in a law firm. Pair it with a matter-naming convention for the security groups to make ongoing management tractable.
 
C.  Restricted SharePoint Search (RSS) — tenant-wide temporary gate
 
SharePoint Admin Center → Settings → Restricted SharePoint Search
 
Creates an allowlist of reviewed, approved sites that Copilot and Search can access. All other sites are excluded until explicitly added. This is a useful temporary control during a phased deployment — start with 50–100 low-risk sites, validate the experience, then expand the list as permission reviews are completed. RSS is a deployment gate, not a permanent governance solution. Disable it once permission hygiene is in place.
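The allowlist behavior of RSS can be sketched as a set intersection. The site URLs and the `searchable_sites` helper are hypothetical; the model only illustrates that RSS excludes everything not explicitly approved, and that disabling it restores full scope.

```python
def searchable_sites(all_sites: set[str], rss_enabled: bool,
                     allowlist: set[str]) -> set[str]:
    """With Restricted SharePoint Search on, only allowlisted sites are
    visible to Copilot and Search; with it off, all sites are."""
    return all_sites & allowlist if rss_enabled else all_sites

tenant = {"/sites/pilot-a", "/sites/pilot-b", "/sites/unreviewed-archive"}
approved = {"/sites/pilot-a", "/sites/pilot-b"}

during_pilot = searchable_sites(tenant, rss_enabled=True, allowlist=approved)
after_cleanup = searchable_sites(tenant, rss_enabled=False, allowlist=approved)
```

This is also why RSS is a deployment gate rather than a governance solution: once disabled, only the underlying permissions protect the unreviewed sites.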

Step 3 — Site ownership and lifecycle policies

Oversharing accumulates over time because of ownerless sites — SharePoint sites where the original creator has left the firm and no one has maintained permissions since. Before deploying M365 Copilot Premium, run a Site Ownership Policy in SAM to identify any site without at least two active owners:

 
SharePoint Admin Center → Policies → Site lifecycle management → Site ownership policy → Run in Simulation mode

Run in simulation mode first to identify scope, then switch to active mode to notify ownership candidates. Ownerless sites that have been inactive for 12+ months should be archived or deleted — stale content degrades Copilot response quality as well as creating compliance risk.

Copilot sensitivity labels: the most surgical control in your stack

Copilot sensitivity labels from Microsoft Purview are the deepest and most precise governance control available for M365 Copilot Premium. They work at the file level, travel with the document regardless of where it is stored, and directly govern whether Copilot can extract content from a document — even for a user who has view permissions.

The EXTRACT usage right: the mechanism that matters

When a sensitivity label applies encryption via Microsoft Purview Information Protection, Copilot checks two specific usage rights before processing the document: VIEW and EXTRACT. A user must have both rights for Copilot to interact with encrypted content. If the EXTRACT right is absent, Copilot cannot read or summarize the document — even if the user can open it manually in Word.

This is the mechanism that gives you matter-level Copilot control through labeling. The built-in permission roles and their Copilot implications are:

A diamond-shaped diagram illustrating four user roles—Co-Owner, Reviewer, Viewer, and Custom—and their impact on Copilot's ability to access or extract document content.

Label inheritance: how Copilot-generated content acquires sensitivity

One of the most important behaviors of Copilot sensitivity labels is inheritance. When M365 Copilot Premium generates new content — a Word draft, a PowerPoint presentation, a Copilot Page — it inherits the highest-priority sensitivity label from the source documents used to create it. If an attorney uses Copilot in Word to draft a letter based on a file labeled “Client Confidential,” the resulting draft document is automatically labeled “Client Confidential” with all of that label’s protection settings applied.

Two configuration steps are required for inheritance to work correctly:

  • Enable sensitivity labels for SharePoint and OneDrive. This is a prerequisite — without it, Copilot can only access encrypted files when they are already open in an Office app (data in use). Navigate to: 

Microsoft Purview portal → Information Protection → Sensitivity labels → Turn on sensitivity labels for Office files in SharePoint and OneDrive.

  • Configure encryption at the label level, not just the site level. Container labels applied to SharePoint sites do not propagate to individual files within those sites. A document inside a site labeled “Confidential” does not automatically inherit that label — it must be labeled at the item level through manual labeling, default library labels, or auto-labeling policies.
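The inheritance rule can be sketched as a priority comparison. The label names and priority numbers below are hypothetical stand-ins for your tenant's label taxonomy; the point is that generated content takes the highest-priority label among its sources.

```python
# Hypothetical label taxonomy; in Purview, each label has a priority order,
# and Copilot-generated content inherits the highest-priority source label.
LABEL_PRIORITY = {"Public": 0, "Internal": 1, "Confidential": 2,
                  "Client Confidential": 3}

def inherited_label(source_labels: list[str]):
    """Return the highest-priority label among the source documents."""
    labeled = [l for l in source_labels if l in LABEL_PRIORITY]
    return max(labeled, key=LABEL_PRIORITY.get) if labeled else None

# A draft built from three sources inherits the most sensitive one.
draft_label = inherited_label(["Internal", "Client Confidential", "Public"])
```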

Auto-labeling: scaling sensitivity classification across the corpus

For a law firm deploying M365 Copilot Premium across an established SharePoint environment, manual labeling of existing documents is not viable at scale. Microsoft Purview auto-labeling policies use trainable classifiers and sensitive information types to detect and label content automatically:

 
Microsoft Purview portal → Information Protection → Auto-labeling policies → Create policy

Law firm-relevant sensitive information types to configure as auto-labeling triggers include: Social Security Numbers, Client Matter Numbers (custom SIT), attorney-client privilege markers, financial account numbers, and any firm-specific document headers or footers that indicate confidential status. Run policies in simulation mode first to assess volume and accuracy before switching to enforcement.
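A sensitive-information-type match can be approximated with pattern detection. The SSN pattern below is the standard ###-##-#### form; the client matter number format is a hypothetical firm convention, and real Purview SITs also use checksums, keyword proximity, and confidence levels that this sketch omits.

```python
import re

# Illustrative patterns: standard-format SSN and a hypothetical
# client-matter-number convention (e.g. "CM-12345-001").
SIT_PATTERNS = {
    "U.S. Social Security Number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "Client Matter Number": re.compile(r"\bCM-\d{5}-\d{3}\b"),
}

def auto_label(text: str):
    """Return 'Client Confidential' if any configured SIT matches."""
    for name, pattern in SIT_PATTERNS.items():
        if pattern.search(text):
            return "Client Confidential"
    return None

label = auto_label("Re: CM-78231-004 settlement terms")
```

Running the equivalent policy in simulation mode first tells you how many documents such patterns would actually label, before enforcement changes anything.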

Important: “data in use” exception

The EXTRACT right controls Copilot access to documents at rest. If a document is already open in Word or another Office app on the user’s device, Copilot can interact with it as “data in use” regardless of the EXTRACT right on the label. This is by design. The practical implication: EXTRACT-blocked labels prevent Copilot from retrieving documents from SharePoint in the background during a Graph-grounded prompt, but do not prevent a user from manually opening a document and asking Copilot to help them with it in the active app.
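The two access paths can be summarized in a small decision model. This is a simplification of Microsoft's actual enforcement, with hypothetical function and parameter names:

```python
def copilot_can_process(usage_rights: set, data_in_use: bool = False) -> bool:
    """Copilot needs both VIEW and EXTRACT to read an encrypted document
    at rest; a document already open in the app ("data in use") bypasses
    the EXTRACT check by design."""
    if data_in_use:
        return "VIEW" in usage_rights
    return {"VIEW", "EXTRACT"} <= usage_rights

at_rest = copilot_can_process({"VIEW"})                    # blocked from Graph grounding
in_app = copilot_can_process({"VIEW"}, data_in_use=True)   # user opened it manually
```

The asymmetry is the policy takeaway: EXTRACT-blocked labels stop background retrieval, not deliberate in-app use.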

Microsoft Purview Copilot: audit, eDiscovery, and Communication Compliance

Deploying M365 Copilot Premium without configuring Microsoft Purview Copilot audit capabilities is the equivalent of installing email without retention policies. Every prompt an attorney sends, every document Copilot references, and every response generated is subject to your firm’s eDiscovery obligations, legal hold requirements, and bar association data handling rules.

The three layers of Purview coverage for Copilot interactions

Understanding what each layer captures — and what it does not — prevents gaps in your compliance posture:

Unified Audit Log
  Captures: metadata only: who used Copilot, in which app, when, which files were referenced, and the sensitivity labels on those files
  Does not capture: actual prompt or response text
  Admin path: Purview portal → Audit (operation: CopilotInteraction)

eDiscovery (Content Search / Premium)
  Captures: full prompt and response text, stored in a hidden folder of the user's mailbox alongside Teams chat data
  Does not capture: admin configuration changes to Copilot settings; device identity information
  Admin path: Purview portal → eDiscovery (search type: Copilot interactions)

Communication Compliance
  Captures: policy-based monitoring of prompt and response content; can flag sensitive data disclosure, inappropriate content, or policy violations
  Does not capture: DLP-style blocking (detection and review only)
  Admin path: Purview portal → Communication compliance

Running an audit log search for Copilot interactions

The Unified Audit Log is the first visibility tool most IT teams will use. Navigate to:

 
Microsoft Purview portal → Audit → Search tab → Activities – operation names → Enter: CopilotInteraction

Filter by user, date range, and optionally by the app where the interaction occurred (Word, Excel, Teams, etc.). The results show which files were referenced during each interaction and whether those files carried sensitivity labels. This data is your operational baseline for answering the question: “what is Copilot actually being used for, and what data is it touching?”
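Once the audit search is exported, the baseline analysis is a simple filter. The record shape below is hypothetical (real export schemas differ); the point is isolating CopilotInteraction events and the labeled files they touched.

```python
import json

# Hypothetical export of an audit log search; real column names differ.
export = json.loads("""[
  {"Operation": "CopilotInteraction", "UserId": "alice@firm.com",
   "AppHost": "Word", "AccessedResources": ["matter-7823-brief.docx"],
   "SensitivityLabels": ["Client Confidential"]},
  {"Operation": "FileAccessed", "UserId": "alice@firm.com",
   "AppHost": "SharePoint", "AccessedResources": ["lunch-menu.docx"],
   "SensitivityLabels": []}
]""")

# Isolate Copilot events, then the labeled files those events referenced.
copilot_events = [e for e in export if e["Operation"] == "CopilotInteraction"]
labeled_touches = [r for e in copilot_events for r in e["AccessedResources"]
                   if e["SensitivityLabels"]]
```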

Retrieving full prompt and response text via eDiscovery

The Audit Log records that an interaction occurred but not its content. For full text — prompts and responses — you need eDiscovery. Copilot interactions are stored in a hidden folder in the user’s Exchange mailbox, indexed identically to Teams chat messages. To retrieve them:

 
Purview portal → eDiscovery → Content Search → Add condition: Type = Copilot interactions
 

Scope the search to specific user mailboxes and date ranges. For legal hold scenarios, add the user's mailbox to an eDiscovery hold before the search to preserve interaction data in place. Note that an eDiscovery Premium case is required if you need to permanently delete specific Copilot interactions — for example, in response to a subject access request under GDPR or a court order.

Attorney-client privilege and Copilot prompts

Copilot prompts can reveal attorney mental impressions and case strategy — content that would ordinarily be protected as work product under Federal Rule of Civil Procedure 26(b)(3). Because prompts are stored in the user's mailbox and are discoverable via eDiscovery, your firm should address this in its AI usage policy: When must attorneys sanitize prompts before submission? How should Copilot-generated draft materials be documented for privilege logs? These are policy questions, not technical ones — but the technical infrastructure for discovery is already active the moment you deploy a Premium license.

Data Security Posture Management (DSPM) for AI

For firms deploying M365 Copilot Premium at scale, DSPM for AI in Microsoft Purview provides a consolidated dashboard for AI-specific security posture. It surfaces policy recommendations tailored to your tenant’s current label coverage, identifies risky AI usage patterns, and provides a single view of sensitivity label inheritance across Copilot-generated content:

 
Microsoft Purview portal → Data Security Posture Management → AI overview

 

Run the DSPM assessment before your first Premium deployment and revisit it quarterly. It will surface the specific oversharing risks and labeling gaps most relevant to your tenant — more actionable than a generic checklist.

Information barriers: preventing cross-matter data exposure

For law firms with ethical wall requirements — matters where attorneys on one side of a transaction cannot access information held by the firm on the other side — Microsoft Purview Information Barriers extend those walls into M365 Copilot Premium’s Graph-grounded responses.

Information Barriers (IB) create policy-based segments that prevent communication and data access between defined groups. When IB policies are active, a user in Segment A cannot have Segment B’s content surfaced by Copilot, even if they have technically been granted SharePoint permissions to that content. This is the technical enforcement mechanism for ethical walls that previously relied on human compliance.

 
Microsoft Purview portal → Information barriers → Policies → Create policy
 

Information Barriers require Microsoft 365 E5 Compliance or equivalent. They are configured at the Entra security group level — you define segments (e.g., “M&A Team — Matter 7823”) and then create IB policies that block or allow interactions between segments. The policies apply across Teams, SharePoint, OneDrive, Exchange — and by extension, across all Copilot Graph-grounded responses.
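The segment-blocking behavior can be modeled as a set of blocked pairs. The segment names mirror the example above; the `can_surface` helper and the policy representation are hypothetical.

```python
def can_surface(user_segment: str, content_segment: str,
                blocked_pairs: set) -> bool:
    """An IB policy blocks Copilot from surfacing content across blocked
    segments, even when raw SharePoint permissions would allow it."""
    return frozenset((user_segment, content_segment)) not in blocked_pairs

# One policy: the two sides of Matter 7823 cannot see each other's content.
policies = {frozenset(("M&A Buy-Side - Matter 7823",
                       "M&A Sell-Side - Matter 7823"))}

ok = can_surface("M&A Buy-Side - Matter 7823",
                 "M&A Buy-Side - Matter 7823", policies)
blocked = can_surface("M&A Buy-Side - Matter 7823",
                      "M&A Sell-Side - Matter 7823", policies)
```

The unordered pair (frozenset) captures that ethical walls are symmetric: neither side of the wall can surface the other's content.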

Recommended configuration for law firms

Start with IB policies for your highest-conflict-risk matter types: M&A transactions, bankruptcy representations with multiple creditor classes, and government investigations where access to information needs to be demonstrably segmented. Do not attempt to implement IB policies for every matter simultaneously — the initial configuration overhead is significant. A phased approach by matter type, starting with active conflicts, is more sustainable than a comprehensive rollout that stalls before completion.

A phased deployment framework: the right sequence for law firm CIOs

The controls described in this guide should be implemented in sequence — not all at once. The following phased approach aligns governance readiness with license deployment:

1.  Baseline — before any Premium licenses go live
Run SharePoint DAG permission reports. Identify all EEEU grants and “Anyone” sharing links. Enable Restricted SharePoint Search as a temporary gate. Run a Site Ownership Policy in simulation mode. Confirm Unified Audit Log is enabled (it is, by default, for E3/E5 tenants — but verify retention is set to at least 180 days).
 
2.  Pilot — 10–25 users, low-risk practice group
Deploy Premium licenses to a pilot group with clean SharePoint permissions. Enable sensitivity labels for SharePoint and OneDrive. Configure auto-labeling policies in simulation mode. Run weekly Audit Log reviews for the pilot group. Assess what content Copilot is surfacing and whether it aligns with your intent. Build your “three-sentence brief” for attorneys (what Copilot can and cannot see) before broader rollout.
 
3.  Remediation — concurrent with pilot
Apply Restricted Content Discoverability to high-risk sites identified in Step 1. Configure Restricted Access Control for matter-specific SharePoint sites. Begin applying default sensitivity labels to document libraries by matter type. Switch auto-labeling policies from simulation to enforcement for the most clearly defined sensitive information types (SSNs, client matter numbers).
 
4.  Broad deployment — with governance in place
Disable Restricted SharePoint Search as permission hygiene is validated. Deploy Premium licenses to the next groups, starting with attorneys whose SharePoint footprint has been reviewed. Configure Communication Compliance policies for Copilot interactions. Set up DSPM for AI in Purview as the ongoing posture monitoring tool. Begin Information Barriers configuration for high-conflict-risk matter types.
 
5.  Operate — ongoing governance cadence
Monthly: review Copilot Audit Log summaries and DSPM for AI posture alerts. Quarterly: re-run SharePoint permission state reports and SAM site lifecycle reports. Annually: re-assess IB policy coverage against current matter portfolio and conflict register. On every attorney departure: review and remediate SharePoint access that was granted during their tenure.

The governance reality of M365 Copilot Premium

The controls in this guide are not theoretical precautions. They are the difference between deploying an AI that surfaces the right documents to the right attorneys, and deploying one that collapses years of carefully maintained matter confidentiality in a single prompt response. The Microsoft Graph is powerful precisely because it knows everything a user can access — and M365 Copilot Premium uses that power in every interaction.

The good news is that every control described here — SharePoint Copilot permissions through DAG reports and RAC, Copilot sensitivity labels with EXTRACT right management, Microsoft Purview Copilot audit and eDiscovery, and Information Barriers for ethical walls — is available within your existing Microsoft 365 licensing if you are on E3 or E5. The investment is not in new tools. It is in configuring the tools you already have before the AI starts traversing the corpus.

That configuration takes expertise. Cocha Technology can help you activate your ‘Fortress’ settings so your firm can scale with AI without compromising confidentiality. Connect with our advisory team to get started.