AI Assistant
As of 2025, whether Microsoft Copilot uses your chat data for training depends on the account type and context.
For enterprise and organizational users (such as those with Microsoft 365 commercial or educational accounts), Microsoft states that prompts, responses, and data accessed through Microsoft Graph are not used to train the underlying foundation large language models. This protection is part of Microsoft’s enterprise data protection commitments, ensuring organizational data privacy by default. Users within these environments have their chat data excluded from model training. More details are available in the Microsoft 365 Copilot privacy documentation.
For consumer or personal Microsoft accounts, Microsoft may use chat data to improve and train AI models, but with safeguards in place. Personal identifiers are removed from the data, and users are given clear opt-out controls to prevent their data from being used in training. Model training on consumer data does not start until after users have been notified and given a chance to opt out. Additionally, certain categories of users (e.g., organizational Entra ID accounts, Microsoft 365 Personal or Family subscriptions) and some data types (such as the contents of files uploaded to Copilot) are excluded from training. More information can be found in Microsoft’s Transparency and Control in Consumer Data Use blog and the Privacy FAQ for Microsoft Copilot.
For enterprise or work use of Microsoft 365 Copilot, the tool accesses content and context via Microsoft Graph, such as files, emails, chats, and calendars, but only those you have permission to view. This data access respects all organizational security and privacy policies. More details are available on Microsoft’s official Microsoft 365 Copilot privacy & protections page.
For the browser version of Copilot in Microsoft Edge, if you enable “Allow Microsoft to access page content,” Copilot can read the visible content of the open web page or PDF to help generate answers. This is documented on the Microsoft 365 Copilot Chat privacy & protections page. Microsoft further clarifies that sensitive elements such as user or tenant identifiers, entire uploaded files, and entire webpages or PDFs summarized by Copilot Chat in Edge are excluded from what is sent to the web search grounding service.
Microsoft does not publicly specify whether Copilot accesses embedded or hidden page elements, such as iframes, shadow DOM, or form fields, before submission. Access to local files on your device that are not explicitly uploaded is also not documented, so these should be assumed inaccessible.
Copilot can access visible text content you permit it to see, as well as organizational data sources in enterprise contexts. However, access to hidden or local artifacts is not documented.
Based on current publicly documented Microsoft sources, Copilot’s capabilities include generating content, summarizing, uploading/downloading files you provide, and processing pages you allow. However, Microsoft does not clearly document that Copilot autonomously “clicks buttons, types or submits forms, or reads the system clipboard” across all contexts.
Copilot in enterprise contexts uses files you open or upload, and the service stores prompts and responses, as outlined in Microsoft 365 Copilot privacy & protections.
Copilot in Edge may request page-content context (the visible page) for processing, as detailed in Microsoft 365 Copilot Chat privacy & protections.
However, Microsoft has no published page that states “Copilot will click a link for you in your browser, or access your system clipboard.” Therefore, if you intend to rely on those capabilities, you should verify specific feature documentation or ask Microsoft support.
Copilot may process files/prompt/page-content you give it, but the ability to autonomously interact with UI elements or the system clipboard is not clearly confirmed in published Microsoft documentation.
Microsoft documents many metadata types but does not publish a full, exhaustive list of everything captured (e.g., selectors or deeply hidden identifiers).
In the enterprise Microsoft 365 Copilot documentation, it states: “When a user interacts … we store data about these interactions. The stored data includes the user's prompt and Copilot's response, including citations … We refer to the user’s prompt and Copilot’s response … as the ‘content of interactions’ and the record … is the user’s Copilot activity history.” This is detailed in Microsoft 365 Copilot privacy & protections.
For web queries, the documentation clarifies that generated search queries sent to Bing exclude user and tenant identifiers and, unless the prompt is very short, the full prompt itself, as explained in Microsoft 365 Copilot Chat privacy & protections.
The metadata captured thus includes the user prompt, the response, citations and possibly file/context names, and timestamps (as part of the activity history), but not necessarily selectors or shadow‑DOM references.
For Microsoft 365 Copilot / Copilot Chat, data (prompts/responses) is stored within the Microsoft 365 service boundary (e.g., in hidden folders of Exchange mailboxes) and leverages the same compliance, encryption, and retention frameworks as other Microsoft 365 content.
For example, the hidden mailbox folder “SubstrateHolds” is referenced in the retention‑policy article for Copilot/AI apps. Microsoft gives guidance in the "Learn about retention for Copilot and AI apps" page: if a retention policy is set (retain only or delete only), messages are moved to the SubstrateHolds folder, kept there for at least 1 day, and then permanently deleted, typically within 1-7 days after the retention period expires.
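As a rough illustration of that timeline, here is a minimal sketch (the dates are hypothetical; the 1-day minimum and the 1-7 day window are taken from the retention guidance cited above) that computes the expected deletion window once a retention period expires:

```python
from datetime import date, timedelta

# Hypothetical example: a Copilot interaction whose Purview retention period
# expires on this date.
retention_expiry = date(2025, 10, 1)

# Per the retention guidance cited above: the item sits in the hidden
# SubstrateHolds folder for at least 1 day, and permanent deletion typically
# completes within 1-7 days after the retention period expires.
earliest_deletion = retention_expiry + timedelta(days=1)
latest_typical_deletion = retention_expiry + timedelta(days=7)

print(f"Retention period expires:    {retention_expiry}")
print(f"Earliest permanent deletion: {earliest_deletion}")
print(f"Latest typical deletion:     {latest_typical_deletion}")
```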
The article on audit log retention policy shows that audit logs (not necessarily content logs) can be retained up to 10 years, depending on policy/licensing.
Retention controls: Administrators use the Microsoft Purview portal to define retention policies for Copilot/AI apps. Such policies control when content/messaging is deleted or retained. For example, the "Retention for Copilot and AI apps" page describes how messages are handled (SubstrateHolds folder, etc.).
Export controls/audit logs: Admins can access Copilot activity logs via Microsoft Purview Audit and use the Office 365 Management Activity API to export logs to SIEM tools (a sketch of this export path follows after this list). The "Audit Copilot Studio activities" article shows that logs are searchable via Purview.
Delete controls (user side): For Microsoft 365 Copilot/Chat, users can delete their personal activity history. Microsoft states that this is aligned with how other Microsoft 365 interactions are handled.
Hidden mailbox folder storage: Prompt/response data and Copilot activity are stored in hidden folders and are subject to retention/deletion policies like any other mailbox item, as noted in the Microsoft 365 Blog.
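To make the export path in the “Export controls/audit logs” item above concrete, here is a minimal sketch, assuming you have registered an Entra ID application and obtained an OAuth bearer token for the Office 365 Management Activity API, and that a subscription has already been started. The tenant ID and token are placeholders, and the assumption that Copilot interaction events surface under the Audit.General content type should be verified against Microsoft’s API documentation:

```python
import requests

# Placeholders/assumptions: a real integration obtains this token via the
# OAuth2 client-credentials flow for the https://manage.office.com resource,
# and a subscription must already have been started, e.g.
# POST {BASE}/subscriptions/start?contentType=Audit.General
TENANT_ID = "00000000-0000-0000-0000-000000000000"   # hypothetical tenant GUID
ACCESS_TOKEN = "<bearer token for manage.office.com>"

BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# List available content blobs for the Audit.General content type
# (Copilot interaction events are assumed to surface under this type;
# verify the content type and record types against the API documentation).
listing = requests.get(f"{BASE}/subscriptions/content",
                       params={"contentType": "Audit.General"},
                       headers=HEADERS, timeout=30)
listing.raise_for_status()

for blob in listing.json():
    # Each blob entry carries a contentUri that returns the actual audit records.
    records = requests.get(blob["contentUri"], headers=HEADERS, timeout=30).json()
    for record in records:
        # A real exporter would forward each record to a SIEM; this just prints.
        print(record.get("CreationTime"), record.get("Operation"))
```

In practice the listing would also be filtered by start/end time parameters, and the records forwarded to a SIEM rather than printed.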
For the enterprise Microsoft 365 Copilot / Copilot Chat scenario, Microsoft clearly states that customer prompts/responses are not used to train foundation models by default.
If your organization has specific agents or special settings, you should verify “optional data sharing” settings for feedback/training.
You can still keep prompts/responses logged, audited, and retained for compliance (via Purview) even though they are excluded from training. The retention policies described above allow capture and retention for audit/eDiscovery while the training exclusion remains in place.
For consumer Copilot, while Microsoft’s FAQ suggests personal data is anonymized when used for training/improvement, there is less clarity on how to opt out of training specifically while still retaining history logs. The relevant consumer privacy hub may allow toggling feedback, but full documentation is limited.
For Enterprise/Work (Microsoft 365 Copilot / Copilot Chat), Microsoft explicitly excludes the prompts/responses and Graph‑accessed data from being used to train foundation models. This is stated in the Microsoft 365 Copilot privacy & protections.
For personal accounts, the FAQ implies anonymized data may be used for product improvement but does not guarantee “never training” in the same explicit language, as outlined in the Privacy FAQ for Microsoft Copilot (consumer).
In terms of retention and compliance, Enterprise/Work accounts have access to retention, export, audit, and eDiscovery features through Microsoft Purview, including hidden mailbox folders. Personal accounts, however, have fewer enterprise-grade controls.
Regarding default permissions and policy controls, in enterprise scenarios, administrators can turn on/off features, restrict web search, define which users or groups get Copilot, disable page context, enforce DLP policies, etc. For personal accounts, the user is the admin of their own account, with limited delegation and policy options. Microsoft makes a clear distinction between “Work/Edu” and “personal” plans, as explained in Microsoft Learn.
For Gov clouds and data residency, Microsoft mentions that data residency commitments apply for Microsoft 365 Copilot, including the Data Protection Addendum and EU data boundary. Some features in Gov clouds may differ, such as web-search defaults.
Enterprise/Work/Gov accounts have stronger controls, audit/governance, and explicit non‑training commitments, while personal accounts have fewer controls, with training/usage policies that may differ and less enterprise compliance tooling.
For the enterprise Microsoft 365 Copilot / Copilot Chat scenario, Microsoft states that when web search queries are generated, they send a derived search query (not the full prompt) to the Bing search service. The derived query excludes user/tenant identifiers and the full prompt unless the prompt is very short, as explained in Microsoft 365 Copilot privacy & protections.
Regarding on‑device vs cloud, Microsoft indicates that Copilot uses Azure OpenAI services and runs inside Microsoft’s cloud infrastructure, meaning the processing is cloud-based. For consumer Copilot in Edge, if “Allow access to page content” is enabled, page content is sent to Microsoft services. There is no public statement confirming that the service runs fully on devices without cloud calls for most features, as mentioned in the Privacy & protections for Copilot Chat.
For domains/endpoints contacted, Microsoft’s documentation for Copilot does not publish a full list of endpoints for feature calls. The "Manage Microsoft 365 Copilot" article advises not to block network traffic via the firewall but to use service configuration instead, suggesting that domains/endpoints are part of the Microsoft 365 service boundary.
For the browser (Edge) experience, Microsoft states that users and admins can disable or enable “page content access” for Copilot. For example, the “Allow Microsoft to access page content” toggle is shown to users, and admins can disable this via policy. This is explained in Microsoft 365 Copilot privacy & protections.
For enterprise scenarios, admins can use policy to disable the Copilot sidebar, restrict which users/groups get Copilot, restrict web-search policy, or disable agents that access organizational content. This functionality is described in Privacy & protections for Copilot Chat.
Microsoft advises that you “do not rely on network-filtering to block Copilot; instead use built-in service controls,” which points to managing access and feature availability through the Microsoft 365 admin center rather than through network-level allow/deny lists.
For Microsoft 365 Copilot / Copilot Chat (enterprise), Microsoft provides controls in the admin center: You can control “Pin/Unpin Copilot,” “Enable/Disable web search,” manage which agents are allowed, define which users/groups have access, define log retention, etc.
For the Edge browser version, there are Group Policy (ADMX) or MDM settings for controlling the Copilot sidebar and page context (e.g., policy names such as EdgeEntraCopilotPageContext or CopilotPageContext). While the exact policy names are not quoted here from public documentation, Microsoft’s manage article references these admin controls; a registry-level sketch follows below. More information is available in the FAQ for Copilot Chat.
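As a rough, hypothetical illustration of the registry values that back such an ADMX/MDM policy, the sketch below writes Edge policy values under HKLM\SOFTWARE\Policies\Microsoft\Edge. The value names are taken from the references above, and both the names and the meaning of the values (0 assumed to block page-context access) should be verified against current Microsoft Edge policy documentation before any deployment:

```python
import winreg  # Windows-only; writing HKLM policy keys requires elevation

EDGE_POLICY_PATH = r"SOFTWARE\Policies\Microsoft\Edge"

# Policy value names as referenced above; both the names and the semantics of
# the values (0 assumed to mean "do not allow Copilot to read page content")
# are assumptions to verify against current Edge policy documentation.
POLICIES = {
    "CopilotPageContext": 0,
    "EdgeEntraCopilotPageContext": 0,
}

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, EDGE_POLICY_PATH, 0,
                        winreg.KEY_WRITE) as key:
    for name, value in POLICIES.items():
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)
        print(f"Set {name} = {value}")
```

In a managed environment these values would normally be delivered via Group Policy or Intune rather than written directly.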
Central configuration: Microsoft encourages the use of built‑in controls rather than network blocking and recommends using service settings to configure user access, feature rollout, feature pinning, etc.
Regarding incognito/InPrivate behavior, Microsoft does not appear to publish a dedicated policy or settings page specific to “Incognito/InPrivate” mode and how Copilot behaves there in full detail.
Microsoft documents that audit logs for Copilot‑Studio activities are captured and accessible via the Microsoft Purview compliance portal (Audit). Admins can search/filter these events and export them via PowerShell/CSV. This functionality is detailed in the "Audit Copilot Studio activities" page.
The audit log retention policy page shows that audit logs in Microsoft 365 can be retained for up to 10 years (depending on the license) and can be exported to SIEM systems via APIs. This is explained in the "Audit log retention policy" article.
Additionally, prompt/response logs (for Microsoft 365 Copilot/Chat) are stored in the service and are subject to eDiscovery/retention. More information is available in Microsoft 365 Copilot privacy & protections.
The enterprise Microsoft 365 Copilot / Copilot Chat version is distinguished by the use of organizational data (via Microsoft Graph) as grounding for responses. In this scenario, prompts/responses are not used to train foundation models, and it benefits from enterprise-grade retention, audit, and compliance infrastructure through Purview.
The web/app/consumer version (personal Copilot) may use different telemetry/training policies; Microsoft’s FAQ indicates that personal interactions may be used for product improvement, albeit anonymized.
For API usage (e.g., Azure OpenAI or Copilot Connectors), those services may have distinct terms. For example, the Azure OpenAI service processes customer prompts under its own data-handling terms and offers its own controls over data retention and sharing (for example, around abuse monitoring). A full mapping of how these differ in every scenario is not available.
Therefore, behavior differs depending on the plan (Enterprise vs Consumer) in how data is used/trained, how logs are handled, and what data sources Copilot can access. If you use an API or third‑party connector, you should verify the terms for that specific service.
Microsoft describes certain protections for enterprise Copilot/Chat/training, though it doesn’t publish a full “prompt‑injection safe mode” table. Notable protections include enterprise data protection (EDP), which is applied to prompts and responses in Microsoft 365 Copilot/Chat. This means the service commits to treating prompts/responses as protected data and not using them to train foundation models, as outlined in Microsoft 365 Copilot privacy & protections.
The use of Microsoft Purview DLP capabilities allows administrators to create policies that block sensitive content from being processed by Copilot, or label/exclude certain files so they aren’t summarized. These protections are further documented in “Microsoft Purview data security and compliance protections for generative AI apps”, which includes Copilot.
In terms of access permissions, Copilot only surfaces content to which the user has view permissions. Thus, if a user does not have access to a file, Copilot won’t automatically use it. This is mentioned in Copilot in Microsoft 365 Apps – your data & privacy.
For the Edge version, the user must explicitly enable “Allow access to page content” for Copilot to use the page context, providing an opt‑in boundary.
However, Microsoft does not publish full public documentation specifically labeled “prompt‑injection attack protection” or “data exfiltration safe mode” in the Copilot FAQ. Therefore, while some protections such as permissions, DLP, explicit page-context opt‑in, and logging are documented, full detail on how Microsoft detects or mitigates advanced prompt‑injection or malicious use (e.g., malicious document upload, hidden exfiltration patterns) is not publicly elaborated in consumer-facing documentation.
DLP (Data Loss Prevention): Microsoft supports Copilot via Microsoft Purview’s DLP capabilities for generative AI apps. For example, administrators can create policies that block or monitor Copilot prompts containing sensitive labels or prevent summarizing files labeled “Confidential.” The Microsoft 365 Copilot privacy documentation references Purview Information Protection and notes that usage rights are honored.
Compliance governance: Copilot activity, including prompts and responses, is treated like other Microsoft 365 content and is subject to retention, eDiscovery, and audit via Purview. Hidden mailbox folders and retention policies are applied to ensure proper governance.
Regulatory terms (e.g., HIPAA/BAA, GDPR/DPA): Microsoft states that Microsoft 365 Copilot meets data residency, DPA, and GDPR obligations. For instance, in the FAQ page, it’s confirmed that Microsoft 365 Copilot Chat falls under the terms of the Data Protection Addendum (DPA).
Regarding HIPAA/BAA, while not all features may be certified for HIPAA, Microsoft’s broader Microsoft 365 services support usage by HIPAA-covered organizations, and you should refer to your service agreement for more details.
As for copy/paste controls and screenshots, Microsoft does not publish a specific “screenshots control” section for Copilot (except for the Recall/PC feature, which is outside the core Copilot features).
Therefore, DLP/compliance is strongly supported in enterprise scenarios with capabilities for labels, retention, audit, and eDiscovery. However, consumer-oriented documentation does not provide a complete feature table for copy/paste controls or screenshot blocking in the Copilot context.
Microsoft states that Copilot only uses data the user has access to via Microsoft Graph, so internal apps and files behind SSO are subject to those same permission controls. In other words, if a user is signed in through Single Sign-On (SSO), Copilot can only surface content that user is already permitted to view.
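To illustrate that permission model (this inspects Microsoft Graph permissions directly, not Copilot’s internals), here is a minimal sketch; the access token and drive item ID are placeholders, and the assumption is a delegated token with at least the Files.Read scope:

```python
import requests

# Placeholders/assumptions: a delegated Microsoft Graph token with at least the
# Files.Read scope, and a hypothetical drive item to inspect.
ACCESS_TOKEN = "<delegated Microsoft Graph token>"
ITEM_ID = "<drive item id>"

resp = requests.get(
    f"https://graph.microsoft.com/v1.0/me/drive/items/{ITEM_ID}/permissions",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Copilot grounding is scoped by these same Graph permissions: if the signed-in
# user cannot see an item here, Copilot should not surface it either.
for perm in resp.json().get("value", []):
    print(perm.get("roles"), perm.get("grantedToV2") or perm.get("grantedTo"))
```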
In the Edge version, if “Allow access to page content” is disabled (either by the user or admin), Copilot cannot access or ground to that page content. Thus, for heavily restricted contexts, such as SSO login pages, MFA prompts, or a VDI session, you could disable page content access to block Copilot from reading those pages.
Microsoft recommends blocking feature access via policy rather than relying on network filtering when managing high-security scenarios, which ensures that access restrictions are controlled centrally and consistently.
However, Microsoft does not publish a dedicated “VDI/SSO context” page for Copilot behavior in those environments, nor a comprehensive list of exactly what is blocked in such scenarios. Therefore, while you can restrict Copilot access via policy controls, Microsoft does not publicly document a full “restricted context list” for SSO/MFA/VDI with Copilot.
For reporting security issues or bugs, Microsoft directs security/vulnerability disclosures through the Microsoft Security Response Center (MSRC). While this is not specific to Copilot, it is the standard incident path for all Microsoft services, including Copilot.
For service incidents and SLAs, you can monitor service health in the Microsoft 365 admin center. Microsoft provides an Online Services SLA, which applies to many services, including Copilot, as outlined in Service health in Microsoft 365.
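For programmatic monitoring of the same signal, here is a minimal sketch that polls the Microsoft Graph service-announcement endpoints backing the admin center’s Service health page; the token is a placeholder and assumes an app-only token granted the ServiceHealth.Read.All permission:

```python
import requests

ACCESS_TOKEN = "<app-only Microsoft Graph token with ServiceHealth.Read.All>"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Overall health per service (the same data surfaced in the admin center).
overviews = requests.get(
    "https://graph.microsoft.com/v1.0/admin/serviceAnnouncement/healthOverviews",
    headers=HEADERS, timeout=30,
)
overviews.raise_for_status()
for svc in overviews.json().get("value", []):
    print(svc.get("service"), "->", svc.get("status"))

# Open advisories/incidents, which would include Copilot-affecting issues
# if Microsoft publishes them for your tenant.
issues = requests.get(
    "https://graph.microsoft.com/v1.0/admin/serviceAnnouncement/issues",
    headers=HEADERS, timeout=30,
)
issues.raise_for_status()
for issue in issues.json().get("value", []):
    print(issue.get("id"), issue.get("title"), issue.get("classification"))
```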
Regarding rollback/kill-switch/disable, administrators have the ability to disable Copilot for the tenant, disable page content access, disable web search, unpin or restrict Copilot usage via policy. This provides an effective “kill-switch” from the admin side. For consumer accounts, users can disable Copilot in the account settings.
However, Microsoft does not publish a dedicated “incident path for Copilot only” page that describes SLAs specific to Copilot features, at least not publicly.
Microsoft’s documentation pages (e.g., Microsoft Learn articles) often include a “Last updated” or “Last reviewed” date at the bottom of the page. For example, the “Retention policies for Copilot” article shows a “Last updated” date of “26 September 2025,” as found in the Retention policies for Copilot & AI apps page.
For consumer Copilot terms of use (for individuals), Microsoft provides an archive of prior versions of the Terms of Use, so you can track changes over time. You can access this archive in the Copilot Terms of Use archive (individuals).
Microsoft does not publish a single “changelog of every Copilot behavior/policy” in one document, but you can track changes via the individual documentation pages and the version history of the terms/archives.