/GENAI QS/

Microsoft Copilot FAQs

AI Engine

Does Microsoft Copilot train on my chats by default?

As of 2025, it depends on whether you use a personal (consumer) account or an enterprise/organizational account. For enterprise and commercial users (such as organizational tenants), Microsoft confirms that prompts, responses, and data accessed through organizational services are not used to train the underlying foundation models without explicit permission. According to Microsoft Learn, “Prompts, responses, and data accessed through Microsoft Graph aren’t used to train foundation LLMs, including those used by Microsoft 365 Copilot.”

For consumer or personal users, Microsoft’s blog Transparency and Control in Consumer Data Use states that opt-out controls will be provided for consumer data used in training, and that model training will not begin until after consumers have been notified. The Privacy FAQ for Microsoft Copilot further clarifies that “We do not train Copilot on…” data from certain user categories, including users signed in with organizational Entra ID accounts, users in specific regions, and users who have opted out.

In practice, organizational and enterprise data are excluded from model training by default unless an organization opts in. Consumer data follows an opt-out model, with personal identifiers automatically anonymized before any data is used for model training.

How long does Microsoft Copilot keep my chats/logs?

As of 2025, Microsoft’s documentation provides partial guidance, mainly for enterprise environments, but does not specify a single retention period for all Copilot contexts. The Learn about retention for Copilot and AI apps page explains that retention policies can be applied to messages generated in Copilot or other AI apps. These messages are stored in a hidden folder within the user’s mailbox, where a policy timer job evaluates items (typically within 1–7 days) and deletes them according to the organization’s retention policy.
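
To make the deletion mechanics concrete, the sketch below models the retention decision described above in Python. It is a conceptual illustration only, not a Microsoft API or official sample: the function names are invented for this example, and the 180-day retention period is an assumed value an organization might configure.

```python
from datetime import datetime, timedelta, timezone

# Conceptual model of the retention behavior described above; illustrative names,
# not a Microsoft API. The 180-day figure is an assumed example policy value.
RETENTION_PERIOD = timedelta(days=180)   # example org retention policy for Copilot messages
TIMER_JOB_LAG = timedelta(days=7)        # the policy timer job typically evaluates items within 1-7 days

def is_past_retention(message_created: datetime, now: datetime) -> bool:
    """True once a stored Copilot message has exceeded the organization's retention period."""
    return now - message_created >= RETENTION_PERIOD

def purge_window(message_created: datetime) -> tuple[datetime, datetime]:
    """Approximate window in which the timer job would remove the item:
    retention expiry plus up to ~7 days of evaluation lag."""
    expiry = message_created + RETENTION_PERIOD
    return expiry, expiry + TIMER_JOB_LAG

if __name__ == "__main__":
    created = datetime(2025, 1, 15, tzinfo=timezone.utc)
    earliest, latest = purge_window(created)
    print(f"Created {created:%Y-%m-%d}; expected deletion between "
          f"{earliest:%Y-%m-%d} and {latest:%Y-%m-%d}")
```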

For Copilot in Microsoft 365 apps for home (consumer use), Microsoft confirms that prompt and response activity history is stored and may be viewable, while such data is not used to train foundation models.

Overall, retention is configurable in enterprise settings through retention policies, consumer activity history is stored without a defined timeframe, and service logs may persist for up to 30 days. Microsoft does not define a universal retention duration across all Copilot implementations.

How do I opt out of model training and still keep history?

You can stop your conversations from being used for model training while still keeping your chat history and access to the service. For enterprise users, this protection is enabled by default, while consumer users must actively opt out.

Microsoft’s privacy controls documentation states: “You can control if Microsoft can … use your interactions to train our generative AI models.” The Privacy FAQ for Microsoft Copilot adds that “If you opt out, that change will be reflected throughout our systems within 30 days.” The blog Transparency and Control in Consumer Data Use confirms that Microsoft provides clear, accessible options for consumers to opt out of training across Copilot and Bing experiences.

Opting out of model training does not affect stored conversation history, which is retained for 18 months by default, and users can delete conversations at any time. For organizational and enterprise users, data is not used for training unless the organization opts in, while consumer users must disable training through their settings. You can therefore opt out of training use and continue to retain your chat history, though available controls and retention periods vary by account type.
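
As a rough illustration of how these timeframes interact, the short Python sketch below uses the figures cited above (a 30-day opt-out propagation window and an 18-month default history retention). The helper names are purely illustrative, and the 18-month period is approximated in days.

```python
from datetime import datetime, timedelta, timezone

# Illustrative timeline arithmetic only, based on the figures quoted above.
OPT_OUT_PROPAGATION = timedelta(days=30)         # "reflected throughout our systems within 30 days"
DEFAULT_HISTORY_RETENTION = timedelta(days=548)  # ~18 months, approximated in days

def opt_out_effective_by(opt_out_date: datetime) -> datetime:
    """Latest date by which a training opt-out should be reflected across Microsoft's systems."""
    return opt_out_date + OPT_OUT_PROPAGATION

def history_ages_out_by(conversation_date: datetime) -> datetime:
    """Approximate date a conversation reaches the default 18-month retention limit."""
    return conversation_date + DEFAULT_HISTORY_RETENTION

if __name__ == "__main__":
    today = datetime(2025, 6, 1, tzinfo=timezone.utc)
    print("Opt-out fully applied by:", opt_out_effective_by(today).date())
    print("A chat from today ages out around:", history_ages_out_by(today).date())
```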

What’s different for Enterprise/Work/Gov vs personal plans?

There are distinct differences between enterprise and consumer plans in terms of privacy, data governance, and model training use. Enterprise and organizational tenants benefit from stronger privacy protections, compliance frameworks, and explicit non-training-use commitments, while personal and consumer plans operate under different conditions.

Microsoft’s Data Protection, Responsible AI blog states: “Your organization’s data is not used to train foundation models.” Likewise, the Microsoft 365 Copilot Privacy and Protections page confirms that “Prompts and responses aren’t used to train the underlying foundation models” in enterprise or tenant environments.

In contrast, for consumer or home users, the Copilot in Microsoft 365 apps for home: your data and privacy article explains that while prompts and responses in the “home” mode aren’t used for training, broader consumer Copilot usage may involve training unless users opt out. 

Overall, enterprise, work, and government users have data excluded from training by default and benefit from enterprise-level compliance, data residency, and governance controls. Personal and consumer users may have their data used for training unless they opt out and have fewer configurable data management options.

Where are retention settings and export/delete controls?

Microsoft provides retention and deletion controls primarily through admin or tenant-level policies for organizational users. For personal users, limited options exist to manage activity history and export data, though detailed instructions are not publicly available for every Copilot use case.

According to Learn about retention for Copilot and AI apps, enterprise administrators can apply retention policies to Copilot interactions stored as messages in Microsoft Purview or the Exchange hidden folder, allowing them to define retention and deletion timelines. For consumer or home users, the Copilot in Microsoft 365 apps for home documentation notes that users can export their “activity history” through the Microsoft Privacy Dashboard. Additionally, the Microsoft Copilot privacy controls article describes how personal-account users can toggle consent for data use.
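
For administrators who prefer scripting to the Purview portal, retention configuration can also be inspected programmatically. The sketch below is a hedged starting point rather than an official sample: it assumes an Entra ID app granted the RecordsManagement.Read.All application permission, an access token already acquired (for example via MSAL), and the Microsoft Graph beta records-management endpoint. Note that it lists retention labels, a mechanism adjacent to (not identical with) the retention policies discussed above, and exact paths should be verified against current Graph documentation.

```python
import requests

GRAPH_BASE = "https://graph.microsoft.com/beta"  # assumed beta records-management endpoint

def list_retention_labels(access_token: str) -> list[dict]:
    """List Purview retention labels visible to the calling app.

    Assumes the RecordsManagement.Read.All permission and a valid bearer token;
    verify the endpoint against current Microsoft Graph documentation."""
    resp = requests.get(
        f"{GRAPH_BASE}/security/labels/retentionLabels",
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("value", [])

if __name__ == "__main__":
    token = "<access-token>"  # placeholder; obtain via your organization's auth flow
    for label in list_retention_labels(token):
        print(label.get("displayName"), "-", label.get("retentionDuration"))
```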

However, Microsoft’s documentation does not outline a universal, step-by-step retention or deletion schedule that applies across all Copilot variants (consumer, enterprise, and API), so certain aspects of retention and deletion remain only partially detailed publicly.

Is API usage treated differently from the app?

Yes, usage through enterprise or tenant APIs and commercial-grade Copilot services is governed by stricter protections and defaults to non-training use, unlike consumer app usage, where training may occur unless the user opts out.

Microsoft’s Data Protection, Responsible AI blog states that generative AI solutions, including Copilots, do not use an organization’s data to train foundation models without permission. The Privacy FAQ for Copilot further confirms: “We do not train AI models on data … from users signed into Copilot with an organizational Entra ID account.”

For consumer accounts, Microsoft’s blogs and privacy control materials note that opt-out settings will be available for managing whether consumer data is used for training.

In summary, API and enterprise services are treated as organizational use cases with default non-training use and stronger governance. In contrast, app and consumer services may allow training by default and require users to opt out, depending on the service context.