Note: Copilot is currently only available in OneNote for Microsoft 365 on Windows, Mac, or iPad.
What is Copilot in OneNote?
Copilot in OneNote combines the power of large language models (LLMs) with your data to turn your notes into a powerful productivity tool. All of this is done within Microsoft's existing commitments to data security and privacy.
Copilot uses your prompts to draft plans, generate ideas, create lists, organize information, and more. Here are a few examples:
- Create a plan for my daughter’s high school graduation party.
- Summarize notes into bullet points.
- Generate a list of topics and talking points to be covered in an annual investor update meeting.
- Create a to-do list gathering calls to action from existing content.
- Rewrite a paragraph or selected text.
Copilot in OneNote was evaluated through extensive manual and automated testing on Microsoft internal usage and public data. Additional evaluation was performed on custom datasets of offensive and malicious prompts (user questions) and responses. Copilot in OneNote is also continuously evaluated through online user feedback.
At Microsoft, we're guided by our AI principles, our Responsible AI Standard, and decades of research on AI, grounding, and privacy-preserving machine learning. A multidisciplinary team of researchers, engineers, and policy experts reviews our AI systems for potential harms and mitigations: refining training data, filtering to limit harmful content, blocking queries and results on sensitive topics, and applying Microsoft technologies like InterpretML and Fairlearn to help detect and correct data bias. We make it clear how the system makes decisions by noting limitations, linking to sources, and prompting users to review, fact-check, and adjust content based on subject-matter expertise.
Depending on your scenario and input data, you may experience different levels of performance. Limitations of Copilot in OneNote include:
- Potential inaccuracies and irrelevant suggestions.
- Limited awareness of context beyond the content you provide.
- Biases present in the training data that can influence the suggestions.
It's important to exercise judgment and review suggestions before relying on them.
The responses that generative AI produces aren't guaranteed to be 100% factual. While we continue to improve responses, you should still use your judgment and review AI-generated output before sharing it with others. Our Copilot capabilities provide useful drafts and summaries to help you achieve more while giving you the chance to review the generated content rather than fully automating these tasks.
We continue to improve algorithms to proactively address issues, such as misinformation and disinformation, content blocking, data safety, and preventing the promotion of harmful or discriminatory content in line with our responsible AI principles.
Copilot in OneNote is available in many languages. To learn more, see Supported languages for Microsoft Copilot. Quality is currently best when inputs are in English; quality in other languages is expected to improve over time.
Copilot in OneNote has been reviewed by our Responsible AI (RAI) team. We follow RAI principles and have implemented:
- A responsible AI handling pipeline to mitigate risks such as harmful or inappropriate content.
- In-product feedback that lets users report offensive content back to Microsoft.
Copilot and Microsoft 365 are built on Microsoft's comprehensive approach to security, compliance, and privacy.
For more information about privacy, see the following information:
- If you're using Microsoft 365 Copilot in your organization (with your work or school account), see Data, Privacy, and Security for Microsoft 365 Copilot.
- If you're using Copilot in Microsoft 365 apps for home as part of Copilot Pro (with your personal Microsoft account), see Copilot in Microsoft 365 apps for home: your data and privacy.
Learn more
Welcome to Copilot in OneNote