'Copilot is for entertainment purposes only': Even Microsoft's official terms and conditions say you really shouldn't be using its AI at work

"Use Copilot at your own risk," Microsoft says


News | By Craig Hale, published 3 April 2026

(Image credit: Microsoft)



  • Microsoft has clarified some of the terms and conditions associated with Copilot
  • Responsibility for the AI tool's output has been shifted onto users
  • Despite being for "entertainment purposes," it's still heavily marketed toward workers

In a surprising twist, Microsoft has reaffirmed that Copilot is for "entertainment purposes only" and that, if used for work, it should serve as the first of multiple fact-checking stages rather than be relied upon outright.

"It can make mistakes, and it may not work as intended," the company wrote. "Don’t rely on Copilot for important advice. Use Copilot at your own risk."

Though the company very much wants businesses and employees to keep using Copilot for work, there's a clear shift in responsibility to the user here, shielding Microsoft from liability for any false information the tool produces.


Microsoft says "use Copilot at your own risk"

In a roundabout way, Microsoft is effectively admitting to the risk of AI hallucination amid ongoing concerns about copyrighted content, IP ambiguity and output legitimacy.

With this in mind, the company clearly wants us to think of Copilot as a tool, not a decision-maker, and for users to independently fact-check outputs and be cautious with any sensitive, protected data.

"You agree to indemnify us and hold us harmless... from and against any claims, losses, and expenses... arising from or relating to your use of Copilot," Microsoft added in another paragraph.

More broadly, the company also notes that prompts and responses may be used to improve Copilot, though enterprise versions have additional protections to safeguard sensitive information. In other words, users retain the rights to their inputs, but Microsoft still reserves the right to use that data to improve the service.
