Microsoft labeled Copilot as for "entertainment purposes only"

Microsoft has been aggressively promoting its AI assistant "Copilot" to both businesses and individual consumers. A close reading of Microsoft's Copilot Terms of Use, however, raises the question of how seriously consumers should be taking Copilot.

The Terms of Use contain a warning

The Terms of Use for Microsoft's Copilot state that the AI tool is designed strictly for "entertainment purposes only" and therefore should not be relied upon for any "important decisions." The Terms acknowledge that Copilot makes mistakes and will sometimes not operate as intended. Unsurprisingly, this revelation rapidly gained traction on social media, given Microsoft's ongoing efforts to promote Copilot as an "essential" workplace productivity tool.

Microsoft claims it is “old language.”

As public pressure on Microsoft to address the issue mounted, the company acted swiftly to limit the damage. A spokesperson for Microsoft told the press that the "problematic wording" in its documentation was old language that had been left in place for historical purposes and does not accurately represent how the product works now. The spokesperson also confirmed that the language will be revised in future updates.

Nevertheless, critics pointed out that Microsoft only reacted after considerable public pressure prompted it to notice and act on a disclaimer that had long existed in its own product's legal terms.

A habit throughout the industry

Microsoft is not alone among AI companies in trying to have it both ways. Many other firms (OpenAI and xAI, among others) use nearly identical disclaimers in their legal documents, warning users to treat AI-generated answers as less reliable than human-sourced responses. This points to a contradiction at the center of how a growing number of AI developers market their products versus how they describe them in legal terms.