Microsoft is struggling to sell Copilot to corporations – employees want ChatGPT

3 points by CharlesW | 5 comments | 6/26/2025, 4:05:40 PM | techradar.com

Comments (5)

toomuchtodo · 4h ago
Very similar to Salesforce shoving AI into Slack and raising prices with no option to not buy it. They’re attempting to force growth that is not forthcoming.

https://www.salesforce.com/news/stories/pricing-update-2025/

AI returns still a long way from justifying investment mania - https://news.ycombinator.com/item?id=44389144 - June 2025

(this is not to say AI has no value, but that there is a disconnect between the value-prop narrative and the value discovery still in progress through experimentation)

CharlesW · 4h ago
I'm sure you're right.

Also: In my experience, Office Copilot¹ is simply not a competitive offering. For several months I spent fractional time trying to get a simple project — meaning thoughtful, substantial prompting and knowledge, but no custom actions — working as well in Office Copilot (using Copilot Studio) as it did as an OpenAI GPT, and never could.

Our IT Prevention Department even arranged a meeting with Microsoft folks at some point, and the Microsoft folks couldn't offer any useful direction. It was like pulling teeth to even get them to confirm which model Office Copilot Studio was using. It just doesn't feel like Microsoft is very serious about Office Copilot once you get beyond surface-level "write this Excel formula for me" use cases.

(¹ Calling it this to disambiguate from "GitHub Copilot".)

toomuchtodo · 4h ago
Similar experience talking to internal Microsoft folks on GRC issues related to Microsoft Copilot implementation details.

JohnFen · 4h ago
Even if I were interested in Copilot, how aggressively Microsoft is forcing it on me is incredibly off-putting. Not that Microsoft is the only one doing this, of course (look at Google shoving Gemini down our throats in Android), but so far they're the most obnoxious about it.

5555624 · 3h ago
While I don't really use either, my problem with Copilot is that it's censored. Oops, not censored; rather, it won't display results that don't fit its "ethical guidelines or safety filters," and I don't have a good sense of what those constraints are.

For example, "Who is Stormy Daniels?" A question one might have upon learning Donald Trump was convicted in the hush money scandal. Copilot: "I'd really like to help, but it seems this topic is off-limits for me. Sorry about that!" ChatGPT mentions she's in adult entertainment, some other background, and the Trump issue.