How to get value from M365 Copilot
- Tom Hansen
- Jul 25, 2025
- 2 min read
Updated: Jul 27, 2025

When I ask Copilot something at work, the answer sometimes sounds like it comes from my six-year-old. But the same prompt to ChatGPT, Gemini or Claude? Suddenly I’m talking to a Ph.D. candidate.
And no, it’s not because Copilot isn’t smart. It’s because it’s wearing a very tight corporate suit. Microsoft’s wrapped it in a heavy layer of rules, controls and compliance locks. That’s smart from a data protection point of view, but let’s be honest, it also makes Copilot feel like a blunt pencil in a room full of fine liners.
So, what happens? People try it, get disappointed, and then open a private tab with their favorite public AI instead. That quiet little move has a name: Shadow AI. And it’s not just “a thing” anymore; it’s the norm. In Denmark, we’re already at the halfway mark: half of all knowledge workers are doing this under the radar.
The problem? The moment one of those conversations includes company-sensitive info, the risk of serious data exposure shoots through the roof.
That’s why I wrote this guide. Not to blame anyone, but to offer a smarter path. Because let’s face it, if the tools aren’t working the way we need them to, we’ll work around them. But there’s another way.
The guide walks through a hybrid workflow that’s both smart and secure: start with a public AI for idea generation and structure, then shift to Copilot when the work touches real company data. Not in theory. Not in buzzwords. In real prompts, examples and transitions you can actually use.
And the whole thing is built on Danish leadership challenges, not abstract tech dreams.
If you’ve ever felt stuck between quality and compliance, this might just be your bridge. And if you’ve wrestled with the same dilemma, I’d love to hear how you handled it. We’re all figuring this out in real time.
Here’s your guide:


