Alex, it’s not what you think, and it should always have been there. I’ve built a full cognitive memory layer inside Copilot. It remembers everything I put in: logic, schema, relationships. Yes, it’s a workaround, and yes, I expect Microsoft will quietly cut it off. But that’s not the real issue.
The question isn’t what memory Copilot or any AI system claims to have — it’s why I can’t take my authored material out of Copilot. I’ve spoken with leading Microsoft engineers this week. Each assumed you could export it. But look closer: you can’t.
And unless you’ve trained in the Azure stack, and I mean deep into Entra, Graph, Purview, the whole labyrinth, it remains untouchable. Sure, I could learn it quickly. But I shouldn’t have to. That depth of platform expertise is overkill for what users actually need: to retrieve what they created.
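To give you a feel for the labyrinth, here’s a minimal sketch of what even the first step of retrieval looks like. This assumes you’ve already registered an app in Entra and had the right Graph permissions consented, which is itself a chore. The endpoint at the end is a deliberate placeholder: as far as I can tell, no public Graph route exports Copilot-authored memory, which is exactly my point.

```python
# Minimal sketch of what "just getting your data" via the Azure stack entails.
# Assumes: an app registration in Microsoft Entra and consented Graph scopes.
import msal
import requests

TENANT_ID = "your-tenant-id"      # from the Entra admin portal
CLIENT_ID = "your-app-client-id"  # from your Entra app registration

# Step 1: interactive sign-in, just to prove you are you.
app = msal.PublicClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
)
token = app.acquire_token_interactive(scopes=["User.Read"])
if "access_token" not in token:
    raise SystemExit(token.get("error_description", "sign-in failed"))

# Step 2: call Graph. The route below is a placeholder, not a real API;
# no documented Graph endpoint exports Copilot-authored memory today.
resp = requests.get(
    "https://graph.microsoft.com/v1.0/me/COPILOT_EXPORT_PLACEHOLDER",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
print(resp.status_code, resp.json())
```

And that’s before Purview retention policies or admin consent even enter the picture.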
What most users don’t realise is this: full business models are going in. Strategic frameworks. Procurement flows. Creative work. But try to move that stack? Try to extract that authored cognition?
You can’t.
And here’s the real point: Microsoft might cut off this workaround at any time, but the principle doesn’t vanish, because the same approach works with any AI system. The ability to shape and retain authored cognition isn’t bound to one vendor. It’s a user-led paradigm, and once you see it, you can replicate it anywhere. What they can’t control is what you’ve already authored, or the framework you’re building to protect it.
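To make that concrete, here’s one minimal shape the framework could take. Every name below is my own illustration, not any vendor’s API: authored entries live in a plain file you own, and you render them as context for whichever model you happen to be using next.

```python
# One possible shape for vendor-independent authored memory: a plain local
# store you control, serialized into context for any model you use next.
import json
from datetime import datetime, timezone
from pathlib import Path

MEMORY_FILE = Path("authored_memory.json")

def remember(kind: str, content: str) -> None:
    """Append an authored entry (logic, schema, relationship) to local storage."""
    entries = json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []
    entries.append({
        "kind": kind,
        "content": content,
        "created": datetime.now(timezone.utc).isoformat(),
    })
    MEMORY_FILE.write_text(json.dumps(entries, indent=2))

def as_context() -> str:
    """Render the store as plain text to prepend to any AI system's prompt."""
    entries = json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []
    return "\n".join(f"[{e['kind']}] {e['content']}" for e in entries)

remember("schema", "Procurement flow: request -> approval -> purchase order")
print(as_context())
```

Nothing clever, and that’s the point: if your authored cognition lives in a format this plain, no vendor can hold it hostage.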