Copilot broke audit logs, but Microsoft won't tell customers

201 points · Sayrus · 48 comments · 8/20/2025, 12:18:00 AM · pistachioapp.com ↗

Comments (48)

TheRoque · 15m ago
In my opinion, using AI tools for programming at the moment, unless in a sandboxed environment and on a toy project, is just ludicrous. The amount of shady things going on in this domain (AI trained on stolen content, no proper attribution, no proper way to audit what's going out to third-party servers, etc.) should be a huge red flag for any professional developer.
neuroelectron · 9m ago
The icing on the shit cake is a text editor programmed in TypeScript with an impossible-to-secure plugin architecture.
lokar · 1h ago
Wait, copilot operates as some privileged user (that can bypass audit?), not as you (or better, you with some restrictions)

That can’t be right, can it?

catmanjan · 1h ago
As someone else mentioned, the file isn't actually accessed by Copilot; rather, Copilot is reading the pre-indexed contents of the file in a search engine...

Really, Microsoft should be auditing the search that Copilot executes. It's actually a bit misleading to log the file as accessed when Copilot has only read the indexed content of the file; I don't say I've visited a website when I've found a result from it in Google.

tomrod · 1h ago
Sure sounds like, for Microsoft, an audit log is optional when it comes to cramming garbage AI integrations in places they don't belong.
ceejayoz · 1h ago
dhosek · 33m ago
That was a laugh-out-loud moment in that film.
lokar · 1h ago
lol. I’ve avoided MS my entire (30+ year) career. Every now and then I’m reminded I made the right choice.
tomrod · 1h ago
Brilliant.
faangguyindia · 20m ago
I've disabled Copilot; I don't even find it useful. I think most people who use Copilot have not seen "better".
jjkaczor · 1h ago
So... basically like when Delve was first introduced and was improperly security-trimming the things it was suggesting in search results.

... Or ... a very long time ago, when SharePoint search would display results and synopses for search terms where a user couldn't open the document, but could see that it existed and could get a matching paragraph or two... The best example I would give people of the problem was users searching for things like "Fall 2025 layoffs"... if the document existed, then things were being planned...

Ah Microsoft, security-last is still the thing, eh?

ocdtrekkie · 37m ago
I would say "insecure by default".

I talked to some Microsoft folks around the Windows Server 2025 launch, where they claimed they would be breaking more compatibility in the name of their Secure Future Initiative.

But Server 2025 will load malicious ads on the Edge start screen[1] if you need to access a web interface of an internal thing from your domain controller, and they gleefully announced including winget, a wonderful malware delivery tool with zero vetting or accountability, in Server 2025.

Their response to both points was that I could disable those if I wanted to. Which I can, but that was definitely not the point. You can make a secure environment based on Microsoft technologies, but it will fight you every step of the way.

[1] As a fun fact, this actually makes Internet Explorer a drastically safer browser than Edge on servers! By default, IE's ESC mode on servers basically refused to load any outside websites.

Spooky23 · 1h ago
No, it accesses data with the users privilege.
gpm · 55m ago
Are you telling me I, a normal unprivileged user, have a way to read files on windows that bypasses audit logs?
lokar · 41m ago
I'm guessing they are making an implicit distinction between access as the user, vs with the privs of the user.

In the second case, the process has permission to do whatever it wants; it elects to restrain itself. Which is obviously subject to many more bugs than the first approach.

jeanlucas · 1h ago
A better title would be: Microsoft Copilot isn't HIPAA compliant

A title like this will get it fixed faster.

rst · 1h ago
It already is fixed -- the complaint is that customers haven't been notified.
nzeid · 1h ago
Hard to count the number of things that can go wrong by relying directly on an LLM to manage audit/activity/etc. logs.

What was their bug fix? Shadow prompts?

jsnell · 1h ago
> Hard to count the number of things that can go wrong by relying directly on an LLM to manage audit/activity/etc. logs.

Nothing in this post suggests that they're relying on the LLM itself to append to the audit logs. That would be a preposterous design. It seems far more likely the audit logs are being written by the scaffolding, not by the LLM, but they instrumented the wrong places. (I.e. emitting on a link or maybe a link preview being output, rather than e.g. on the document being fed to the LLM as a result of RAG or a tool call.)

(Writing the audit logs in the scaffolding is probably also the wrong design, but at least it's just a bad design rather than a totally absurd one.)
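To make the distinction concrete, here is a minimal sketch (in Python, with entirely hypothetical names; nothing here reflects Copilot's actual internals) of instrumenting the retrieval boundary, where a document actually enters the model's context, rather than the output-rendering step:

```python
import datetime

AUDIT_LOG = []

def audit(user, doc_id, action):
    """Append one audit entry per document that enters the model's context."""
    AUDIT_LOG.append({
        "user": user,
        "doc": doc_id,
        "action": action,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

def retrieve(user, query, index):
    """RAG retrieval step: the right place to emit audit entries.

    Logging here captures every document fed to the LLM, whether or not
    the model later cites, links, or previews it in its answer. Logging
    only when a link is rendered (the suspected bug) misses silent reads.
    """
    hits = [doc for doc in index if query.lower() in doc["text"].lower()]
    for doc in hits:
        audit(user, doc["id"], "fed-to-llm")
    return hits

index = [
    {"id": "salaries.xlsx", "text": "confidential salary data"},
    {"id": "handbook.pdf", "text": "vacation policy"},
]
retrieve("alice", "salary", index)
# The read is logged even if the model never links to the file.
assert any(e["doc"] == "salaries.xlsx" for e in AUDIT_LOG)
```

The point of the sketch is only where the `audit()` call sits: below the model, in deterministic code, so the LLM's choice of what to show the user cannot suppress the log entry.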

nzeid · 1h ago
Heard, but since the content or its metadata must be surfaced by the LLM, what's the fix?
nzeid · 1h ago
Thinking about this a bit - you'd have to isolate any interaction the LLM has with any content behind some sort of middle layer that can audit the LLM itself. I'm a bit out of my depth here, though. I don't know what Microsoft does or doesn't do with Copilot.
verandaguy · 1h ago
I'm very sceptical of using shadow prompts (or prompts of any kind) as an actual security/compliance control or enforcement mechanism. These things should be done using a deterministic system.
ath3nd · 1h ago
I bet you are a fan of OpenAI's groundbreaking study mode feature.
gpm · 1h ago
I'd hope that if a tool the LLM uses reveals any part of the file to the LLM it counts as a read by every user who sees any part of the output that occurred after that revelation was added to the context.
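The rule gpm describes can be sketched as a small function (a hypothetical illustration, not anything Microsoft implements): once a file enters the context at turn t, every user who sees output from turn t onward counts as a reader.

```python
def readers_of(file_event_turn, views):
    """gpm's proposed rule: a file revealed to the LLM at turn t counts
    as read by every user who saw any output produced at or after t.

    `views` maps each user to the set of turn numbers whose output
    they saw.
    """
    return {user for user, turns in views.items()
            if any(t >= file_event_turn for t in turns)}

views = {
    "alice": {1, 2, 3},   # saw the whole conversation
    "bob":   {3},         # joined late, but saw post-reveal output
    "carol": {1},         # only saw output from before the reveal
}
# The file entered the context at turn 2:
assert readers_of(2, views) == {"alice", "bob"}
```

The conservative part is that it attributes the read even to users who only saw later, unrelated output, since the file's contents may have influenced anything generated after the reveal.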
downrightmike · 56m ago
Shadow copies
troad · 35m ago
Microsoft's ham-fisted strategy for trying to build a moat around its AI offering, by shoving everyone's documents in it without any real informed consent, genuinely beggars belief.

It will not successfully create a moat - turns out files are portable - but it will successfully peeve a huge number of users and institutions off, and inevitably cause years of litigation and regulatory attention.

Are there no adults left at Microsoft? Or is it now just Copilot all the way up?

jayofdoom · 1h ago
Generally speaking, anyone can file a CVE. Go file one yourself and force their response. This blogpost puts forth reasonably compelling evidence.
aspenmayer · 1h ago
It’s true. The form is right here. When they support PGP, I suspect they know what they’re doing and why, and have probably been continuously doing so for longer than I have been alive. Just look at their sponsors and partners.

https://cveform.mitre.org/

Please only use this for legitimate submissions.

db48x · 1h ago
Fun, but it doesn’t deserve a CVE. CVEs are for vulnerabilities that are common across multiple products from multiple sources. Think of a vulnerability in a shared library that is used in most Linux distributions, or is statically linked into multiple programs. Copilot doesn’t meet that criterion.

Honestly, the worst thing about this story is that apparently the Copilot LLM is given the instructions to create audit log entries. That’s the worst design I could imagine! When they use an API to access a file or a url then the API should create the audit log. This is just engineering 101.

gpm · 1h ago
Huh, there are CVEs for windows components all the time, random example: https://msrc.microsoft.com/update-guide/vulnerability/CVE-20...

Including for end user applications, not libraries, another random example: https://msrc.microsoft.com/update-guide/vulnerability/CVE-20...

ecb_penguin · 59m ago
> CVEs are for vulnerabilities that are common across multiple products from multiple sources.

This is absolutely not true. I have no idea where you came up with this.

> Honestly, the worst thing about this story is that apparently the Copilot LLM is given the instructions to create audit log entries.

That's not at all what the article says.

> That’s the worst design I could imagine!

Ok, well, that's not how they designed it.

> This is just engineering 101.

Where is the class for reading 101?

QuadmasterXLII · 1h ago
This seems like a five alarm fire for HIPPA, is there something I’m missing?
Spooky23 · 1h ago
It’s a bug. He reported it, they fixed it.

It is not a five alarm fire for HIPAA. HIPAA doesn’t require that all file access be logged at all. HIPAA also doesn’t require that a CVE be created for each defect in a product.

End of the day, it’s a hand-wavy, “look at me” security blog. Don’t get too crazy.

loeg · 1h ago
It's HIPAA.
adzm · 1h ago
The HIPAA hippo certainly encourages this confusion
ivewonyoung · 1h ago
It's HIPPA now for all intensive purposes.
heywire · 1h ago
I am so tired of Microsoft cramming Copilot into everything. Search at $dayjob is completely borked right now. It shows a page of results, but then immediately pops up some warning dialog you cannot dismiss, saying Copilot can’t access some file “” or something. Every VSCode update I feel like I have to turn off Copilot in some new way. And now apparently it’ll be added to Excel as well. Thankfully I don’t have to use anything from Microsoft after work hours.
troad · 18m ago
> Every VSCode update I feel like I have to turn off Copilot in some new way.

This has genuinely made me work on switching to neovim. I previously demurred because I don't trust supply chains that are random public git repos full of emojis and Discords, but we've reached the point now where they're no less trustworthy than Microsoft. (And realistically, if you use any extensions on VS Code you're already trusting random repos, so you might as well cut out the middle man with an AI + spyware addiction and difficulties understanding consent.)

TheRoque · 12m ago
Same. Actually made me switch to neovim more and more. It's a great time to do so, with the new native package manager (now working in nightly 0.12)
candiddevmike · 1h ago
RE: VSCode copilot, you're not crazy, I'm seeing it too. And across multiple machines, even with settings sync enabled, I have to periodically go on each one and uninstall the copilot extension _again_. I'll notice the Add to chat... in the right click context menu and immediately know it got reinstalled somehow.

I'd switch to VSCodium but I use the WSL and SSH extensions :(

userbinator · 1h ago
> Thankfully I don’t have to use anything from Microsoft after work hours.

There are employers where you don't have to use anything from Microsoft during work hours either.

keyle · 1h ago
Everything except the best thing they could have brought back: Clippy! </3
xet7 · 1h ago
Josh5 · 1h ago
Are they even sure that the AI accessed the content that second time? LLMs are really good at making up shit. I have tested this by asking various LLMs to scrape data from my websites while watching access logs. Many times, they don't, and just rely on some sort of existing data or spout a bunch of BS. Gemini is especially bad like this. I have not used Copilot myself, but my experience with other AIs makes me curious about this.
bongodongobob · 1h ago
This is it. M365 uses RAG on your enterprise data that you allow it to access. It's not actually accessing the files directly in the cases he provided. It's working as intended.
crooked-v · 1h ago
If that's the case, then as noted in the article, the 'as intended' is probably violating liability requirements around various things.
thenaturalist · 1h ago
Hardly have I ever seen corporate incentives so aligned toward overhyping the capabilities of a technology that is this raw and unpolished.

The bubble bursting will be epic.

micromacrofoot · 1h ago
AI induced hysteria is probably wider spread than initially thought, these people are absolutely insane