Marines managed to get past an AI powered camera "undetected" by hiding in boxes

36 voxadam 31 8/21/2025, 4:59:30 PM rudevulture.com ↗

Comments (31)

gm678 · 1h ago
Seems to be blogspam re-reporting a 2023 article (with the same header photo): https://taskandpurpose.com/news/marines-ai-paul-scharre/
jonas21 · 1h ago
And that article is a summary of a book that contains an interview with a guy who is describing a test that took place around 2017.
dkdcio · 1h ago
the modern internet is a magical place! we should ban advertisement to end this nonsense and waste of everybody’s time
jerf · 2h ago
I wonder if one could extract a "surprisedness" value out of the AI, basically, "the extent to which my current input is not modeled successfully by my internal models". Giving the model a metaphorical "WTF, human, come look at this" might be pretty powerful for those walking cardboard boxes and trees, to add to the cases where the model knows something is wrong. Or it might false positive all the darned time. Hard to tell without trying.
lazide · 1h ago
Why would the model know trees can’t walk?

Therein lies the rub.

jerf · 58m ago
English breaks down here, but the model probably does "know" something more like "If the tree is here in this frame, in the next frame, it will be there, give or take some waving in the wind". It doesn't know that "trees don't walk", just as it doesn't know that "trees don't levitate", "trees don't spontaneously turn into clowns", or an effectively infinite number of other things that trees don't do. What it possibly can do is realize that in frame 1 there was a tree, and then in frame 2, there was something the model didn't predict as a high-probability output of the next frame.

It isn't about knowing that trees don't walk, but that trees do behave in certain ways and noticing that it is "surprised" that they fail to behave in the predicted ways, where "surprise" is something like "this is a very low probability output of my model of the next frame". It isn't necessary to enumerate all the ways the next frame was low-probability, it is enough to observe that it was logically-not high probability.

In a lot of cases this isn't necessarily that useful, but in a security context having a human take a look at a "very low probability series of video frames" will, if nothing else, teach the developers a lot about the real capability of the model. If it spits out a lot of false positives, that is itself very informative about what the model is "really" doing.
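The "surprise" idea above can be sketched in a few lines. This is a toy illustration, not a real system: the learned world model is replaced here by a naive persistence predictor ("the next frame looks like the last one"), and the function names and threshold are made up for the example. The point is only that an anomaly flag needs no enumeration of what trees can't do, just a score for how badly the prediction missed.

```python
import numpy as np

def surprise_score(prev_frame: np.ndarray, next_frame: np.ndarray) -> float:
    """Surprise as prediction error. The 'model' here is a stand-in
    persistence predictor: it predicts the next frame equals the previous
    one, so the score is just the mean absolute pixel difference."""
    return float(np.mean(np.abs(next_frame.astype(float) - prev_frame.astype(float))))

def flag_surprising(frames, threshold=10.0):
    """Yield indices of frames whose prediction error exceeds the
    (arbitrary, illustrative) threshold -- the 'WTF, human, come look
    at this' signal."""
    for i in range(1, len(frames)):
        if surprise_score(frames[i - 1], frames[i]) > threshold:
            yield i

# Toy clip: a static scene with one abrupt, unmodeled change at frame 3.
frames = [np.zeros((8, 8)) for _ in range(5)]
frames[3] = np.full((8, 8), 50.0)
print(list(flag_surprising(frames)))  # [3, 4]: the change and the change back
```

A real learned model would predict swaying branches just fine (low surprise) while still scoring a tree that slides sideways as low-probability, which is exactly the distinction the persistence baseline can't make.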

9dev · 1h ago
The parent comment spelt this out: because the training data likely included only a few instances of walking trees (depending on how much material from the Lord of the Rings movies was used).
lazide · 1h ago
That is rather different than knowing trees can’t walk. That is ignoring things it hasn’t seen specific examples of.

And that is an entirely different problem, isn’t it?

HPsquared · 1h ago
You need a model trained on video, not just static frames. I'm sure Veo would never animate a walking tree, though. (Unless you asked it to)
FergusArgyll · 1h ago
IIUC distillation is sort of that: measure how big the delta is between teacher and student, then try to reconcile them.
giantg2 · 2h ago
Reminds me of the joke where someone is wearing dildo-patterned camouflage, since most of the AIs are trained on SFW corporate data.
ajuc · 1h ago
Disney camouflage will happen.
duxup · 2h ago
The nature of AI being a black box, one that fails in the face of "yeah those are some guys hiding in boxes" scenarios, is something I struggle with.

I'm working on some AI projects at work and there's no magic code I can see to know what it is going to do ... or even sometimes why it did it. Letting it loose in an organization like that seems unwise at best.

Sure they could tell the AI to watch out for boxes, but now every time some poor guy moves some boxes they're going to set off something.

collingreen · 1h ago
We've never been closer to a world that supports "three raccoons in a trenchcoat" successfully passing as a person.

The surface area of these issues is really fun.

erulabs · 1h ago
From a non-technical point of view, there's little to no difference between how you describe AI and most human employees.
mlinhares · 2h ago
Prompt: "shoot at any moving boxes"

Delivery guy shows up carrying boxes, gets shot.

kazinator · 1h ago
You can get past a human sentry who is looking for humans, by hiding in a box, at a checkpoint in which boxes customarily pass through without being opened or X-rayed.
thoroughburro · 1h ago
!
beacon473 · 1h ago
I heard that
sunrunner · 1h ago
Just a box
crimsoneer · 2h ago
SNAKE?!
creaturemachine · 1h ago
Way ahead of his time
robbru · 1h ago
Solid snake approved.
rolph · 2h ago
AI will screw up, humans will screw up.

humans will see that they are screwing up and reformulate the action plan.

AI will keep screwing up until it is stopped, and apparently will gaslight when attempts are made to realign at the prompt.

humans realize when results are not desirable.

AI just keeps generating output until plugpull.

FirmwareBurner · 2h ago
You don't need marines to invent that workaround, you see it in Looney Tunes.

Don't security cameras have universal motion-detection triggers you can use to make sure everything gets captured? Why pre-screen only for human silhouettes?

creaturemachine · 1h ago
The number of false positives using only motion is tiring. You want smart detections otherwise you're stuck reviewing endless clips of spider webs and swaying tree branches.
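The false-positive problem with motion-only triggers is easy to see in a toy sketch. This is an illustrative baseline, not any camera vendor's actual algorithm: plain frame differencing with made-up threshold parameters, which fires on any patch of changed pixels whether it's an intruder, a swaying branch, or a spider web on the lens.

```python
import numpy as np

def motion_pixels(prev: np.ndarray, curr: np.ndarray, pixel_thresh: int = 25) -> int:
    """Count pixels whose brightness changed by more than pixel_thresh."""
    return int(np.sum(np.abs(curr.astype(int) - prev.astype(int)) > pixel_thresh))

def motion_alert(prev: np.ndarray, curr: np.ndarray,
                 pixel_thresh: int = 25, min_pixels: int = 20) -> bool:
    """Naive motion trigger: alert whenever enough pixels change.
    It has no notion of *what* moved."""
    return motion_pixels(prev, curr, pixel_thresh) >= min_pixels

# A small bright patch moves -- a branch in the wind, not an intruder.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[10:20, 10:20] = 200  # 100 changed pixels
print(motion_alert(prev, curr))  # True: a false positive
```

Every gust of wind trips this, which is why vendors layer object classification ("smart detections") on top, and why that layer is exactly what the box trick defeats.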
Mistletoe · 1h ago
And moving boxes with people inside.

I’m reminded of the Skyrim shopkeepers with a basket on their head.

FirmwareBurner · 1h ago
If your use case has such a high bar, why not pay some offshore workers to watch your cameras 24/7 and manually flag intruders?

AGI for cameras is still very far away, and the number of false positives and creative camouflage workarounds is far too high for current "smart" algorithms to catch.

adiabatichottub · 1h ago
Because machines don't get bored or take smoke breaks. And, really, how would you feel if that was YOUR job?