Focus and Context and LLMs

27 points by tarasglek · 7 comments · 6/8/2025, 9:09:19 AM · taras.glek.net

Comments (7)

artembugara · 29m ago
What are some startups that help precisely with “feeding the LLM the right context”?
__mharrison__ · 1h ago
Building complex software is certainly possible with no coding and minimal prompting.

This YT video (from 2 days ago) demonstrates it: https://youtu.be/fQL1A4WkuJk?si=7alp3O7uCHY7JB16

The author builds a drawing app in an hour.

summarity · 1h ago
I found the same in my personal work. I have o3 chats (as in OAI's Chat interface) that are so large they crash the site, yet o3 still responds without hallucination and can debug across 5k+ LOC. I've used it for DSP code, to debug a subtle error in an 800+ LOC Nim macro that sat in a 4k+ LOC module (it found the bug), to work on compute shaders for audio analysis, and to optimize graphics programs and other algorithms. Once I "vibe coded" (I hate that term) a fun demo using a color management lib I wrote, which encoded the tape state for a brainfuck interpreter in the deltaE differences between adjacent cells. Replaying the same prompts in Claude chat and others doesn't even get close. It's spooky.

Yet when I use the Codex CLI, or agent mode in any IDE, it feels like o3 regresses to below GPT-3.5 performance. All recent agent-mode models seem completely overfitted to tool calling. The most laughable attempt is Mistral's devstral-small - allegedly the #1 agent model, but outside the scenarios you'd encounter in SWE-bench & co, it completely falls apart.

I notice this at work as well: the more tools you give any model (reasoning or not), the more confused it gets. But the alternative is to stuff massive context into the prompts, and that has no ROI. There's a fine line to be walked here, but no one is even close to it yet.
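
To make that tradeoff concrete, here's a minimal sketch assuming the OpenAI Python client (openai >= 1.x); the tool list, model id, prompt, and file name are placeholders, not a real setup:

    from openai import OpenAI

    client = OpenAI()

    QUESTION = "Find the subtle bug in the macro below."  # illustrative prompt

    # Option A: agent-style. Register a pile of tools and let the model decide
    # which to call. (Tool names and schemas are placeholders.)
    many_tools = [
        {
            "type": "function",
            "function": {
                "name": f"tool_{i}",
                "description": f"Placeholder tool number {i}",
                "parameters": {"type": "object", "properties": {}},
            },
        }
        for i in range(20)
    ]

    agent_reply = client.chat.completions.create(
        model="o3",  # assumed model id
        messages=[{"role": "user", "content": QUESTION}],
        tools=many_tools,
    )

    # Option B: context stuffing. No tools at all; paste the relevant source
    # straight into the prompt. The prompt gets big, but there is nothing
    # tool-related for the model to get confused by.
    source = open("macros.nim").read()  # hypothetical file
    stuffed_reply = client.chat.completions.create(
        model="o3",
        messages=[{"role": "user", "content": QUESTION + "\n\n" + source}],
    )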

emorning3 · 1h ago
The article summed itself up as "Context is everything".

But the article itself also makes the point that a human assistant was necessary. That's gonna be my takeaway.

quantum_state · 4h ago
Context is all you need :-)
tarasglek · 4h ago
Indeed, that was my original working title
max2he · 1h ago
bruh that's Google's original working title