Intercepting an LLM to transform every other token reveals surprising robustness
1 point by Shmungus on 6/9/2025, 12:18:35 AM | github.com
How much disruption can different model architectures handle? Does token position matter more than token content for meaning preservation? Could this be used for real-time LLM steering or interpretability research?
Not sure if this is useful to anyone else, but it's been a fun way to poke at how these systems actually work under the hood. The streaming interception approach might have applications beyond just corruption experiments.
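To make the idea concrete, here is a minimal sketch of the "intercept the stream and transform every other token" pattern, not the repo's actual implementation. All names here (`intercept_stream`, the fake token stream, the reversal transform) are illustrative assumptions; a real setup would wrap whatever streaming iterator the LLM client exposes.

```python
from typing import Callable, Iterable, Iterator


def intercept_stream(
    tokens: Iterable[str],
    transform: Callable[[str], str],
    every: int = 2,
) -> Iterator[str]:
    """Yield tokens from a stream, transforming every `every`-th token.

    With every=2, tokens at odd positions (0-indexed 1, 3, 5, ...) are
    transformed and the rest pass through unchanged, i.e. "every other token".
    """
    for i, tok in enumerate(tokens):
        yield transform(tok) if i % every == every - 1 else tok


if __name__ == "__main__":
    # Stand-in for a real streaming LLM response (hypothetical example).
    fake_llm_stream = iter("the quick brown fox jumps over the lazy dog".split())

    # Example corruption: reverse the characters of every other token.
    corrupted = intercept_stream(fake_llm_stream, lambda t: t[::-1])
    print(" ".join(corrupted))
```

Because the wrapper is just a generator around any token iterator, the same hook could apply other transforms (masking, substitution, steering edits) without touching the underlying model or client code.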