Show HN: Sllm.nvim – Integrate Simon’s LLM cli into Neovim (500 LOC Lua)

mozanunal · 5/21/2025, 11:19:54 AM · github.com
Hi HN,

I built [sllm.nvim](https://github.com/mozanunal/sllm.nvim), a minimal (about 500 lines of Lua) Neovim plugin to bring Simon Willison’s excellent `llm` CLI directly into your coding workflow.

- Chat with LLMs (OpenAI, OpenRouter, etc.) in a split buffer.
- Add files, visual selections, shell command outputs, LSP diagnostics, or URLs as fragments to your LLM context, all from Neovim.
- Async streaming jobs — never block the editor.
- Switch LLM models, see token/cost stats, and use keybindings for everything.

Inspired by Simon’s blog posts on long-context LLM workflows and on managing context/fragments from the terminal, I wanted to make that workflow seamless inside the editor. It uses [mini.nvim](https://github.com/echasnovski/mini.nvim) for the UI, but the core logic is just ~500 LOC.
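
Installation follows the usual plugin-manager pattern. Here is a minimal sketch with lazy.nvim; the `setup()` options shown are illustrative placeholders, and the README documents the actual configuration keys and default keybindings:

```lua
-- Illustrative lazy.nvim spec; the setup() options below are placeholders,
-- see the repo README for the real configuration keys and keymaps.
{
  "mozanunal/sllm.nvim",
  dependencies = { "echasnovski/mini.nvim" },
  config = function()
    require("sllm").setup({
      -- placeholder option; use whichever model your `llm` install provides
      -- default_model = "gpt-4o-mini",
    })
  end,
}
```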

Feedback/questions welcome! Thanks to Simon Willison & the llm community for all the inspiration.

Comments (1)

mozanunal · 7h ago
Why sllm.nvim?

The [`llm`](https://llm.datasette.io/en/stable/) command-line tool by Simon Willison (co-creator of Django, creator of Datasette and sqlite-utils) is a wonderfully extensible way to interact with Large Language Models. Its power lies in its simplicity and vast plugin ecosystem, allowing users to tap into numerous models directly from the terminal.

I was particularly inspired by Simon's explorations into `llm`'s [fragment features for long-context LLMs](https://simonwillison.net/2025/Apr/7/long-context-llm/). It struck me how beneficial it would be to seamlessly manage and enrich this context directly within Neovim, my primary development environment.

Like many developers, I found myself frequently switching to web UIs like ChatGPT, painstakingly copying and pasting code snippets, file contents, and error messages to provide the necessary context for the AI. This interruption broke my workflow and felt inefficient. `sllm.nvim` was born out of the desire to streamline this process.

Contained within around 500 lines of Lua, it aims to be a simple yet powerful Neovim plugin. The heavy lifting of LLM interaction is delegated to the robust `llm` CLI. For the user interface components, I've chosen to leverage the excellent utilities from `mini.nvim` – a library I personally use for my own Neovim configuration – and plan to continue using its modules for any future UI enhancements. The focus of `sllm.nvim` is to orchestrate these components to manage LLM context and chat without ever leaving the editor.
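
To make the "heavy lifting is delegated" point concrete, the core idea is roughly the sketch below: run `llm` as an asynchronous Neovim job and stream its stdout into a scratch split. This is a simplified illustration rather than the plugin's actual code; the `-f` flag is `llm`'s fragments feature from Simon's post.

```lua
-- Simplified sketch of the core idea (not the actual sllm.nvim source):
-- run `llm` as an async job and append its streamed output to a scratch buffer.
local function stream_llm(prompt, fragments)
  local cmd = { "llm", "prompt", prompt }
  for _, frag in ipairs(fragments or {}) do
    -- `llm -f` attaches a file/URL fragment to the prompt context
    table.insert(cmd, "-f")
    table.insert(cmd, frag)
  end

  local buf = vim.api.nvim_create_buf(false, true) -- scratch buffer for the reply
  vim.cmd("vsplit")
  vim.api.nvim_win_set_buf(0, buf)

  vim.fn.jobstart(cmd, {
    on_stdout = function(_, data)
      -- chunks arrive as the model streams; the editor never blocks
      vim.api.nvim_buf_set_lines(buf, -1, -1, false, data)
    end,
  })
end

-- e.g. ask about the file in the current buffer:
-- stream_llm("explain this code", { vim.api.nvim_buf_get_name(0) })
```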

As Simon Willison also discussed in his post on [using LLMs for code](https://simonwillison.net/2025/Mar/11/using-llms-for-code/), effective context management is key. `sllm.nvim` aims to significantly contribute to such a workflow by making context gathering and LLM interaction a native part of the Neovim experience.