Show HN: LLM-Exe – A Modular TypeScript Toolkit for LLM Application Development

I've recently updated llm-exe, a modular TypeScript library built specifically to simplify creating applications with Large Language Models (LLMs). The package lets you call LLMs from various providers without changing your underlying code.

The library focuses on a structured, lightweight, modular design that lets developers easily assemble complex workflows from reusable components:

Prompts: Structured templating for managing sophisticated prompts, with built-in Handlebars support (see the sketch after this list).

Parsers: Components that transform raw LLM responses (strings) into structured data, supporting JSON, arrays, enum extraction, and custom parsing logic.

LLM Providers: Abstracted interfaces for various providers, including OpenAI, Anthropic, xAI, Google Gemini, AWS Bedrock, and Ollama, allowing seamless switching without changing implementation logic (see the provider-swap sketch after the main example below).

Executors: An LLM executor takes an LLM, a prompt, and optionally a parser, and wraps them in a well-typed function. It acts as a container for calling an LLM with a pre-defined input and output shape, with additional values supplied at execution time. The executor's input and output types are determined by the prompt and parser, respectively.

Utilities: Helpers for common tasks such as prompt debugging, caching, and managing conversation state.
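
For instance, the same createChatPrompt call used in the full example below can be typed against a richer input shape, with each Handlebars placeholder filled from the corresponding property of the input object. A minimal sketch (the interface and field names here are purely illustrative, not part of the library):

---
import { createChatPrompt } from 'llm-exe';

// A prompt typed against a richer input shape; each Handlebars
// placeholder ({{...}}) maps to a property of the input object.
// The interface and field names are illustrative only.
interface SummarizeInput {
  articleText: string;
  audience: string;
  maxSentences: number;
}

const summarizePrompt = createChatPrompt<SummarizeInput>(
  'Summarize the following article for {{audience}} in at most {{maxSentences}} sentences:\n\n{{articleText}}'
);
---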

Here's how you might create a structured executor:

---
import { createLlmExecutor, createChatPrompt, createParser, useLlm } from 'llm-exe';

const llm = useLlm("openai.gpt-4o-mini");

const prompt = createChatPrompt<{ input: string }>(
  'Translate the phrase "{{input}}" into French, Spanish, and German, returning the results as an unordered markdown list.'
);

const parser = createParser('listToArray');

const translateExecutor = createLlmExecutor({ llm, prompt, parser });

// The input is well-typed ({ input: string }) and the result is typed as string[]
const result = await translateExecutor.execute({ input: "Hello, world!" });

console.log(result);
// Outputs: ["Bonjour le monde!", "¡Hola, mundo!", "Hallo, Welt!"]
---
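
Because the executor only depends on the handle returned by useLlm, pointing the same prompt and parser at a different provider is a one-line change. A sketch continuing from the example above (the Anthropic model identifier is an assumed example; check the docs for the exact provider strings):

---
// Reuses the prompt and parser defined above; only the model
// identifier passed to useLlm changes. "anthropic.claude-3-5-sonnet"
// is an assumed identifier; see the provider docs for exact names.
const claude = useLlm("anthropic.claude-3-5-sonnet");

const translateWithClaude = createLlmExecutor({ llm: claude, prompt, parser });

const result2 = await translateWithClaude.execute({ input: "Hello, world!" });
// result2 is still typed as string[]; nothing else in the calling code changed.
---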

Check out the documentation and more examples here: llm-exe.com. I'd love your feedback or contributions!
