I've been developing a lot of LLM-based apps recently, and I got tired of writing the same glue code each time: switching providers, hunting down the base URL again, and working around provider idiosyncrasies. So I extracted that code into a package named BorgLLM.
It's a zero-config LangChain client for hundreds of providers & models, with automatic API key rotation, rate-limit fallback strategies, and other concerns you no longer have to handle in your application code.
One-liner to rule them all (you can pick any model or any provider):
llm = create_llm("anthropic:claude-sonnet-4")
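Roughly, a minimal end-to-end sketch (assuming the import is from the borgllm package, and that the returned object is a standard LangChain chat model so the usual .invoke() works):

from borgllm import create_llm  # import path assumed from the package name

llm = create_llm("anthropic:claude-sonnet-4")
response = llm.invoke("Explain API key rotation in one sentence.")
print(response.content)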
And my application code is finally clean and focused on the domain again.
I hope you'll find it useful. Questions and feedback are most welcome!