Monday, November 03, 2025
All the Bits Fit to Print
Open-source, ChatGPT-like interface supports multiple LLM providers online and offline
llms.py is a lightweight, open-source Python tool that provides a unified interface, online and offline, for accessing and managing large language models (LLMs) from multiple providers, with an emphasis on privacy and cost efficiency.
Why it matters: Lets users mix local and cloud LLMs seamlessly, cutting costs and preserving data privacy by keeping data in the browser or on the local machine.
The big picture: Supports over 160 models across providers such as OpenAI, Google, Anthropic, and local Ollama, exposed through a single OpenAI-compatible API and a user-friendly UI.
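Because the server speaks the OpenAI chat-completions protocol, existing OpenAI client code can typically be pointed at a running llms.py instance with only a base-URL change. The sketch below uses the official openai Python client; the port, the /v1 path, and the model name are illustrative assumptions, not confirmed defaults of llms.py:

    from openai import OpenAI

    # Point the standard OpenAI client at a locally running llms.py server.
    # Base URL and model id below are assumptions for illustration only.
    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

    response = client.chat.completions.create(
        model="llama3.3:70b",  # hypothetical local model served via Ollama
        messages=[{"role": "user", "content": "Summarize what llms.py does in one sentence."}],
    )
    print(response.choices[0].message.content)

The same client call would route to a cloud provider simply by naming one of the configured cloud models instead of a local one.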
Quick takeaway: Features include multi-modal input (text, images, audio, files), built-in analytics, GitHub OAuth authentication, Docker support, and automatic provider failover.
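For multi-modal input, OpenAI-compatible servers generally accept the standard content-array form of a chat message, mixing text and image parts; whether llms.py honors this exact shape for every configured provider is an assumption here. A minimal sketch sending an image alongside a text prompt:

    import base64
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

    # Encode a local image as a data URL, the usual way to pass images
    # through the OpenAI-compatible chat-completions format.
    with open("chart.png", "rb") as f:
        image_data = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical vision-capable model id
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this chart."},
                {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{image_data}"}},
            ],
        }],
    )
    print(response.choices[0].message.content)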
Commenters say: Users appreciate the flexibility of multi-provider support and the offline capability, but raise concerns about setup complexity and managing API keys across services.