# Installation

Install LM Deluge using pip (Python 3.10+):
```shell
python -m pip install -U lm-deluge
```

Optional extras:
- `pip install plyvel` if you want LevelDB-backed local caching
- `pip install pdf2image pillow` if you plan to turn PDFs into images via `Image.from_pdf`
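Because these extras are optional, your own code may want to check whether they are installed before enabling the corresponding features. A minimal sketch (not part of LM Deluge) using only the standard library:

```python
import importlib.util


def extra_available(module_name: str) -> bool:
    """Return True if an optional dependency is importable, without importing it."""
    return importlib.util.find_spec(module_name) is not None


# LevelDB caching needs plyvel; PDF-to-image support needs pdf2image and pillow
# (pillow's import name is "PIL").
caching_ok = extra_available("plyvel")
pdf_ok = extra_available("pdf2image") and extra_available("PIL")
```

`find_spec` only consults the import machinery, so the check is cheap and does not trigger any side effects the module might have at import time.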
## API Keys

LM Deluge reads API keys from environment variables so the client can contact each provider directly. Load them at process startup (for example with python-dotenv) and pass the values down to your workers, CLI scripts, or notebook kernels.
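Since a missing key only surfaces as an error once a request is made, it can help to fail fast at startup. A hypothetical helper (not part of LM Deluge) that checks the environment before you construct a client:

```python
import os


def require_keys(*names: str) -> None:
    """Raise immediately if any required API key is missing from the environment."""
    missing = [name for name in names if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing API keys: {', '.join(missing)}")


# Example: only check the keys for providers you actually use.
# require_keys("OPENAI_API_KEY", "ANTHROPIC_API_KEY")
```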
### Required Environment Variables

Depending on which providers you plan to use, set the appropriate API keys:
```shell
# OpenAI
OPENAI_API_KEY=sk-...

# Anthropic
ANTHROPIC_API_KEY=sk-ant-...

# Google AI Studio (Gemini)
GEMINI_API_KEY=...

# Cohere
COHERE_API_KEY=...

# OpenRouter (any model prefixed with openrouter:)
OPENROUTER_API_KEY=...

# Meta, Groq, DeepSeek, etc. use provider-specific keys defined in src/lm_deluge/models

# AWS Bedrock (for Amazon-hosted Anthropic and Meta models)
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
```
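For a quick one-off session on a Unix-like shell, the same keys can also be exported directly instead of going through a `.env` file (replace the placeholders with real values):

```shell
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
```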
### .env File

Create a `.env` file in your project root:
```shell
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GEMINI_API_KEY=...
```

Load the file inside your application:
```python
import dotenv

dotenv.load_dotenv()
```
## Verification

Verify your installation by running a simple test:
```python
import dotenv
from lm_deluge import LLMClient

dotenv.load_dotenv()

client = LLMClient("gpt-4.1-mini")
responses = client.process_prompts_sync(["Say hello!"])
print(responses[0].completion)
```

If you see a response from the model, you’re all set!
## Next Steps

Head over to the Quick Start guide to learn how to use LM Deluge effectively.