KoboldAI Integration
Connect KoboldAI to LLM API for AI-powered capabilities
KoboldAI is an open-source browser-based front-end for text generation AI models. It's popular among creative writers for story generation, roleplay, and interactive fiction, with support for both local and remote models.
KoboldAI's OpenAI-compatible mode lets you connect remote endpoints like LLM API.
Prerequisites
- An LLM API account with an API key
- KoboldAI installed or accessible
Setup
Get Your LLM API Key
- Log in to your LLM API dashboard
- Click Create Key to generate a new API key
- Copy your new API key immediately — it will only be shown once
- Store the key securely (e.g., in a password manager or .env file)
LLM API is an OpenAI-compatible gateway that gives you access to dozens of AI models through a single API key and endpoint.
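Before wiring up KoboldAI, you can sanity-check your key with a plain OpenAI-style chat completion request. The sketch below uses only the Python standard library; the `LLM_API_KEY` environment variable name and the model ID placeholder are assumptions, not names from LLM API itself.

```python
import json
import os
import urllib.request

# Base endpoint from this guide.
BASE_URL = "https://api.llmapi.ai/v1"

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send it (LLM_API_KEY is an assumed variable name):
#   req = build_chat_request("<model-id>", "Hello", os.environ["LLM_API_KEY"])
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

If this request succeeds from the command line, any failure inside KoboldAI is a configuration issue rather than a key or endpoint problem.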
Configure LLM API in KoboldAI
- Open KoboldAI in your browser.
- Click "AI" in the top menu and select "Connect to OpenAI-compatible API".
- Enter:
- API Endpoint: https://api.llmapi.ai/v1
- API Key: paste the key you copied from app.llmapi.ai/api-keys
- Select the model ID.
- Click "Connect".
Tip: KoboldAI's advanced generation settings (temperature, top-p, etc.) work with all LLM API models.
Test the Integration
Verify the connection by sending a short generation request from the KoboldAI interface. If a response comes back, the integration is working, and all requests will now be routed through LLM API.
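Because the endpoint is OpenAI-compatible, KoboldAI's sampler settings travel as standard request-body parameters. The sketch below approximates that forwarded body; the exact mapping from KoboldAI's UI fields is an assumption, but the parameter names and ranges follow the OpenAI chat completions convention.

```python
def sampler_payload(model: str, prompt: str,
                    temperature: float = 0.7,
                    top_p: float = 0.9,
                    max_tokens: int = 256) -> dict:
    """Approximate the request body an OpenAI-compatible front-end
    forwards when sampler settings are applied (illustrative values)."""
    # OpenAI-compatible APIs accept temperature in [0, 2] and top_p in (0, 1].
    assert 0.0 <= temperature <= 2.0
    assert 0.0 < top_p <= 1.0
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "top_p": top_p,
        "max_tokens": max_tokens,
    }
```

Lower temperature and top-p make story continuations more predictable; raising them increases variety, which is often what you want for creative writing.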
Benefits of Using LLM API with KoboldAI
- Multi-Provider Access: Use models from OpenAI, Anthropic, Google, and more through a single API
- Cost Control: Track and limit your AI spending with detailed usage analytics
- Unified Billing: One account for all providers instead of managing multiple API keys
- Caching: Reduce costs with response caching for repeated requests
View all available models on the models page.