
Sourcegraph Cody Integration

Connect Sourcegraph Cody to LLM API for AI-powered capabilities

Sourcegraph Cody is an enterprise AI coding assistant that understands your entire codebase. It provides context-aware code generation, explanations, and refactoring powered by large language models, with support for custom model providers through the site configuration.

Cody Enterprise supports custom OpenAI-compatible providers through provider overrides in the site config, letting you connect LLM API for your organization.

Prerequisites

  • An LLM API account with an API key
  • A Sourcegraph Enterprise instance with site-admin access (provider overrides are set in the site configuration)

Setup

Get Your LLM API Key

  1. Log in to your LLM API dashboard
  2. Click Create Key to generate a new key
  3. Copy your new API key immediately — it will only be shown once
  4. Store the key securely (e.g., in a password manager or .env file)
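The steps above recommend keeping the key out of source code. A minimal sketch of loading it from the environment; the variable name LLM_API_KEY is an assumption, so use whatever your team standardizes on:

```python
import os

def load_api_key(env_var: str = "LLM_API_KEY") -> str:
    """Fetch the API key from the environment, failing loudly if it is missing."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(
            f"{env_var} is not set; export it or add it to your .env file"
        )
    return key
```

Failing at startup with a clear message is preferable to sending an empty Authorization header and debugging a 401 later.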

LLM API is an OpenAI-compatible gateway that gives you access to dozens of AI models through a single API key and endpoint.
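Because the gateway speaks the OpenAI API shape, a standard OpenAI-style chat completion request works against it directly. A sketch using only the standard library, assuming the https://api.llmapi.ai/v1/ endpoint and the openai/gpt-4o model name used in the configuration in this guide:

```python
import json
import os
import urllib.request

# Assumed endpoint and model name -- adjust to match your configuration.
BASE_URL = "https://api.llmapi.ai/v1"
API_KEY = os.environ.get("LLM_API_KEY", "your-llm-api-key-here")

# OpenAI-compatible chat completion payload.
payload = {
    "model": "openai/gpt-4o",
    "messages": [{"role": "user", "content": "Say hello"}],
}

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Uncomment to send the request once your key is configured:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response)["choices"][0]["message"]["content"])
```

This is the same request shape Cody will send through the provider override, so it is a quick way to confirm your key and endpoint work before touching the site configuration.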

Configure LLM API in Sourcegraph Cody (Enterprise)

  1. Open your Sourcegraph instance's Site Configuration.
  2. Add a provider override in the modelConfiguration section:
"modelConfiguration": {
"providerOverrides": [{
"id": "llm-api",
"displayName": "LLM API",
"serverSideConfig": {
"type": "openaicompatible",
"endpoints": [{
"url": "https://api.llmapi.ai/v1/",
"accessToken": "your-llm-api-key-here"
}]
}
}]
}
  3. Add a model override to register specific models:
     "modelOverrides": [{
       "modelRef": "llm-api::v1::openai/gpt-4o",
       "modelName": "openai/gpt-4o",
       "displayName": "GPT-4o via LLM API",
       "contextWindow": {
         "maxInputTokens": 128000,
         "maxOutputTokens": 4096
       },
       "capabilities": ["chat"]
     }]
  4. Save the site configuration. Cody will now offer LLM API models to users.

Test the Integration

Verify the integration by opening Cody chat, selecting one of the newly registered models, and sending a test message. Requests to those models are now routed through LLM API.

You can validate the configuration by checking the supported models endpoint: INSTANCE_URL/.api/modelconfig/supported-models.json
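A short sketch of checking that endpoint programmatically. The exact payload shape is an assumption here: it presumes a top-level "models" array whose entries carry a "modelRef" field, matching the model override registered in this guide; inspect the real response from your instance before relying on it.

```python
import json
import urllib.request

# Placeholder for your Sourcegraph instance URL.
INSTANCE_URL = "https://sourcegraph.example.com"
url = f"{INSTANCE_URL}/.api/modelconfig/supported-models.json"

def list_model_refs(config: dict) -> list:
    """Extract the modelRef of each registered model from the config payload."""
    return [m["modelRef"] for m in config.get("models", [])]

# Uncomment to query a live instance:
# with urllib.request.urlopen(url) as response:
#     print(list_model_refs(json.load(response)))

# Assumed response shape, based on the model override above:
sample = {"models": [{"modelRef": "llm-api::v1::openai/gpt-4o"}]}
print(list_model_refs(sample))
```

If "llm-api::v1::openai/gpt-4o" appears in the output, the provider and model overrides were accepted.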

Benefits of Using LLM API with Sourcegraph Cody

  • Multi-Provider Access: Use models from OpenAI, Anthropic, Google, and more through a single API
  • Cost Control: Track and limit your AI spending with detailed usage analytics
  • Unified Billing: One account for all providers instead of managing multiple API keys
  • Caching: Reduce costs with response caching for repeated requests

View all available models on the models page.
