ChatHub Integration

Connect ChatHub to LLM API for AI-powered capabilities

ChatHub is a browser extension that lets you chat with multiple AI models simultaneously in a split-screen interface. It supports ChatGPT, Claude, Gemini, and custom OpenAI-compatible endpoints.

ChatHub's custom provider feature lets you add LLM API as an additional chat source.

Prerequisites

  • An LLM API account with an API key
  • The ChatHub browser extension installed

Setup

Get Your LLM API Key

  1. Log in to your LLM API dashboard
  2. Click Create Key to generate a new API key
  3. Copy your new API key immediately — it will only be shown once
  4. Store the key securely (e.g., in a password manager or .env file)
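
If you opt for a `.env` file, it might look like the sketch below. The variable name and key value are placeholders, not values the guide prescribes:

```shell
# .env -- keep this file out of version control (add it to .gitignore)
# Placeholder value; paste the key you copied from the dashboard.
LLM_API_KEY=sk-your-key-here
```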

LLM API is an OpenAI-compatible gateway that gives you access to dozens of AI models through a single API key and endpoint.
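Because the gateway is OpenAI-compatible, any client that can send a standard `/chat/completions` request will work with it. The sketch below builds such a request with only the Python standard library; the base URL, key, and model ID are hypothetical placeholders, not values from this guide:

```python
import json
import urllib.request

# Hypothetical placeholders -- substitute your real gateway URL, key, and model ID.
BASE_URL = "https://api.llm-api.example/v1"
API_KEY = "sk-your-key-here"


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request (constructed, not sent)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request("gpt-4o", "Hello!")
# urllib.request.urlopen(req)  # uncomment to actually send the request
print(req.full_url)
```

ChatHub sends requests of exactly this shape on your behalf once the custom provider is configured, which is why only a base URL and key are needed.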

Configure LLM API in ChatHub

  1. Install the ChatHub browser extension.
  2. Open ChatHub settings.
  3. Navigate to "Custom API" or "OpenAI Compatible" section.
  4. Enter the LLM API base URL and your API key.
  5. Add the model ID you want to use.
  6. Click "Save".

Test the Integration

Verify that ChatHub can successfully communicate with LLM API by sending a test request. All requests will now be routed through LLM API.
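If you want to sanity-check the endpoint outside ChatHub, a successful chat completion from an OpenAI-compatible API returns JSON like the illustrative sample below (the values are made up, not real gateway output):

```python
import json

# Illustrative response shape for an OpenAI-compatible chat completion;
# the field values here are examples, not real API output.
sample_response = json.loads("""
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "choices": [
    {"index": 0,
     "message": {"role": "assistant", "content": "Hello from LLM API!"},
     "finish_reason": "stop"}
  ],
  "usage": {"prompt_tokens": 5, "completion_tokens": 6, "total_tokens": 11}
}
""")


def extract_reply(response: dict) -> str:
    """Pull the assistant's text out of a chat completion response."""
    return response["choices"][0]["message"]["content"]


print(extract_reply(sample_response))  # -> Hello from LLM API!
```

Seeing a well-formed `choices[0].message.content` field confirms the endpoint and key are working end to end.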

ChatHub's split-screen lets you compare LLM API responses with other providers in real time.

Benefits of Using LLM API with ChatHub

  • Multi-Provider Access: Use models from OpenAI, Anthropic, Google, and more through a single API
  • Cost Control: Track and limit your AI spending with detailed usage analytics
  • Unified Billing: One account for all providers instead of managing multiple API keys
  • Caching: Reduce costs with response caching for repeated requests

View all available models on the models page.
