Marimo Integration
Connect Marimo to LLM API for AI-powered capabilities
Marimo is a modern, reactive Python notebook with AI-powered code completion and assistance. Its AI features can be pointed at any custom OpenAI-compatible endpoint, including LLM API.
Prerequisites
- An LLM API account with an API key
- Marimo installed or accessible
Setup
Get Your LLM API Key
- Log in to your LLM API dashboard
- Click Create Key to generate a new API key
- Copy your new API key immediately — it will only be shown once
- Store the key securely (e.g., in a password manager or a .env file)
LLM API is an OpenAI-compatible gateway that gives you access to dozens of AI models through a single API key and endpoint.
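Because the gateway speaks the OpenAI wire format, any OpenAI-compatible client can reach it by swapping in the base URL above. A minimal, standard-library-only sketch of a chat completion request (the `/chat/completions` path and payload shape follow the OpenAI API convention; `build_chat_request` is an illustrative helper, not part of any SDK):

```python
import json
import os
import urllib.request

BASE_URL = "https://api.llmapi.ai/v1"  # the OpenAI-compatible gateway endpoint


def build_chat_request(model, prompt, base_url=BASE_URL):
    """Assemble an OpenAI-style chat completion request for the gateway."""
    payload = {
        "model": model,  # e.g. "openai/gpt-4o", as used in the marimo config below
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    # Requires a valid OPENAI_API_KEY in the environment; prints the reply text.
    with urllib.request.urlopen(build_chat_request("openai/gpt-4o", "Say hello")) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
```

Switching providers is just a matter of changing the model string; the key, endpoint, and request shape stay the same.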
Configure LLM API in Marimo
- Set environment variables:

```
export OPENAI_API_KEY="your-llm-api-key-here"
export OPENAI_BASE_URL="https://api.llmapi.ai/v1"
```

- Or configure in marimo settings:

```toml
# In marimo.toml or settings
[ai]
api_key = "your-llm-api-key-here"
base_url = "https://api.llmapi.ai/v1"
model = "openai/gpt-4o"
```

- Restart Marimo. AI completion will now use LLM API.
Test the Integration
Verify that Marimo can successfully communicate with LLM API by sending a test request. All requests will now be routed through LLM API.
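A lightweight way to confirm the key and endpoint work before relying on in-notebook completion is to fetch the model catalog. The `/models` endpoint follows the OpenAI API convention and is assumed to be supported by the gateway; `build_models_request` and `list_models` are illustrative helpers:

```python
import json
import os
import urllib.request


def build_models_request(base_url="https://api.llmapi.ai/v1"):
    """Build an authenticated GET request for the gateway's model catalog."""
    return urllib.request.Request(
        f"{base_url}/models",
        headers={"Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}"},
    )


def list_models(base_url="https://api.llmapi.ai/v1"):
    """Return available model ids; any result at all proves the key and URL are good."""
    with urllib.request.urlopen(build_models_request(base_url)) as resp:
        return [m["id"] for m in json.load(resp)["data"]]


if __name__ == "__main__":
    # Requires a valid OPENAI_API_KEY; prints a few available model ids.
    print(list_models()[:5])
```

If this succeeds, Marimo's AI completion, which uses the same key and base URL, should work as well.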
Combined with Marimo's reactive execution model, LLM API makes for a powerful interactive AI development environment.
Benefits of Using LLM API with Marimo
- Multi-Provider Access: Use models from OpenAI, Anthropic, Google, and more through a single API
- Cost Control: Track and limit your AI spending with detailed usage analytics
- Unified Billing: One account for all providers instead of managing multiple API keys
- Caching: Reduce costs with response caching for repeated requests
View all available models on the models page.