Provider Configuration
How to configure Anthropic, OpenAI, and Ollama with AI Agent Flow.
Last updated 2026-02-28
AI Agent Flow supports both cloud-based and local LLM providers. Configuration is handled via environment variables or a config file at ~/.aiagentflow/config.json in your home directory.
Pro Tip: For autonomous engineering tasks, we recommend Claude 3.5 Sonnet, which in our experience handles complex architectural reasoning particularly well.
Anthropic (Recommended)
Anthropic's Claude 3.5 Sonnet is the recommended model for architecture and coding tasks due to its superior reasoning capabilities.
Environment Variable:
export ANTHROPIC_API_KEY="your-api-key"
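An `export` in the terminal only lasts for the current session. If you want the key to survive new shells, append it to your shell profile; a minimal sketch, assuming bash (use ~/.zshrc on zsh):

```shell
# Persist the key so new shell sessions pick it up.
echo 'export ANTHROPIC_API_KEY="your-api-key"' >> ~/.bashrc

# Verify the variable is set without printing the secret itself.
if [ -n "$ANTHROPIC_API_KEY" ]; then
  echo "ANTHROPIC_API_KEY is set"
else
  echo "ANTHROPIC_API_KEY is missing (open a new shell or source ~/.bashrc)"
fi
```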
OpenAI
Compatible with GPT-4o and GPT-4-turbo.
Environment Variable:
export OPENAI_API_KEY="your-api-key"
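You can also select OpenAI for the current shell session. This sketch assumes the `AI_PROVIDER` variable documented in the Ollama section below also accepts `"openai"`; only `"ollama"` is shown explicitly in this guide:

```shell
export OPENAI_API_KEY="your-api-key"

# Assumption: AI_PROVIDER accepts "openai" as well as "ollama".
export AI_PROVIDER="openai"

echo "Active provider: $AI_PROVIDER"
```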
Ollama (Local First)
For maximum privacy, you can run AI Agent Flow with local models using Ollama.
- Install Ollama
- Pull a supported model:
  ollama pull llama3
- Configure AI Agent Flow:
  export AI_PROVIDER="ollama"
  export OLLAMA_MODEL="llama3"
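Before routing requests to Ollama, you can check that the local server is actually running. A sketch assuming Ollama's default local API port (11434) and its `/api/tags` endpoint, which lists pulled models:

```shell
# Quick health check against Ollama's local HTTP API.
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  status="reachable"
else
  status="not running (start it with: ollama serve)"
fi
echo "Ollama server: $status"
```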
Setting the Default Provider
You can switch the default provider globally in your ~/.aiagentflow/config.json:
{
"default_provider": "anthropic",
"temperature": 0.2
}
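To create the config from scratch, write the file and sanity-check that it parses as JSON. A minimal sketch; no keys beyond `default_provider` and `temperature` are documented here:

```shell
# Create the global config directory and file.
mkdir -p ~/.aiagentflow
cat > ~/.aiagentflow/config.json <<'EOF'
{
  "default_provider": "anthropic",
  "temperature": 0.2
}
EOF

# Confirm the file is valid JSON before relying on it.
python3 -m json.tool ~/.aiagentflow/config.json >/dev/null && echo "config OK"
```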