AI Providers
Configure multiple AI providers - Anthropic, OpenAI, Google, Ollama, and AWS Bedrock
Foxl supports multiple AI providers. You can use Foxl credits through the cloud relay, or bring your own API keys for direct access.
Default: Foxl Relay
By default, Foxl routes AI requests through the cloud relay at relay.foxl.ai. This uses your Foxl credits and provides access to Claude models via AWS Bedrock.
No configuration needed - sign in and start chatting.
OAuth Providers (Subscription Login)
Use your existing Claude, Gemini, or ChatGPT subscriptions directly in Foxl - no API key needed. Foxl reads OAuth credentials created by each vendor's login flow.
OAuth providers are desktop-only. Authentication stays local - your subscription tokens never leave your machine except to reach the vendor's API. No Foxl credits consumed.
Unofficial community integration. Treat the auth file like a password - keep it on a trusted machine, don't share it, and don't pool access. Misuse may violate the provider's terms and result in rate limits or account suspension. Use at your own risk.
Gemini CLI (Google AI)
Use your Google AI subscription or API key.
- Install: `npm install -g @google/gemini-cli`
- Authenticate: `gemini auth login` or set `GEMINI_API_KEY`
- Select Gemini CLI in Settings > Providers
Models: Gemini 2.5 Pro, Gemini 2.5 Flash. Credentials are read from `~/.gemini/settings.json` or the `GEMINI_API_KEY` environment variable.
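The provider needs one of the two credential sources above. A sketch of the selection logic — the function, and the assumption that the environment variable wins over the OAuth file, are illustrative, not Foxl's actual code:

```python
import os
from pathlib import Path

def gemini_credential_source(env: dict, home: Path) -> str:
    # Illustrative only: which Gemini credential source is available.
    # Env-var-first precedence is an assumption, not documented behavior.
    if env.get("GEMINI_API_KEY"):
        return "api-key"   # direct API key from the environment
    if (home / ".gemini" / "settings.json").exists():
        return "oauth"     # credentials created by `gemini auth login`
    return "none"          # nothing configured yet

print(gemini_credential_source(dict(os.environ), Path.home()))
```

If this prints `none`, revisit the two setup steps before selecting the provider.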
Claude Code (Anthropic Pro/Max)
Use your Claude Pro or Claude Max subscription. Foxl calls api.anthropic.com directly with your Claude Code OAuth token - no CLI subprocess.
- Run the official login once inside Claude Code: `claude /login` (or `claude setup-token`)
- Select Claude Code (OAuth) in Settings > Providers, then pick Opus 4.7 or Sonnet 4.6 from the model selector
Models: Opus 4.7 and Sonnet 4.6 are exposed in compatibility mode so requests route through your Claude Pro/Max subscription instead of pay-as-you-go "Extra usage". Tokens auto-refresh in the background.
Compatibility mode disables Foxl-specific tools (memory, subagents, schedules, channel send, browser extension, view image). Only Bash, Read, Grep, and WebFetch are available while Claude Code OAuth is active. If you need the full Foxl tool surface, use an Anthropic API key (BYOK) or the foxl.ai relay instead. Haiku 4.5 is intentionally not exposed in this mode - it does not reliably stay on the subscription pool once more than a few tools are in play.
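The restriction above amounts to an allow-list over the tool surface. This is an illustrative sketch, not Foxl's implementation; only the two tool sets come from the text:

```python
# The two tool sets named in the compatibility-mode note above.
FOXL_ONLY_TOOLS = {"memory", "subagents", "schedules", "channel send",
                   "browser extension", "view image"}
COMPAT_TOOLS = {"Bash", "Read", "Grep", "WebFetch"}

def available_tools(claude_code_oauth: bool, full_toolset: set) -> set:
    # Under Claude Code OAuth only the compat allow-list survives;
    # otherwise the full tool surface is untouched.
    return full_toolset & COMPAT_TOOLS if claude_code_oauth else full_toolset

full = COMPAT_TOOLS | FOXL_ONLY_TOOLS
print(sorted(available_tools(True, full)))   # the four compat tools only
```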
OpenAI (ChatGPT Plus/Pro)
Use your ChatGPT Plus or Pro subscription - Foxl calls OpenAI's Codex Responses API directly with your OAuth token. No local CLI subprocess or proxy server needed.
- Run the official login once: `npx @openai/codex login` (creates `~/.codex/auth.json`)
- Select OpenAI (OAuth) in Settings > Providers
Models: GPT-5.5 (1M context), GPT-5.4 (1M context), GPT-5.4 Mini (400K), GPT-5.3 Codex (400K). The live catalog is fetched from Codex per account, so your actual selection may differ slightly based on what ChatGPT rolls out to your tier. Tokens auto-refresh in the background via the OAuth refresh flow.
Image generation is available through the generate_image tool (powered by gpt-image-2), which uses the same OAuth credentials. See Agent Tools - Media for details.
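Token auto-refresh follows the standard OAuth 2.0 refresh-token grant (RFC 6749 §6). A minimal sketch of the request body such a refresh sends — the token and client ID values here are placeholders, and the real ones live in `~/.codex/auth.json`:

```python
def refresh_request(refresh_token: str, client_id: str) -> dict:
    # Standard OAuth 2.0 refresh-token grant body.
    return {
        "grant_type": "refresh_token",   # fixed by the OAuth spec
        "refresh_token": refresh_token,  # long-lived token from the login flow
        "client_id": client_id,          # identifies the client application
    }

body = refresh_request("rt_example", "client_example")
print(body["grant_type"])  # refresh_token
```

The response carries a fresh access token, which is why the login step only needs to run once.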
Bring Your Own Key (BYOK)
Add your own API keys to use models directly without consuming Foxl credits. Works on both desktop and web (app.foxl.ai).
Anthropic
Direct access to Claude models via the Anthropic API.
- Go to Settings > Providers
- Select Anthropic
- Enter your API key from console.anthropic.com
- Select a model (Claude Opus 4.7, Opus 4.6, Sonnet 4.6, Haiku 4.5)
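Under the hood a BYOK call is an ordinary Anthropic Messages API request authenticated with your key. A sketch of the required headers — `x-api-key` and `anthropic-version` are Anthropic's documented headers, and `2023-06-01` is the published version string at the time of writing:

```python
def anthropic_headers(api_key: str) -> dict:
    # Headers for a direct call to api.anthropic.com with a BYOK key.
    return {
        "x-api-key": api_key,               # the key from console.anthropic.com
        "anthropic-version": "2023-06-01",  # required API version header
        "content-type": "application/json",
    }

print(sorted(anthropic_headers("sk-ant-example")))
```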
OpenAI
Access GPT-4o, GPT-4, and other OpenAI models.
- Go to Settings > Providers
- Select OpenAI
- Enter your API key from platform.openai.com
- Select a model
Google (Gemini)
Access Gemini models via Google AI.
- Go to Settings > Providers
- Select Google
- Enter your API key from aistudio.google.com
- Select a Gemini model
Ollama (Local Models)
Run AI models entirely on your machine with zero cost and full privacy.
- Install Ollama from ollama.com
- Pull a model: `ollama pull llama3` or `ollama pull mistral`
- In Foxl Settings > Providers, select Ollama
- Foxl auto-detects running Ollama models
Ollama models run entirely on your machine. No internet connection, no API calls, no credits consumed. Perfect for privacy-sensitive work.
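Auto-detection presumably relies on Ollama's local REST API, which listens on port 11434 by default; `GET /api/tags` returns the models you have pulled. The response shape below is Ollama's documented one, but treating this as Foxl's exact detection code is an assumption:

```python
import json
from urllib.request import urlopen

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # Ollama's default local endpoint

def model_names(tags_response: dict) -> list[str]:
    # /api/tags returns {"models": [{"name": "llama3:latest", ...}, ...]}
    return [m["name"] for m in tags_response.get("models", [])]

def detect_local_models(url: str = OLLAMA_TAGS_URL) -> list[str]:
    # Raises URLError if the Ollama daemon isn't running.
    with urlopen(url) as resp:
        return model_names(json.load(resp))
```

Running `detect_local_models()` with the daemon up should list the same models Foxl shows in its selector.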
AWS Bedrock
For users with their own AWS account and Bedrock access.
- Configure AWS credentials (`~/.aws/credentials` or environment variables)
- In Settings > Providers, select AWS Bedrock
- Ensure the models you want are enabled in your AWS Bedrock console
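The `~/.aws/credentials` file the first step refers to uses AWS's standard INI layout. Values below are placeholders; the region typically lives in `~/.aws/config` or the `AWS_REGION` environment variable rather than here:

```ini
[default]
aws_access_key_id     = AKIAEXAMPLEKEY
aws_secret_access_key = examplesecretkey
```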
Provider Health Check
The Settings > Providers page shows the status of each configured provider:
- Connected: Provider is reachable and API key is valid
- Error: API key is invalid or provider is unreachable
- Not configured: No API key entered
Model Switching
You can switch between models and providers at any time using the model selector in the chat input area. Your selection persists across conversations.
Cost Comparison
| Provider | Cost | Privacy | Speed |
|---|---|---|---|
| Foxl Relay | Credits (included in plan) | Data routed through relay | Fast (AWS Bedrock) |
| Gemini CLI | Free (subscription/API key) | Local CLI to Google | Fast |
| OpenAI (OAuth) | Free (ChatGPT Plus/Pro) | Direct to chatgpt.com/backend-api/codex, tokens stay local | Fast |
| Anthropic BYOK | Pay-per-use to Anthropic | Direct to Anthropic | Fast |
| OpenAI BYOK | Pay-per-use to OpenAI | Direct to OpenAI | Fast |
| Google BYOK | Pay-per-use to Google | Direct to Google | Fast |
| Ollama | Free | 100% local, no network | Depends on hardware |