Foxl Docs

AI Providers

Configure multiple AI providers - Anthropic, OpenAI, Google, Ollama, and AWS Bedrock

Foxl supports multiple AI providers. You can use Foxl credits through the cloud relay, or bring your own API keys for direct access.

Default: Foxl Relay

By default, Foxl routes AI requests through the cloud relay at relay.foxl.ai. This uses your Foxl credits and provides access to Claude models via AWS Bedrock.

No configuration needed - sign in and start chatting.

OAuth Providers (Subscription Login)

Use your existing Claude, Gemini, or ChatGPT subscriptions directly in Foxl - no API key needed. Foxl reads OAuth credentials created by each vendor's login flow.

OAuth providers are desktop-only. Authentication stays local - your subscription tokens never leave your machine except to reach the vendor's API. No Foxl credits consumed.

Unofficial community integration. Treat the auth file like a password - keep it on a trusted machine, don't share it, and don't pool access. Misuse may violate the provider's terms and result in rate limits or account suspension. Use at your own risk.

Gemini CLI (Google AI)

Use your Google AI subscription or API key.

  1. Install: npm install -g @google/gemini-cli
  2. Authenticate: gemini auth login or set GEMINI_API_KEY
  3. Select Gemini CLI in Settings > Providers

Models: Gemini 2.5 Pro, Gemini 2.5 Flash. Credentials from ~/.gemini/settings.json or environment variable.
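A quick way to confirm step 2 left credentials where Foxl can find them. This is an illustrative sketch, not a Foxl or gemini-cli command, and the precedence shown (environment variable before settings file) is an assumption, not Foxl's documented lookup order:

```shell
# Report which Gemini credential source is present.
# Checks the two locations named above; the order is illustrative only.
gemini_auth_source() {
  if [ -n "${GEMINI_API_KEY:-}" ]; then
    echo "env:GEMINI_API_KEY"
  elif [ -f "$HOME/.gemini/settings.json" ]; then
    echo "file:~/.gemini/settings.json"
  else
    echo "none"
  fi
}
gemini_auth_source
```

If it prints `none`, rerun step 2 before selecting the provider.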

Claude Code (Anthropic Pro/Max)

Use your Claude Pro or Claude Max subscription. Foxl calls api.anthropic.com directly with your Claude Code OAuth token - no CLI subprocess.

  1. Run the official login once inside Claude Code: claude /login (or claude setup-token)
  2. Select Claude Code (OAuth) in Settings > Providers, then pick Opus 4.7 or Sonnet 4.6 from the model selector

Models: Opus 4.7 and Sonnet 4.6 are exposed in compatibility mode so requests route through your Claude Pro/Max subscription instead of pay-as-you-go "Extra usage". Tokens auto-refresh in the background.

Compatibility mode disables Foxl-specific tools (memory, subagents, schedules, channel send, browser extension, view image). Only Bash, Read, Grep, and WebFetch are available while Claude Code OAuth is active. If you need the full Foxl tool surface, use an Anthropic API key (BYOK) or the foxl.ai relay instead. Haiku 4.5 is intentionally not exposed in this mode - it does not reliably stay on the subscription pool once more than a few tools are in play.

OpenAI (ChatGPT Plus/Pro)

Use your ChatGPT Plus or Pro subscription - Foxl calls OpenAI's Codex Responses API directly with your OAuth token. No local CLI subprocess or proxy server needed.

  1. Run the official login once: npx @openai/codex login (creates ~/.codex/auth.json)
  2. Select OpenAI (OAuth) in Settings > Providers

Models: GPT-5.5 (1M context), GPT-5.4 (1M context), GPT-5.4 Mini (400K), GPT-5.3 Codex (400K). The live catalog is fetched from Codex per account, so your actual selection may differ slightly based on what ChatGPT rolls out to your tier. Tokens auto-refresh in the background via the OAuth refresh flow.

Image generation is available through the generate_image tool (powered by gpt-image-2), which uses the same OAuth credentials. See Agent Tools - Media for details.
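Since step 1 above names the credential file, a one-line sanity check (purely illustrative, not a Foxl command) can confirm the Codex login succeeded:

```shell
# Confirm the Codex OAuth login created the credential file Foxl reads.
codex_auth_check() {
  if [ -f "$HOME/.codex/auth.json" ]; then
    echo "codex: auth.json present"
  else
    echo "codex: not logged in - run 'npx @openai/codex login'"
  fi
}
codex_auth_check
```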

Bring Your Own Key (BYOK)

Add your own API keys to use models directly without consuming Foxl credits. Works on both desktop and web (app.foxl.ai).

Anthropic

Direct access to Claude models via the Anthropic API.

  1. Go to Settings > Providers
  2. Select Anthropic
  3. Enter your API key from console.anthropic.com
  4. Select a model (Claude Opus 4.7, Opus 4.6, Sonnet 4.6, Haiku 4.5)
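Under the hood this is a standard Anthropic Messages API call. A minimal sketch of what a direct request looks like - the model id is illustrative (use one listed for your account), and the call is guarded so nothing is sent unless the key is exported:

```shell
# Minimal direct call to the Anthropic Messages API with a BYOK key.
# Model id is a placeholder; substitute one enabled for your account.
call_claude() {
  curl -s https://api.anthropic.com/v1/messages \
    -H "x-api-key: $ANTHROPIC_API_KEY" \
    -H "anthropic-version: 2023-06-01" \
    -H "content-type: application/json" \
    -d '{"model": "claude-sonnet-4-6", "max_tokens": 32,
         "messages": [{"role": "user", "content": "Hello"}]}'
}
if [ -n "${ANTHROPIC_API_KEY:-}" ]; then
  call_claude
else
  echo "set ANTHROPIC_API_KEY first"
fi
```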

OpenAI

Access GPT-4o, GPT-4, and other OpenAI models.

  1. Go to Settings > Providers
  2. Select OpenAI
  3. Enter your API key from platform.openai.com
  4. Select a model

Google (Gemini)

Access Gemini models via Google AI.

  1. Go to Settings > Providers
  2. Select Google
  3. Enter your API key from aistudio.google.com
  4. Select a Gemini model
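You can verify a key from aistudio.google.com before entering it by listing models from Google's public Generative Language API. A sketch, assuming the key is exported as `GEMINI_API_KEY` (the variable name is just a local convention):

```shell
# Verify a Google AI key by listing available models.
# Guarded: nothing is sent unless GEMINI_API_KEY is set.
list_gemini_models() {
  curl -s "https://generativelanguage.googleapis.com/v1beta/models?key=$GEMINI_API_KEY"
}
if [ -n "${GEMINI_API_KEY:-}" ]; then
  list_gemini_models
else
  echo "set GEMINI_API_KEY first"
fi
```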

Ollama (Local Models)

Run AI models entirely on your machine with zero cost and full privacy.

  1. Install Ollama from ollama.com
  2. Pull a model: ollama pull llama3 or ollama pull mistral
  3. In Foxl Settings > Providers, select Ollama
  4. Foxl auto-detects the models available in your local Ollama instance

Ollama models run entirely on your machine. No internet connection, no API calls, no credits consumed. Perfect for privacy-sensitive work.
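To confirm the server from steps 1-2 is reachable, you can probe Ollama's REST API on its default port (11434); `/api/tags` returns the models you have pulled:

```shell
# Probe the local Ollama server (default port 11434) via its REST API.
ollama_status() {
  if curl -sf --max-time 2 http://localhost:11434/api/tags >/dev/null 2>&1; then
    echo "running"
  else
    echo "not running"
  fi
}
echo "ollama: $(ollama_status)"
# When running, list pulled models with:
#   curl -s http://localhost:11434/api/tags
```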

AWS Bedrock

For users with their own AWS account and Bedrock access.

  1. Configure AWS credentials (~/.aws/credentials or environment variables)
  2. In Settings > Providers, select AWS Bedrock
  3. Ensure the models you want are enabled in your AWS Bedrock console
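The environment-variable form of step 1 looks like this (values are placeholders; pick a region where your Bedrock models are enabled):

```shell
# AWS credentials via environment variables (placeholder values).
export AWS_ACCESS_KEY_ID="AKIAXXXXXXXXXXXXXXXX"      # placeholder
export AWS_SECRET_ACCESS_KEY="replace-with-secret"   # placeholder
export AWS_REGION="us-east-1"                        # a region with Bedrock access enabled
```

The file-based alternative puts the same values in a profile inside ~/.aws/credentials.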

Provider Health Check

The Settings > Providers page shows the status of each configured provider:

  • Connected: Provider is reachable and API key is valid
  • Error: API key is invalid or provider is unreachable
  • Not configured: No API key entered

Model Switching

You can switch between models and providers at any time using the model selector in the chat input area. Your selection persists across conversations.

Cost Comparison

Provider        Cost                         Privacy                                                      Speed
Foxl Relay      Credits (included in plan)   Data routed through relay                                    Fast (AWS Bedrock)
Gemini CLI      Free (subscription/API key)  Local CLI to Google                                          Fast
OpenAI (OAuth)  Free (ChatGPT Plus/Pro)      Direct to chatgpt.com/backend-api/codex, tokens stay local   Fast
Anthropic BYOK  Pay-per-use to Anthropic     Direct to Anthropic                                          Fast
OpenAI BYOK     Pay-per-use to OpenAI        Direct to OpenAI                                             Fast
Google BYOK     Pay-per-use to Google        Direct to Google                                             Fast
Ollama          Free                         100% local, no network                                       Depends on hardware
