# Environment Configuration
Forge can be configured through environment variables to control its behavior, API connections, and model preferences. This page describes the available configuration options and how to use them.
## Configuration Methods
You can configure Forge using either:

- **Environment Variables**: Set directly in your shell profile (`.bashrc`, `.zshrc`, etc.)
- **`.env` File**: Create a file named `.env` in your home directory

The `.env` file method is recommended for most users: it keeps your configuration in one place and prevents API keys from being exposed in your shell history.
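As a sketch, the same key can be supplied either way (the key value below is a placeholder):

```shell
# Option 1: export in your shell profile (e.g. ~/.bashrc) -- the value
# is placed in the environment of every new shell session.
export OPENROUTER_API_KEY=your_openrouter_key_here

# Option 2: write a plain KEY=value line to ~/.env (no `export` keyword).
cat > "$HOME/.env" <<'EOF'
OPENROUTER_API_KEY=your_openrouter_key_here
EOF
```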
## API Provider Configuration

### API Keys
Forge supports multiple AI providers and checks for API keys in the following priority order:
| Environment Variable | Provider |
|---|---|
| `FORGE_KEY` | Antinomy's provider (OpenAI-compatible) |
| `OPENROUTER_API_KEY` | Open Router (aggregates multiple models) |
| `OPENAI_API_KEY` | Official OpenAI |
| `ANTHROPIC_API_KEY` | Official Anthropic |
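The priority order above can be sketched as a small shell function. This is illustrative only — the function name and output labels are not Forge's actual code:

```shell
# Hypothetical sketch of the key-lookup order described in the table:
# the first variable that is set and non-empty wins.
resolve_provider() {
  if   [ -n "$FORGE_KEY" ];          then echo "antinomy"
  elif [ -n "$OPENROUTER_API_KEY" ]; then echo "openrouter"
  elif [ -n "$OPENAI_API_KEY" ];     then echo "openai"
  elif [ -n "$ANTHROPIC_API_KEY" ];  then echo "anthropic"
  else echo "none"
  fi
}
```

A higher-priority key wins even when a lower-priority one is also set, so unset keys you do not want Forge to pick up.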
Example configuration in a `.env` file:
```shell
# For Open Router (recommended, provides access to multiple models)
OPENROUTER_API_KEY=your_openrouter_key_here

# For official OpenAI
# OPENAI_API_KEY=your_openai_key_here

# For official Anthropic
# ANTHROPIC_API_KEY=your_anthropic_key_here

# For Antinomy's provider
# FORGE_KEY=your_forge_key_here
```
### Custom Provider URLs
For OpenAI-compatible providers (including Open Router), you can customize the API endpoint URL:
```shell
# Custom OpenAI-compatible provider
OPENAI_API_KEY=your_api_key_here
OPENAI_URL=https://your-custom-provider.com/v1

# Or with Open Router but a custom endpoint
OPENROUTER_API_KEY=your_openrouter_key_here
OPENAI_URL=https://alternative-openrouter-endpoint.com/v1
```
This is useful for:
- Self-hosted models with OpenAI-compatible APIs
- Enterprise OpenAI deployments
- Proxy services or API gateways
- Regional API endpoints
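A note on URL format: OpenAI-compatible clients typically append route paths such as `/chat/completions` to the base URL, so `OPENAI_URL` should include the version prefix (commonly `/v1`) and should work with or without a trailing slash. A quick illustrative check (Forge's internal client code is not shown here):

```shell
# Illustrative only: shows how a base URL and a route path combine.
# `${OPENAI_URL%/}` strips a trailing slash, if present, before joining.
OPENAI_URL=https://your-custom-provider.com/v1
endpoint="${OPENAI_URL%/}/chat/completions"
echo "$endpoint"
```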