
`Client::from_env()` reads a small set of environment variables to pick a provider, an API key, and a default model. A handful of runtime variables additionally affect behavior at a lower level. None are required if you build clients explicitly via `Client::builder()`.

Provider selection

| Variable | Required | Notes |
| --- | --- | --- |
| `COGNIS_PROVIDER` | yes (for `from_env`) | One of `openai`, `anthropic`, `google`, `ollama`, `azure`, `openrouter`. |
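A fail-fast preflight check can catch a missing or misspelled provider before the app starts. This is a sketch of our own helper, not part of cognis; it only encodes the table above (every provider except Ollama needs its `COGNIS_<PROVIDER>_API_KEY`):

```sh
# check_cognis_env: our own validation helper (not a cognis API).
# Returns non-zero if COGNIS_PROVIDER is unset/unknown, or if the
# matching COGNIS_<PROVIDER>_API_KEY is empty (Ollama excepted).
check_cognis_env() {
  case "$COGNIS_PROVIDER" in
    openai|anthropic|google|azure|openrouter)
      upper=$(printf '%s' "$COGNIS_PROVIDER" | tr '[:lower:]' '[:upper:]')
      key_var="COGNIS_${upper}_API_KEY"
      eval "key=\${$key_var}"
      [ -n "$key" ] || { echo "missing $key_var" >&2; return 1; } ;;
    ollama)
      ;;  # no API key needed for a local Ollama
    *)
      echo "COGNIS_PROVIDER must be one of: openai anthropic google ollama azure openrouter" >&2
      return 1 ;;
  esac
}
```

Run it at the top of your launch script so a missing key fails loudly instead of surfacing as an opaque auth error later.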

Per-provider credentials and defaults

Substitute `<PROVIDER>` with `OPENAI`, `ANTHROPIC`, `GOOGLE`, `OLLAMA`, `AZURE`, or `OPENROUTER`.

| Variable | Default | Notes |
| --- | --- | --- |
| `COGNIS_<PROVIDER>_API_KEY` | — | Provider key. Not needed for Ollama. |
| `COGNIS_<PROVIDER>_MODEL` | provider default | Default model name. Override per call via `ChatOptions::model`. |
| `COGNIS_<PROVIDER>_BASE_URL` | provider default | Overrides the API base. Useful for self-hosted or proxied backends. |

Provider defaults vary; check the provider builder’s source for the exact value if you don’t override.

Azure-specific

| Variable | Notes |
| --- | --- |
| `COGNIS_AZURE_API_KEY` | Azure OpenAI key. |
| `COGNIS_AZURE_ENDPOINT` | E.g. `https://your-resource.openai.azure.com`. |
| `COGNIS_AZURE_DEPLOYMENT` | The deployment name (your custom alias). |
| `COGNIS_AZURE_API_VERSION` | E.g. `2024-08-06`. |
| `COGNIS_AZURE_MODEL` | Optional model name (Azure typically uses deployment names). |
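To see why Azure needs all three of endpoint, deployment, and API version, it helps to look at how they combine. The URL shape below is Azure OpenAI's documented REST layout; whether cognis assembles it exactly this way internally is an assumption:

```sh
# Azure OpenAI's chat-completions URL is built from exactly the three
# variables in the table above (values here are illustrative).
COGNIS_AZURE_ENDPOINT="https://your-resource.openai.azure.com"
COGNIS_AZURE_DEPLOYMENT="gpt-4o-prod"
COGNIS_AZURE_API_VERSION="2024-08-06"

url="$COGNIS_AZURE_ENDPOINT/openai/deployments/$COGNIS_AZURE_DEPLOYMENT/chat/completions?api-version=$COGNIS_AZURE_API_VERSION"
echo "$url"
# → https://your-resource.openai.azure.com/openai/deployments/gpt-4o-prod/chat/completions?api-version=2024-08-06
```

The deployment name replaces the model name in the path, which is why `COGNIS_AZURE_MODEL` is usually unnecessary.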

Langfuse (cognis-trace, feature langfuse)

`LangfuseExporter::from_env()` and `LangfuseConfig::from_env()` read these variables. They are deliberately not prefixed with `COGNIS_`, to match Langfuse’s own conventions.

| Variable | Required | Notes |
| --- | --- | --- |
| `LANGFUSE_PUBLIC_KEY` | yes | `pk-lf-...` |
| `LANGFUSE_SECRET_KEY` | yes | `sk-lf-...` |
| `LANGFUSE_HOST` | optional | Defaults to `https://cloud.langfuse.com`. Override for self-hosted. |
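A minimal Langfuse wiring looks like this (the key values are placeholders; substitute your project's real `pk-lf-`/`sk-lf-` pair, and the self-hosted URL is illustrative):

```sh
export LANGFUSE_PUBLIC_KEY=pk-lf-...
export LANGFUSE_SECRET_KEY=sk-lf-...
# Only needed for self-hosted; omit to use https://cloud.langfuse.com
export LANGFUSE_HOST=https://langfuse.internal.example.com
```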

Examples

```sh
# OpenAI:
export COGNIS_PROVIDER=openai
export COGNIS_OPENAI_API_KEY=sk-...
export COGNIS_OPENAI_MODEL=gpt-4o-mini

# Anthropic:
export COGNIS_PROVIDER=anthropic
export COGNIS_ANTHROPIC_API_KEY=sk-ant-...

# Local Ollama:
export COGNIS_PROVIDER=ollama
export COGNIS_OLLAMA_MODEL=llama3.1
# COGNIS_OLLAMA_BASE_URL defaults to http://localhost:11434

# Azure:
export COGNIS_PROVIDER=azure
export COGNIS_AZURE_API_KEY=...
export COGNIS_AZURE_ENDPOINT=https://your-resource.openai.azure.com
export COGNIS_AZURE_DEPLOYMENT=gpt-4o-prod
export COGNIS_AZURE_API_VERSION=2024-08-06

# OpenRouter:
export COGNIS_PROVIDER=openrouter
export COGNIS_OPENROUTER_API_KEY=...
export COGNIS_OPENROUTER_MODEL=anthropic/claude-sonnet-4
```

Runtime variables

Cognis honors the standard `tracing` env vars for log filtering:

```sh
export RUST_LOG=cognis=info,cognis_trace=debug
```

There is no global verbosity flag; control verbosity through `tracing-subscriber`’s filter directives.

What about .env files?

Cognis does not read `.env` files. A plain `.env` on disk is a security footgun: it ends up in backups, logs, and on screen-sharing surfaces. Recommended setup on macOS / Linux: direnv + envchain, so the same shell-loaded variables are picked up automatically. See Installation → Set credentials. For CI, set the same variables in your runner’s secret store.
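A sketch of the direnv + envchain setup mentioned above. The envchain namespace `cognis` is hypothetical (create it once with `envchain --set cognis COGNIS_OPENAI_API_KEY`, which stores the key in the OS keychain):

```sh
# .envrc (loaded by direnv on cd) — non-secret config is fine in plain text.
export COGNIS_PROVIDER=openai
export COGNIS_OPENAI_MODEL=gpt-4o-mini
# Pull the secret from the keychain at load time instead of storing it on disk.
export COGNIS_OPENAI_API_KEY="$(envchain cognis sh -c 'printf %s "$COGNIS_OPENAI_API_KEY"')"
```

Run `direnv allow` after editing `.envrc`; the secret then never touches the repository or your dotfiles.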

See also

Installation

The full credentials setup.

Models and providers

Builder shapes when env vars aren’t enough.