

Prerequisites

  • Rust 1.75 or newer (workspace edition is 2021).
  • A tokio runtime — Cognis is async.

Pick an entry point

Most apps want the umbrella crate. It re-exports the foundation, LLM, RAG, and graph layers, so you write `use cognis::prelude::*;` and reach for what you need.

Pick features

External integrations are feature-gated so the core crates compile with no network code.
| You want… | Feature |
| --- | --- |
| OpenAI chat + tools + embeddings | `openai` |
| Anthropic | `anthropic` |
| Google Gemini | `google` |
| Local Ollama | `ollama` |
| Azure OpenAI | `azure` |
| OpenRouter (with attribution headers) | `openrouter` |
| Voyage embeddings | `voyage` |
| Everything provider-shaped | `all-providers` |
| FAISS local vector store | `cognis-rag/vectorstore-faiss` |
| Hosted vector stores | `cognis-rag/vectorstore-{chroma,qdrant,pinecone,weaviate}` |
| PDF / YAML / TOML / CSV / HTML / web loaders | `cognis-rag/{pdf,yaml,toml,csv,html,web}-loader` |
| SQLite-backed model cache | `cognis/cache-sqlite` |
| HTTP request primitives for tools | `cognis/tools-http` |
| Graph state in SQLite or Postgres | `cognis-graph/{sqlite,postgres}` |
| Langfuse exporter | `cognis-trace/langfuse` |
The full list lives in Reference → Feature flags.
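As an illustration, enabling the OpenAI provider plus the FAISS vector store could look like this in `Cargo.toml` (a sketch: the version strings are placeholders, so pin them to whatever Cognis currently publishes):

```toml
[dependencies]
# Umbrella crate with the OpenAI provider enabled
cognis = { version = "*", features = ["openai"] }

# RAG layer with the local FAISS vector store
cognis-rag = { version = "*", features = ["vectorstore-faiss"] }
```

Features compose additively, so listing several (e.g. `["openai", "anthropic"]`) pulls in each integration's network code while the unlisted ones stay out of the build.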

Set credentials

Cognis never reads `.env` files: plain `.env` files on disk are a security footgun. Use your shell, direnv, envchain, or your platform's secret manager.
`Client::from_env()` reads:

| Variable | Purpose |
| --- | --- |
| `COGNIS_PROVIDER` | One of `openai`, `anthropic`, `google`, `ollama`, `azure`, `openrouter`. |
| `COGNIS_<PROVIDER>_API_KEY` | The provider's key (Ollama: not needed). |
| `COGNIS_<PROVIDER>_MODEL` | Optional default model. |
| `COGNIS_<PROVIDER>_BASE_URL` | Optional override for self-hosted backends. |
| `COGNIS_AZURE_ENDPOINT`, `COGNIS_AZURE_DEPLOYMENT`, `COGNIS_AZURE_API_VERSION` | Azure-specific. |
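The `<PROVIDER>` segment is the value of `COGNIS_PROVIDER`, which the examples in this guide suggest is upper-cased in the variable names. A purely illustrative shell sketch of how the names expand:

```shell
provider=openai   # the value of COGNIS_PROVIDER
upper=$(printf '%s' "$provider" | tr '[:lower:]' '[:upper:]')
echo "COGNIS_${upper}_API_KEY"   # COGNIS_OPENAI_API_KEY
echo "COGNIS_${upper}_MODEL"     # COGNIS_OPENAI_MODEL
```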
Recommended setup on macOS / Linux: direnv + envchain.
`.envrc` (commit this):

```sh
if command -v envchain >/dev/null 2>&1; then
  vars=$(envchain --list myapp 2>/dev/null | paste -sd '|' -)
  if [ -n "$vars" ]; then
    eval "$(envchain myapp env | grep -E "^(${vars})=" | sed 's/^/export /')"
  fi
fi
```

One-time setup:

```sh
envchain --set myapp COGNIS_PROVIDER          # e.g. openai
envchain --set myapp COGNIS_OPENAI_API_KEY
direnv allow
```
For CI, set the same variables in your runner’s secret store.
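As a sketch, on GitHub Actions that could look like the fragment below (the workflow layout and the secret name are placeholders; map them to whatever your runner actually provides):

```yaml
# .github/workflows/ci.yml (fragment)
jobs:
  test:
    runs-on: ubuntu-latest
    env:
      COGNIS_PROVIDER: openai
      COGNIS_OPENAI_API_KEY: ${{ secrets.COGNIS_OPENAI_API_KEY }}
    steps:
      - uses: actions/checkout@v4
      - run: cargo test --workspace
```

Keeping the keys in the runner's secret store (rather than a committed file) mirrors the no-`.env` policy above.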

Verify

```sh
cargo check
```

If you set up the umbrella with `openai`, this should compile:

```rust
use cognis_llm::Client;

fn check() {
    let _ = Client::builder();
}
```

Workspace builds

Contributing to Cognis itself? Build and test the whole workspace:
```sh
cargo build --workspace
cargo build -p cognis --features all-providers
cargo test --workspace
cargo clippy --workspace --features all-providers -- -D warnings
```
See Contribute → Development setup for the full pre-push checklist.