Documentation Index

Fetch the complete documentation index at: https://cognis.vasanth.xyz/llms.txt

Use this file to discover all available pages before exploring further.

Cognis is feature-light by default. Every external integration — providers, vector stores, checkpoint backends, observability exporters — is feature-gated so the core crates compile with no network deps. Pick what you need; pay for nothing else.

cognis (umbrella)

| Feature | Pulls in |
| --- | --- |
| `openai` (default) | OpenAI provider + embeddings |
| `ollama` (default) | Ollama provider + embeddings |
| `anthropic` | Anthropic Messages |
| `google` | Gemini chat + embeddings |
| `azure` | Azure OpenAI |
| `voyage` | Voyage embeddings |
| `all-providers` | Every provider above |
| `cache-sqlite` | SQLite-backed model cache |
| `tools-http` | HTTP request primitives for tools |
cargo add cognis --features anthropic,google,vectorstore-faiss
cargo add cognis --features all-providers

cognis-core, cognis-macros

These crates never enable network dependencies. Anything that does I/O lives in a sibling crate.

cognis-llm

| Feature | Effect |
| --- | --- |
| `openai` (default) | reqwest + OpenAI client |
| `ollama` (default) | reqwest + Ollama client (no key) |
| `anthropic` | reqwest + Anthropic client |
| `google` | reqwest + Gemini client |
| `azure` | reqwest + Azure OpenAI client |
| `all-providers` | All of the above |

OpenRouter uses the OpenAI client with `Provider::OpenRouter`; there is no separate feature.
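If you depend on cognis-llm directly instead of through the umbrella crate, the same feature names apply. A minimal sketch, assuming the crate's version tracks the umbrella's 0.3:

```toml
# Cargo.toml: cognis-llm with only the Anthropic client.
# The "0.3" version here is an assumption, matching the umbrella example.
[dependencies]
cognis-llm = { version = "0.3", default-features = false, features = ["anthropic"] }
```

`default-features = false` drops the default `openai` and `ollama` clients so only reqwest + the Anthropic client are compiled.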

cognis-rag

Embeddings

| Feature | Provider |
| --- | --- |
| `openai` (default) | OpenAI `text-embedding-3-*` |
| `ollama` (default) | Local Ollama embeddings |
| `google` | Google `text-embedding-004`, `gemini-embedding` |
| `voyage` | Voyage AI |

Vector stores

| Feature | Backend |
| --- | --- |
| `vectorstore-faiss` | Local FAISS index (no HTTP) |
| `vectorstore-chroma` | Chroma server |
| `vectorstore-qdrant` | Qdrant server |
| `vectorstore-pinecone` | Pinecone (managed) |
| `vectorstore-weaviate` | Weaviate server |
| `all-vectorstores` | All of the above |

Loaders

| Feature | Loader |
| --- | --- |
| `csv-loader` | CSV |
| `html-loader` | HTML |
| `yaml-loader` | YAML |
| `toml-loader` | TOML |
| `web-loader` | HTTP fetch |
| `pdf-loader` | PDF |
| `all-loaders` | All loaders |

`InMemoryVectorStore`, `FakeEmbeddings`, `CachedEmbeddings`, `BatchedEmbeddings`, the indexing pipeline, and the in-memory record manager are always available.
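A sketch of trimming cognis-rag to one embedding provider, one store, and one loader, using the feature names listed above (the "0.3" version is an assumption, matching the umbrella example):

```toml
# Cargo.toml: cognis-rag with defaults off and an explicit opt-in list.
[dependencies]
cognis-rag = { version = "0.3", default-features = false, features = [
  "ollama",             # local Ollama embeddings, no API key
  "vectorstore-qdrant", # Qdrant-backed vector store
  "csv-loader",         # CSV document loader
] }
```

The always-available pieces (`InMemoryVectorStore`, `FakeEmbeddings`, and friends) need no feature, so a test-only setup can even use an empty feature list.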

cognis-graph

| Feature | Pulls in |
| --- | --- |
| `sqlite` | sqlx/sqlite for `SqliteCheckpointer` |
| `postgres` | sqlx/postgres for `PostgresCheckpointer` |
| `serializer-cbor` | CBOR-encoded checkpoint payloads |

`InMemoryCheckpointer` and the engine are always available.
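For graphs that need checkpoints to survive restarts, a sketch enabling the SQLite backend with CBOR payloads (version assumed to match the umbrella's 0.3):

```toml
# Cargo.toml: persistent checkpointing via sqlx/sqlite, CBOR-encoded.
[dependencies]
cognis-graph = { version = "0.3", features = ["sqlite", "serializer-cbor"] }
```

With neither `sqlite` nor `postgres` enabled, `InMemoryCheckpointer` still works and no sqlx dependency is compiled.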

cognis-trace

| Feature | Pulls in |
| --- | --- |
| `stdout` (default) | `StdoutExporter` |
| `langfuse` | reqwest, secrecy, base64, plus the Langfuse exporter / prompts / scorer |
| `all` | Every exporter |
| `integration_tests` | Tests against real services (off by default) |

`MockExporter` is always built; it's useful in tests.
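To ship traces to Langfuse while keeping the default stdout exporter available, a sketch (version assumed to match the umbrella's 0.3):

```toml
# Cargo.toml: stdout is a default feature, so only langfuse needs opting in.
[dependencies]
cognis-trace = { version = "0.3", features = ["langfuse"] }
```

Leave `integration_tests` off in normal builds; it exists so CI can run tests against real services.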

Picking the minimum set

Since `openai` and `ollama` are default features, a truly minimal build disables defaults and opts back in to exactly what it uses:

cognis = { version = "0.3", default-features = false, features = ["ollama"] }

See also

- Installation: where features live in your Cargo.toml.
- Environment variables: what `Client::from_env()` reads.