

The resilience examples demonstrate production patterns for keeping an LLM app available. Sources live under examples/resilience/.
| Name | Scenario | Source |
|---|---|---|
| resilience_error_handling | User submits an age string; shows how parse errors propagate through a Runnable chain so the caller can surface a friendly message. | src |
| resilience_error_recovery | URL shortener that fails twice, then succeeds; exponential backoff retry keeps the user-facing call working. | src |
| resilience_rate_limiters | Cost-based budget: cap LLM spend at $1/min using TokenBucket / SlidingWindow / CostBased / Composite. | src |
| resilience_ssrf_protection | URL-fetch tool that takes a URL from user input; rejects internal IPs, localhost, and the AWS metadata endpoint before any HTTP call. | src |
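The retry pattern behind resilience_error_recovery can be sketched in plain Rust with no dependencies. This is a hypothetical helper, not the cognis API: `retry_with_backoff` and its parameters are assumptions for illustration, and the simulated shortener stands in for a real flaky call.

```rust
use std::thread::sleep;
use std::time::Duration;

// Hypothetical helper (not the cognis API): retry `op` up to
// `max_attempts` times, doubling the delay after each failure.
fn retry_with_backoff<T, E>(
    max_attempts: u32,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut delay = Duration::from_millis(10);
    let mut attempt = 1;
    loop {
        match op() {
            Ok(v) => return Ok(v),
            // Out of attempts: surface the last error to the caller.
            Err(e) if attempt >= max_attempts => return Err(e),
            Err(_) => {
                sleep(delay);
                delay *= 2; // exponential backoff: 10ms, 20ms, 40ms, ...
                attempt += 1;
            }
        }
    }
}

fn main() {
    // Simulate a shortener that fails twice, then succeeds.
    let mut calls = 0;
    let result = retry_with_backoff(5, || {
        calls += 1;
        if calls < 3 { Err("503") } else { Ok("short-url") }
    });
    assert_eq!(result, Ok("short-url"));
    assert_eq!(calls, 3);
    println!("succeeded after {calls} calls");
}
```

In production you would also cap the total delay and add jitter so many clients retrying at once do not synchronize.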
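The cost-based budget in resilience_rate_limiters is a token-bucket variant where tokens are dollars. A minimal stdlib-only sketch of that idea, assuming a `TokenBucket` type of my own (the crate's TokenBucket / CostBased limiters may differ):

```rust
use std::time::Instant;

// Sketch only, not the cognis TokenBucket: a bucket holding up to
// `capacity` tokens, refilled continuously at `rate` tokens per second.
// Using dollars as tokens turns this into a cost-based budget.
struct TokenBucket {
    capacity: f64,
    tokens: f64,
    rate: f64, // tokens added per second
    last_refill: Instant,
}

impl TokenBucket {
    fn new(capacity: f64, rate: f64) -> Self {
        Self { capacity, tokens: capacity, rate, last_refill: Instant::now() }
    }

    // Try to spend `cost` tokens; returns false when the budget is exhausted.
    fn try_acquire(&mut self, cost: f64) -> bool {
        let now = Instant::now();
        let elapsed = now.duration_since(self.last_refill).as_secs_f64();
        // Refill lazily, capped at capacity.
        self.tokens = (self.tokens + elapsed * self.rate).min(self.capacity);
        self.last_refill = now;
        if self.tokens >= cost {
            self.tokens -= cost;
            true
        } else {
            false
        }
    }
}

fn main() {
    // $1 budget refilled at $1 per minute.
    let mut bucket = TokenBucket::new(1.0, 1.0 / 60.0);
    assert!(bucket.try_acquire(0.40));  // first call fits the budget
    assert!(bucket.try_acquire(0.40));  // second call still fits
    assert!(!bucket.try_acquire(0.40)); // third would exceed $1, rejected
    println!("remaining budget: ${:.2}", bucket.tokens);
}
```

The lazy-refill trick (recompute tokens only when a request arrives) avoids any background timer thread.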
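The core of the SSRF check is validating the target IP before any HTTP call. A stdlib-only sketch with a hypothetical `is_forbidden_ip` helper (the example's actual guard may check more, e.g. DNS re-resolution to defeat rebinding):

```rust
use std::net::IpAddr;

// Hypothetical helper: reject addresses an outbound fetch tool must
// never reach. Link-local covers the AWS metadata endpoint
// 169.254.169.254.
fn is_forbidden_ip(ip: IpAddr) -> bool {
    match ip {
        IpAddr::V4(v4) => {
            v4.is_loopback()       // 127.0.0.0/8
                || v4.is_private()     // 10/8, 172.16/12, 192.168/16
                || v4.is_link_local()  // 169.254.0.0/16
                || v4.is_unspecified() // 0.0.0.0
        }
        IpAddr::V6(v6) => v6.is_loopback() || v6.is_unspecified(),
    }
}

fn main() {
    let blocked = ["127.0.0.1", "10.0.0.8", "169.254.169.254", "::1"];
    for addr in blocked {
        let ip: IpAddr = addr.parse().unwrap();
        assert!(is_forbidden_ip(ip), "{addr} should be blocked");
    }
    let public: IpAddr = "93.184.216.34".parse().unwrap();
    assert!(!is_forbidden_ip(public));
    println!("all checks passed");
}
```

Note that checking the URL string alone is not enough: the check must run on the resolved IP, otherwise a hostname that resolves to 127.0.0.1 slips through.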

How to run

cargo run -p cognis-examples --example resilience_ssrf_protection
cargo run -p cognis-examples --example resilience_rate_limiters

See also

Resilience guide

Wrappers, middleware, recovery patterns.

Caching

Don’t pay twice.

Security

PII, deny-lists, sandboxing.