Every piece of Cognis — prompts, models, parsers, tools, retrievers, agents, compiled graphs — implements the same trait: `Runnable<I, O>`. If you’ve used pipes in a shell, you already know the mental model. The difference is that the types flow through the composition, so the compiler catches plumbing mistakes before you run anything.
## What it is
Implement `invoke` and you get `batch` (concurrent multi-input), `stream` (per-output stream), `stream_events` (lifecycle events), and a default `name()` for free.
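The shape of the trait can be sketched like this. It is a simplified, synchronous stand-in written for illustration; the exact signatures here are assumptions, not the Cognis API:

```rust
// Simplified, synchronous stand-in for the Runnable idea:
// one required method, everything else defaulted in terms of it.
// (Illustrative only; the real Cognis trait is richer.)
trait Runnable<I, O> {
    // The only required method.
    fn invoke(&self, input: I) -> O;

    // Default batch: invoke per input (the real batch runs concurrently).
    fn batch(&self, inputs: Vec<I>) -> Vec<O> {
        inputs.into_iter().map(|i| self.invoke(i)).collect()
    }

    // A default name, analogous to the free name() described above.
    fn name(&self) -> &'static str {
        "runnable"
    }
}

// A trivial implementor: uppercases its input.
struct Upper;

impl Runnable<String, String> for Upper {
    fn invoke(&self, input: String) -> String {
        input.to_uppercase()
    }
}

fn main() {
    let u = Upper;
    assert_eq!(u.invoke("hi".into()), "HI");
    assert_eq!(u.batch(vec!["a".into(), "b".into()]), vec!["A", "B"]);
    assert_eq!(u.name(), "runnable");
}
```

Only `invoke` had to be written; `batch` and `name` came along for free.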
## Why typed
In Python, every Runnable shrugs and types its input/output as `Any`. In Cognis, a model is `Runnable<Vec<Message>, Message>`. A parser is `Runnable<Message, Recipe>`. Pipe them and the compiler refuses anything that doesn’t line up. Nothing degrades to `serde_json::Value` until something actually serializes — usually at a system boundary like an HTTP response or a checkpoint write. Inside your composition, types are real.
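To see why the types matter, here is a toy model of typed piping. Every name in it (`Pipe`, `Model`, `Parser`, and the trait itself) is an illustrative stand-in, not the Cognis API; the extra `PhantomData` parameter keeps the mid-pipeline type visible, echoing the concrete `Pipe<A, B, …>` shape described later on this page:

```rust
use std::marker::PhantomData;

// Toy model of typed piping (not the Cognis API): a Pipe only
// compiles when the first stage's output type matches the second
// stage's input type.
trait Runnable<I, O> {
    fn invoke(&self, input: I) -> O;
}

// The third parameter records the mid-pipeline type.
struct Pipe<A, B, M>(A, B, PhantomData<M>);

impl<I, M, O, A, B> Runnable<I, O> for Pipe<A, B, M>
where
    A: Runnable<I, M>,
    B: Runnable<M, O>,
{
    fn invoke(&self, input: I) -> O {
        self.1.invoke(self.0.invoke(input))
    }
}

// A stand-in "model": prompt in, reply out.
struct Model;
impl Runnable<String, String> for Model {
    fn invoke(&self, prompt: String) -> String {
        format!("reply to: {prompt}")
    }
}

// A stand-in "parser": reply in, word count out.
struct Parser;
impl Runnable<String, usize> for Parser {
    fn invoke(&self, reply: String) -> usize {
        reply.split_whitespace().count()
    }
}

fn main() {
    let chain = Pipe(Model, Parser, PhantomData);
    assert_eq!(chain.invoke("hello".to_string()), 3);
    // Pipe(Parser, Model, PhantomData) would not compile:
    // Parser produces usize, Model consumes String.
}
```

Swapping the stages is rejected at compile time, which is exactly the plumbing check the prose above describes.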
## When to care
- You’re building a chain by hand and want compile-time guarantees.
- You’re writing a custom primitive (a fancy retriever, a domain-specific parser) and want everything else to compose with it for free.
- You want consistent observability — `stream_events` and observers work on any Runnable.

If you build with `AgentBuilder`, you don’t usually instantiate a Runnable directly. But the agent itself, the tools it calls, and the LLM client it wraps are all Runnable underneath, which is why the same wrappers (retry, timeout, fallback) work on all of them.
## Quick example
A custom Runnable that doubles a number lives in `examples/v2/01_hello_runnable.rs`.
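The example file itself is not reproduced here, but the idea can be sketched against the same simplified synchronous trait (an illustration under assumed signatures; the shipped example is the authoritative version):

```rust
// Illustrative sketch only; see examples/v2/01_hello_runnable.rs
// for the real code.
trait Runnable<I, O> {
    fn invoke(&self, input: I) -> O;
}

// A custom Runnable that doubles its numeric input.
struct Doubler;

impl Runnable<i64, i64> for Doubler {
    fn invoke(&self, input: i64) -> i64 {
        input * 2
    }
}

fn main() {
    assert_eq!(Doubler.invoke(21), 42);
}
```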
## Composition
Once you have a Runnable, compose it three ways:
- Pipe (sequential)
- Each (per-element)
- Lambda (ad-hoc)

Conditional routing (`Branch`) and parallel fan-out (`Parallel`) live in `cognis_core::compose` with the same shape — see Reference → cognis-core.
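Under the same simplified synchronous trait, the three shapes can be sketched as follows (`Pipe`, `Each`, and `Lambda` here are toy stand-ins, not the `cognis_core::compose` implementations):

```rust
use std::marker::PhantomData;

// Toy sketches of the three composition shapes (assumed names,
// simplified synchronous trait; not the cognis_core API).
trait Runnable<I, O> {
    fn invoke(&self, input: I) -> O;
}

// Pipe: sequential composition (output of A feeds input of B).
struct Pipe<A, B, M>(A, B, PhantomData<M>);
impl<I, M, O, A, B> Runnable<I, O> for Pipe<A, B, M>
where
    A: Runnable<I, M>,
    B: Runnable<M, O>,
{
    fn invoke(&self, input: I) -> O {
        self.1.invoke(self.0.invoke(input))
    }
}

// Each: lift a Runnable<I, O> to Runnable<Vec<I>, Vec<O>>.
struct Each<R>(R);
impl<I, O, R: Runnable<I, O>> Runnable<Vec<I>, Vec<O>> for Each<R> {
    fn invoke(&self, inputs: Vec<I>) -> Vec<O> {
        inputs.into_iter().map(|i| self.0.invoke(i)).collect()
    }
}

// Lambda: wrap an ad-hoc closure as a Runnable.
struct Lambda<F>(F);
impl<I, O, F: Fn(I) -> O> Runnable<I, O> for Lambda<F> {
    fn invoke(&self, input: I) -> O {
        (self.0)(input)
    }
}

fn main() {
    // Sequential: add one, then double.
    let chain = Pipe(Lambda(|x: i32| x + 1), Lambda(|x: i32| x * 2), PhantomData);
    assert_eq!(chain.invoke(3), 8);

    // Per-element: square every element of a Vec.
    let per_element = Each(Lambda(|x: i32| x * x));
    assert_eq!(per_element.invoke(vec![1, 2, 3]), vec![1, 4, 9]);
}
```

Because each combinator is itself a Runnable, the three shapes nest freely.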
## Wrappers
Cross-cutting behavior comes through `RunnableExt`, in scope via the prelude:

| Method | Effect |
|---|---|
| `pipe(next)` | Sequential composition. |
| `with_retry(policy)` / `with_max_retries(n)` | Retry on `Err`. |
| `with_timeout(Duration)` | Bound a single `invoke`. |
| `with_fallback(other)` | Try a backup on error. |
| `with_memory_cache(key_fn)` | Hash-keyed in-memory cache. |
| `each()` | Apply per-element to a `Vec<I>`. |
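The wrapper pattern can be illustrated with a toy fallible variant of the trait: a retry wrapper returns a value that is itself a Runnable, so wrappers stack. This is a sketch under assumed names and signatures, not the `RunnableExt` implementation:

```rust
use std::cell::Cell;

// Toy fallible variant of the trait, for illustration only.
trait Runnable<I, O> {
    fn invoke(&self, input: I) -> Result<O, String>;

    // Extension-style method: wrap self in a retry layer.
    fn with_max_retries(self, n: usize) -> Retry<Self>
    where
        Self: Sized,
    {
        Retry { inner: self, max_retries: n }
    }
}

// The wrapper is itself a Runnable, so wrappers compose.
struct Retry<R> {
    inner: R,
    max_retries: usize,
}

impl<I: Clone, O, R: Runnable<I, O>> Runnable<I, O> for Retry<R> {
    fn invoke(&self, input: I) -> Result<O, String> {
        let mut last_err = String::new();
        // One initial attempt plus max_retries retries.
        for _ in 0..=self.max_retries {
            match self.inner.invoke(input.clone()) {
                Ok(o) => return Ok(o),
                Err(e) => last_err = e,
            }
        }
        Err(last_err)
    }
}

// A flaky step: fails a fixed number of times, then succeeds.
struct Flaky {
    failures_left: Cell<usize>,
}

impl Runnable<i32, i32> for Flaky {
    fn invoke(&self, input: i32) -> Result<i32, String> {
        if self.failures_left.get() > 0 {
            self.failures_left.set(self.failures_left.get() - 1);
            Err("transient".into())
        } else {
            Ok(input * 10)
        }
    }
}

fn main() {
    let step = Flaky { failures_left: Cell::new(2) }.with_max_retries(3);
    assert_eq!(step.invoke(4), Ok(40));
}
```

Because `Retry<R>` implements the trait again, a timeout or fallback layer could wrap it in turn, which is the composability claim the table above makes.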
## Streaming events
Every Runnable can emit a structured event stream — useful for trace UIs, progress bars, and the observability stack. Event kinds: `OnStart`, `OnEnd`, `OnError`, `OnNodeStart`, `OnNodeEnd`, `OnLlmToken`, `OnToolStart`, `OnToolEnd`, `OnCheckpoint`, `Custom`.
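A sketch of what consuming such a stream can look like, using a simplified event enum (only the variant names come from the list above; the payloads and the `render` helper are assumptions for illustration):

```rust
// Simplified stand-in for the event stream (a subset of the
// variants listed above, with assumed payload types).
enum Event {
    OnStart,
    OnLlmToken(String),
    OnToolStart(String),
    OnToolEnd(String),
    OnEnd,
}

// E.g. a trace UI that only cares about a few event kinds.
fn render(events: &[Event]) -> String {
    let mut out = String::new();
    for e in events {
        match e {
            Event::OnStart => out.push('['),
            Event::OnLlmToken(t) => out.push_str(t),
            Event::OnToolStart(name) => out.push_str(&format!("<{name}>")),
            Event::OnToolEnd(name) => out.push_str(&format!("</{name}>")),
            Event::OnEnd => out.push(']'),
        }
    }
    out
}

fn main() {
    let events = vec![
        Event::OnStart,
        Event::OnLlmToken("hi ".into()),
        Event::OnToolStart("search".into()),
        Event::OnToolEnd("search".into()),
        Event::OnEnd,
    ];
    assert_eq!(render(&events), "[hi <search></search>]");
}
```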
## How it works
- `invoke` is the only required method. Everything else has a default that calls it.
- `stream_events` is the source of trace data. Observers attached via `RunnableConfig::with_observer` see every event.
- Composition returns concrete types. `pipe(a, b)` is `Pipe<A, B, …>`, not a trait object — no boxing in the hot path.
- Wrappers are wrappers. `with_retry` returns `Retry<Self, …>`, which is itself a Runnable. They compose.
## See also
- Messages and content — what flows through chat-shaped Runnables.
- Building agents → Streaming — `stream` for tokens, `stream_events` for structure.
- Reference → cognis-core — the full method list and provided defaults.