
Documentation Index

Fetch the complete documentation index at: https://cognis.vasanth.xyz/llms.txt

Use this file to discover all available pages before exploring further.

cognis-llm defines the LLMProvider trait and ships clients for the major vendors. It also defines the Tool trait that the agent layer dispatches.

Crate metadata

Field | Value
--- | ---
Latest version | 0.3
docs.rs | docs.rs/cognis-llm
Repo path | crates/cognis-llm
Default features | openai, ollama

Modules at a glance

Module | What
--- | ---
client | Client, ClientBuilder. The provider-agnostic surface.
provider | LLMProvider trait, Provider enum, plus per-vendor builders (openai::OpenAIBuilder, anthropic::AnthropicBuilder, etc.).
chat | ChatOptions, ChatResponse, Usage, StreamChunk, HealthStatus.
streaming | Aggregated, StreamAggregator, UsageTracker.
tools | Tool (alias BaseTool), SchemaBasedTool, ToolDefinition, ToolInput, ToolOutput, ToolRegistry.

Key types

Client

pub struct Client { /* … */ }

impl Client {
    pub fn from_env() -> Result<Self>;
    pub fn builder() -> ClientBuilder;
    pub fn new(provider: Arc<dyn LLMProvider>) -> Self;

    pub fn provider(&self) -> &dyn LLMProvider;

    pub async fn invoke(&self, messages: Vec<Message>) -> Result<Message>;
    pub async fn stream(&self, messages: Vec<Message>) -> Result<RunnableStream<StreamChunk>>;
    pub async fn chat(&self, messages: Vec<Message>, opts: ChatOptions) -> Result<ChatResponse>;
    pub async fn invoke_with_tools(&self, messages: Vec<Message>, tools: &[Arc<dyn Tool>]) -> Result<Message>;
}
Client implements Runnable<Vec<Message>, Message> — wrap with RunnableExt methods for retry / timeout / fallback / cache.
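
A minimal usage sketch of the surface above. Only Client::from_env and Client::invoke come from this page; the cognis_core import path, the Message::user constructor, and the tokio/anyhow scaffolding are assumptions.

use cognis_llm::client::Client;
// Assumption: Message lives in the companion core crate and exposes a `user` constructor.
use cognis_core::Message;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Reads provider, API key, and model from environment variables.
    let client = Client::from_env()?;

    // Single-turn call: `invoke` maps Vec<Message> -> Message, matching the Runnable impl.
    let reply = client
        .invoke(vec![Message::user("Summarise this crate in one sentence.")])
        .await?;
    println!("{reply:?}");
    Ok(())
}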

ClientBuilder

Method | Purpose
--- | ---
provider(Provider) | One of OpenAi, Anthropic, Google, Ollama, Azure, OpenRouter.
api_key(String) | Provider key.
base_url(String) | Override the API base.
model(String) | Default model name.
timeout_secs(u64) | HTTP timeout.
organization(String) | OpenAI org id.
azure_endpoint(String) / azure_deployment(String) / azure_api_version(String) | Azure-specific.
build() | Result<Client>.
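
A sketch of a typical builder chain using the methods in the table; the Provider variant name follows the table, while the key and model strings are placeholders.

use cognis_llm::client::Client;
use cognis_llm::provider::Provider;

fn openai_client() -> Client {
    Client::builder()
        .provider(Provider::OpenAi)           // variant names follow the table above
        .api_key("sk-...".to_string())        // placeholder key
        .model("gpt-4o-mini".to_string())     // illustrative model name
        .timeout_secs(30)
        .build()
        .expect("builder options should be valid")
}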

LLMProvider

#[async_trait]
pub trait LLMProvider: Send + Sync {
    fn name(&self) -> &str;
    fn provider_type(&self) -> Provider;
    async fn chat_completion(&self, messages: Vec<Message>, opts: ChatOptions) -> Result<ChatResponse>;
    async fn chat_completion_stream(&self, messages: Vec<Message>, opts: ChatOptions) -> Result<RunnableStream<StreamChunk>>;
    async fn chat_completion_with_tools(&self, messages: Vec<Message>, tools: Vec<ToolDefinition>, opts: ChatOptions) -> Result<ChatResponse>;
    async fn health_check(&self) -> Result<HealthStatus>;
}
Implement this for custom backends — internal gateways, mock providers in tests, self-hosted runtimes.
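
A sketch of a test double implementing the trait above. The Message, RunnableStream, and Result paths are assumptions, and since this page does not show how ChatResponse, StreamChunk, or HealthStatus are constructed, the method bodies are left as placeholders.

use async_trait::async_trait;
use cognis_llm::chat::{ChatOptions, ChatResponse, HealthStatus, StreamChunk};
use cognis_llm::provider::{LLMProvider, Provider};
use cognis_llm::tools::ToolDefinition;
// Assumption: Message, RunnableStream, and the Result alias are re-exported by the
// companion core crate; adjust the paths to wherever they actually live.
use cognis_core::{Message, Result, RunnableStream};

struct MockProvider;

#[async_trait]
impl LLMProvider for MockProvider {
    fn name(&self) -> &str { "mock" }
    fn provider_type(&self) -> Provider { Provider::Ollama } // any variant works for a stub

    async fn chat_completion(&self, _messages: Vec<Message>, _opts: ChatOptions) -> Result<ChatResponse> {
        todo!("return a canned ChatResponse")
    }

    async fn chat_completion_stream(&self, _messages: Vec<Message>, _opts: ChatOptions) -> Result<RunnableStream<StreamChunk>> {
        todo!("return a single-chunk stream")
    }

    async fn chat_completion_with_tools(&self, _messages: Vec<Message>, _tools: Vec<ToolDefinition>, _opts: ChatOptions) -> Result<ChatResponse> {
        todo!("behave like chat_completion and ignore the tools")
    }

    async fn health_check(&self) -> Result<HealthStatus> {
        todo!("report a healthy status")
    }
}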

Provider builders

Each lives under cognis_llm::provider::* and returns a value that’s wrapped into a Client via Client::new(Arc::new(provider)).
  • openai::OpenAIBuilder: api_key, base_url, model, timeout_secs, organization.
  • anthropic::AnthropicBuilder: api_key, base_url, model, timeout_secs.
  • google::GoogleBuilder: api_key, base_url, model, timeout_secs.
  • ollama::OllamaBuilder: base_url, model, timeout_secs.
  • azure::AzureBuilder: endpoint, deployment, api_version, api_key, timeout_secs.
  • openrouter::OpenRouterBuilder: api_key, model, extra_header(name, value).
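
A sketch of the build-then-wrap pattern described above, using the Ollama builder. Only the setter names come from the list; the new()/build() shape, the URL, and the model name are assumptions.

use std::sync::Arc;
use cognis_llm::client::Client;
use cognis_llm::provider::ollama::OllamaBuilder;

fn local_client() -> Client {
    // Assumption: the vendor builders follow the conventional new()/build() shape.
    let provider = OllamaBuilder::new()
        .base_url("http://localhost:11434".to_string())
        .model("llama3.1".to_string())
        .timeout_secs(60)
        .build()
        .expect("Ollama configuration should be valid");

    // Client::new accepts any Arc<dyn LLMProvider>.
    Client::new(Arc::new(provider))
}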

Tool trait

#[async_trait]
pub trait Tool: Send + Sync {
    fn name(&self) -> &str;
    fn description(&self) -> &str;
    fn args_schema(&self) -> Option<serde_json::Value>;
    fn return_direct(&self) -> bool { false }
    async fn _run(&self, input: ToolInput) -> Result<ToolOutput>;
}

pub use Tool as BaseTool;
SchemaBasedTool is a convenience layer: declare type Params: JsonSchema, implement execute_typed, and get a Tool impl for free.
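
For comparison, a hand-rolled implementation of the Tool trait above. The Result alias location and the way a ToolOutput is constructed are not shown on this page, so they are hedged as assumptions and a placeholder.

use async_trait::async_trait;
use cognis_llm::tools::{Tool, ToolInput, ToolOutput};
// Assumption: the crate re-exports its Result alias at the root.
use cognis_llm::Result;

struct EchoTool;

#[async_trait]
impl Tool for EchoTool {
    fn name(&self) -> &str { "echo" }
    fn description(&self) -> &str { "Returns the provided text unchanged." }

    fn args_schema(&self) -> Option<serde_json::Value> {
        Some(serde_json::json!({
            "type": "object",
            "properties": { "text": { "type": "string" } },
            "required": ["text"]
        }))
    }

    async fn _run(&self, input: ToolInput) -> Result<ToolOutput> {
        // ToolInput/ToolOutput constructors are not documented here; left as a placeholder.
        let _ = input;
        todo!("convert `input` into a ToolOutput echoing the text back")
    }
}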

ToolRegistry

pub struct ToolRegistry { /* … */ }

impl ToolRegistry {
    pub fn new() -> Self;
    pub fn register(&mut self, tool: Arc<dyn Tool>);
    pub fn register_alias(&mut self, alias: impl Into<String>, name: &str);
    pub fn get(&self, name: &str) -> Option<&Arc<dyn Tool>>;
    pub fn definitions(&self) -> Vec<ToolDefinition>;
    pub async fn execute(&self, name: &str, input: ToolInput) -> Result<ToolOutput>;
    // …
}
The agent’s tool dispatcher uses a ToolRegistry internally; you usually don’t construct one directly.
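
For the rare case where you do drive a registry by hand, a short sketch using the methods listed above; EchoTool is the hypothetical tool from the earlier sketch.

use std::sync::Arc;
use cognis_llm::tools::ToolRegistry;

fn make_registry() -> ToolRegistry {
    let mut registry = ToolRegistry::new();
    registry.register(Arc::new(EchoTool));
    registry.register_alias("repeat", "echo");

    // `definitions()` yields the Vec<ToolDefinition> that provider-level calls
    // such as chat_completion_with_tools expect.
    assert!(!registry.definitions().is_empty());
    registry
}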

Feature flags

Feature | Pulls in
--- | ---
openai | reqwest, secrecy, OpenAI client (default).
anthropic | Anthropic Messages client.
google | Gemini client.
ollama | Ollama client (default).
azure | Azure OpenAI client.
all-providers | All of the above.

See also

Models and providers

User-facing guide for Client and the provider builders.

Tools

Defining tools with Tool and SchemaBasedTool.