Production-hardened infrastructure. The model is your decision.
OpenAI, Anthropic, Google, Groq, Azure OpenAI, Ollama, or any OpenAI-compatible endpoint. One environment variable selects the provider. The session engine, tool graph, and skill runner behave identically regardless of which model is underneath.
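A minimal sketch of what env-driven provider selection could look like. The variable name `INFERENCE_PROVIDER`, the provider ids, and the default are illustrative assumptions, not the SDK's documented configuration keys.

```typescript
// Hypothetical sketch: resolve a provider id from the environment.
// INFERENCE_PROVIDER and the id strings are illustrative names only.
type ProviderId =
  | "openai" | "anthropic" | "google" | "groq"
  | "azure-openai" | "ollama" | "openai-compatible";

const KNOWN: ProviderId[] = [
  "openai", "anthropic", "google", "groq",
  "azure-openai", "ollama", "openai-compatible",
];

function resolveProvider(env: Record<string, string | undefined>): ProviderId {
  // Assumed default; swap the fallback for whatever your deployment uses.
  const raw = (env["INFERENCE_PROVIDER"] ?? "openai").toLowerCase();
  if ((KNOWN as string[]).includes(raw)) return raw as ProviderId;
  throw new Error(`Unknown inference provider: ${raw}`);
}
```

Because the session engine sits above this switch, everything downstream of `resolveProvider` is provider-agnostic by construction.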
Execute plain Markdown skill files — the same format as Claude Code skills. Multi-step workflows activate with a single API call; the kernel drives each step and advances automatically. Skills are yours. They travel to any model.
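An illustrative skill file, for shape only. The YAML frontmatter fields follow the Claude Code skill convention; the skill name and steps here are invented examples, not shipped skills.

```markdown
---
name: changelog-writer
description: Draft a changelog entry from recent commits
---

# Changelog Writer

1. Ask the user for the release tag.
2. Summarize the commits since the previous tag.
3. Write the entry and present it for review.
```

The kernel reads the numbered steps and advances through them automatically once the skill is activated.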
Postgres-backed conversation history, resumable across requests. Hybrid vector + keyword retrieval over session context. Sessions carry their own tool graph, persona config, and skill state — persistent and portable.
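One common way to merge a vector ranking with a keyword ranking is reciprocal rank fusion (RRF); the sketch below shows that technique as an assumption, since the SDK's actual fusion strategy is not specified here.

```typescript
// Minimal hybrid-retrieval sketch using reciprocal rank fusion (RRF).
// Inputs are two ranked lists of document ids, best first.
function fuseRankings(
  vectorRanked: string[],
  keywordRanked: string[],
  k = 60, // conventional RRF damping constant
): string[] {
  const score = new Map<string, number>();
  for (const ranked of [vectorRanked, keywordRanked]) {
    ranked.forEach((id, rank) => {
      // Each list contributes 1 / (k + position), so agreement between
      // the two rankings lifts a document toward the top.
      score.set(id, (score.get(id) ?? 0) + 1 / (k + rank + 1));
    });
  }
  return [...score.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}
```

A document ranked well by both retrievers beats one ranked first by only a single retriever, which is the usual motivation for hybrid search over session context.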
Full MCP client over SSE, streamable HTTP, and stdio. Register any MCP server at runtime — the kernel discovers tools automatically and binds them into the session tool graph. Your domain layer stays outside the kernel.
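The binding step can be pictured as a merge into the session's tool graph. The descriptor shape, the namespacing scheme, and the function name below are assumptions for illustration, not the SDK's real API.

```typescript
// Illustrative sketch: bind tools discovered from an MCP server into a
// session tool graph. Shapes and names are assumed, not the actual API.
interface ToolDescriptor {
  name: string;
  description: string;
}

function bindDiscoveredTools(
  toolGraph: Map<string, ToolDescriptor>,
  serverId: string,
  discovered: ToolDescriptor[],
): Map<string, ToolDescriptor> {
  for (const tool of discovered) {
    // Namespace by server id so two servers can expose the same tool name
    // without colliding in the session's graph.
    toolGraph.set(`${serverId}/${tool.name}`, tool);
  }
  return toolGraph;
}
```

Registering a second server is then just another call with a different `serverId`; the session's graph accumulates both.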
Multi-stage Docker build — compiled output only, no source in the runtime image. Mount your skills directory at runtime. Pair with on-prem Ollama for a fully offline stack. Designed for environments where data sovereignty, network isolation, or regulatory compliance is non-negotiable.
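A sketch of the multi-stage pattern described above. The base images, paths, and entrypoint are assumptions about a typical Node-based layout, not the shipped container's actual contents.

```dockerfile
# Illustrative multi-stage build: only compiled output reaches the
# runtime image. Image names and paths are assumptions.
FROM node:20 AS build
WORKDIR /app
COPY . .
RUN npm ci && npm run build          # emits compiled output to /app/dist

FROM node:20-slim
WORKDIR /app
COPY --from=build /app/dist ./dist   # compiled output only; no source
COPY --from=build /app/node_modules ./node_modules
VOLUME /skills                        # mount your skills directory here
CMD ["node", "dist/server.js"]
```

At run time, something like `docker run -v "$PWD/skills:/skills" <image>` mounts skills without rebuilding, and pointing the provider config at a local Ollama keeps the whole stack offline.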
Token-level streaming with mid-stream tool invocation. Server-sent events over a single persistent connection. Built-in tool start/result/done event types. Resilient — guards against write-after-close across all tool paths.
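The event stream above can be sketched as plain SSE frames. The event names mirror the tool start/result/done types the text mentions; the payload shape is an assumption.

```typescript
// Minimal sketch of the streaming channel's wire format. Each event
// becomes one SSE frame: an `event:` line, a `data:` line, a blank line.
type StreamEvent =
  | { type: "token"; text: string }
  | { type: "tool_start"; tool: string }
  | { type: "tool_result"; tool: string; result: unknown }
  | { type: "done" };

function toSseFrame(ev: StreamEvent): string {
  return `event: ${ev.type}\ndata: ${JSON.stringify(ev)}\n\n`;
}
```

Mid-stream tool invocation then reduces to interleaving `tool_start` and `tool_result` frames between `token` frames on the same connection, with `done` closing the turn.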
Agent behavior is identical regardless of which model is underneath.
The SDK is the runtime layer: session engine, tool graph, skill runner, streaming. Domain intelligence lives above it. Chain-of-Provenance, Constellation, Auto Analyst, and the AI Scientist expert network are Universitas AI platform features.
This is intentional. The agent engine runs anywhere. The platform is where primary sources, expert networks, and provenance chains live. Use both together for the full stack, or the SDK alone against your own data layer.
Contact us to discuss platform access alongside SDK licensing.
Closed distribution. Issued on request. Source is never included in the container SKU.
All tiers include open interface specifications: REST API, SKILL.md format, MCP registration, and InferenceProvider config schema.
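For shape, an InferenceProvider config might look like the following. The field names are inferred from the providers listed above and are assumptions, not the published schema.

```typescript
// Illustrative config shape; field names are assumptions, not the
// SDK's published InferenceProvider schema.
interface InferenceProviderConfig {
  provider: string;   // e.g. "openai", "anthropic", "ollama"
  model: string;      // provider-specific model id
  baseUrl?: string;   // for Ollama or any OpenAI-compatible endpoint
  apiKeyEnv?: string; // name of the env var holding the credential
}

// Example: a fully local, offline-friendly configuration.
const local: InferenceProviderConfig = {
  provider: "ollama",
  model: "llama3.1",
  baseUrl: "http://localhost:11434",
};
```

Swapping `local` for a hosted provider changes only this object; the session engine, tool graph, and skill runner are untouched.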