Full Observability
for LLM Apps.

Real-time tracing, live metrics, cost tracking, and production monitoring. The observability platform for AI agents.

Dashboard preview: localhost:3001/trace/0x92f...

Total Runs: 1,247
Avg Latency: 342ms
Total Tokens: 2.4M
Error Rate: 0.3%

Active Sessions
- Research Agent (active 1m ago)
- RAG Pipeline (active 2m ago)
- Analyst Tool (active 3m ago)

Trace Tree Explorer
- AGENT research_coordinator (2.3s)
  - LLM gpt-4o-mini (1.8s)
  - TOOL web_scraper (0.4s)
Quick Start

Zero to Trace in seconds.

One import, one function call. Pure simplicity.

main.ts
import { collector, trace } from '@orka-js/collector';

// Start the collector dashboard
const { tracer } = await collector({ port: 3001 });

// Wrap your agent calls
const result = await trace.wrap('agent', 'research', async () => {
  return agent.run('Analyze market trends');
});

// → Open http://localhost:3001 to see traces
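Conceptually, a wrapper like `trace.wrap` times the wrapped call and records a span whether it succeeds or fails. A minimal standalone sketch of that idea (illustrative only; not the @orka-js/collector implementation):

```typescript
// Minimal sketch of a timed trace wrapper. Illustrative only:
// this is NOT the @orka-js/collector implementation.
type Span = { kind: string; name: string; durationMs: number; error?: string };

const spans: Span[] = [];

async function wrap<T>(kind: string, name: string, fn: () => Promise<T>): Promise<T> {
  const start = Date.now();
  let error: string | undefined;
  try {
    // Run the wrapped work and pass its result through unchanged.
    return await fn();
  } catch (err) {
    error = String(err);
    throw err;
  } finally {
    // Record a span whether the call succeeded or failed.
    spans.push({ kind, name, durationMs: Date.now() - start, error });
  }
}
```

Because the result is passed through unchanged, wrapping a call is non-invasive: existing code keeps working, and each invocation leaves behind one span.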

Everything you need.
Nothing you don't.

Trace Viewer

Visualize execution trees with nested runs, latency, and costs.

Live Metrics

Monitor latency, token usage, error rates, and costs in real-time.

Remote Tracing

Send traces from production to a central collector. Debug live issues.
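One way to picture remote tracing is batching spans into JSON and POSTing them to the central collector. A hedged sketch; the `/ingest` route and payload shape here are assumptions for illustration, not the documented @orka-js/collector wire protocol:

```typescript
// Illustrative only: the /ingest route and the payload shape are
// assumptions for this sketch, not the collector's documented protocol.
type RemoteSpan = { kind: string; name: string; startedAt: number; durationMs: number };

function toPayload(spans: RemoteSpan[]): string {
  // Batch spans into a single JSON document.
  return JSON.stringify({ spans, sentAt: Date.now() });
}

async function ship(collectorUrl: string, spans: RemoteSpan[]): Promise<void> {
  // POST the batch to the central collector (hypothetical endpoint).
  await fetch(`${collectorUrl}/ingest`, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: toPayload(spans),
  });
}
```

Batching keeps the production hot path cheap: spans accumulate locally and ship in the background, so a slow or unreachable collector never blocks an agent run.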

Session Tracking

Organize traces into sessions and export execution history.
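Session tracking boils down to grouping spans by a session identifier. A small sketch of that grouping; the `sessionId` field name is an assumption, not the collector's actual data model:

```typescript
// Illustrative grouping of spans into sessions. The sessionId field
// name is an assumption, not the collector's actual data model.
type SessionSpan = { sessionId: string; name: string; durationMs: number };

function bySession(spans: SessionSpan[]): Map<string, SessionSpan[]> {
  const sessions = new Map<string, SessionSpan[]>();
  for (const span of spans) {
    // Append each span to its session's bucket, creating it on first sight.
    const group = sessions.get(span.sessionId) ?? [];
    group.push(span);
    sessions.set(span.sessionId, group);
  }
  return sessions;
}
```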

SSE Streaming

See traces appear in real-time as your agents execute.

Zero Config

One import, one function call. Works with all OrkaJS components.

Orka Ecosystem

Built for the Full Stack.

ReAct Agent

Autonomous reasoning loops

RAG Pipelines

Context-aware knowledge retrieval

StateGraph

Complex workflow orchestration

Multi-Agent

Collaborative agent systems

Workflows

Sequential task automation

Tools

External API & function calling

Ready to observe your AI apps?

Install @orka-js/collector and start visualizing your LLM traces in minutes.

$ npm install @orka-js/collector
Read the docs