
Visual Debugging for LLM Apps

Real-time tracing, prompt versioning, and cost observability. Like Chrome DevTools, but for your AI agents.

[Dashboard preview at localhost:3001/trace/0x92f...: summary cards (Total Runs 1,247 · Avg Latency 342ms · Total Tokens 2.4M · Error Rate 0.3%), an Active Sessions list (Research Agent, RAG Pipeline, Analyst Tool), and a Trace Tree Explorer showing an AGENT research_coordinator span (2.3s) with nested LLM gpt-4o-mini (1.8s) and TOOL web_scraper (0.4s) spans.]

Start in seconds

One import, one function call. That's it.

import { devtools, trace } from '@orka-js/devtools';

// Start the DevTools dashboard
await devtools({ port: 3001 });

// Wrap your agent calls
const result = await trace.wrap('agent', 'research', async () => {
  return agent.run("Analyze market trends");
});

// → Open http://localhost:3001 to see traces
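The nested trace tree in the dashboard preview (agent → LLM call → tool call) comes from nesting wrapped calls. A minimal sketch of that nesting, runnable standalone: the local `trace.wrap` stub here stands in for the real import from `@orka-js/devtools`, and the `'llm'`/`'tool'` kind strings are assumptions modeled on the `'agent'` kind from the quickstart, not documented values.

```typescript
// In a real app: import { trace } from '@orka-js/devtools';
// Local stub so this sketch runs standalone — it just times and logs each span.
const trace = {
  async wrap<T>(kind: string, name: string, fn: () => Promise<T>): Promise<T> {
    const start = Date.now();
    const result = await fn();
    console.log(`[${kind}] ${name} finished in ${Date.now() - start}ms`);
    return result;
  },
};

// Nest wraps inside an outer agent span to build the parent/child trace tree
async function research(): Promise<string> {
  return trace.wrap('agent', 'research_coordinator', async () => {
    const summary = await trace.wrap('llm', 'gpt-4o-mini', async () => {
      return 'summary of market trends'; // stand-in for a real model call
    });
    const page = await trace.wrap('tool', 'web_scraper', async () => {
      return '<html>...</html>'; // stand-in for a real scrape
    });
    return `${summary} (${page.length} bytes scraped)`;
  });
}

research().then(console.log);
```

Each inner `wrap` runs inside the outer callback, so the dashboard can attribute the LLM and tool spans to the coordinating agent span.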

Everything you need to debug LLM apps

Professional-grade observability tools, built for AI developers

Trace Viewer

Visualize execution trees with nested runs, latency, tokens, and costs for every LLM call.

Real-time Metrics

Monitor latency, token usage, error rates, and costs across all your agents and workflows.

Replay Debugging

Replay any trace with modified inputs to debug issues and test improvements.

Session Management

Organize traces into sessions, export/import data, and track execution history.

SSE Streaming

Watch traces appear in real time over Server-Sent Events as your agents execute.

Zero Config

One line to start. Works with all OrkaJS components: agents, RAG, graphs, workflows.
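Server-Sent Events, which the live trace stream relies on, are plain-text frames: `event:` and `data:` lines terminated by a blank line. A short sketch of parsing one frame; the `run.completed` event name and JSON payload shape are illustrative assumptions, not the documented stream format.

```typescript
// One SSE frame as the dashboard stream might emit it (payload shape is illustrative)
const frame =
  'event: run.completed\ndata: {"name":"research_coordinator","latencyMs":2300}\n\n';

// Parse a single frame into its event name and JSON data payload
function parseSseFrame(raw: string): { event: string; data: unknown } {
  let event = 'message'; // SSE default when no event: line is present
  const dataLines: string[] = [];
  for (const line of raw.split('\n')) {
    if (line.startsWith('event:')) event = line.slice(6).trim();
    else if (line.startsWith('data:')) dataLines.push(line.slice(5).trim());
  }
  return { event, data: JSON.parse(dataLines.join('\n')) };
}

const parsed = parseSseFrame(frame);
console.log(parsed.event); // → "run.completed"
```

In the browser, `EventSource` handles this parsing for you; the sketch just shows why the stream needs no extra client tooling.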

Works with your stack

Integrates seamlessly with the entire OrkaJS ecosystem

🤖 ReAct Agent · 📚 RAG Pipelines · 🔀 StateGraph · 👥 Multi-Agent · 🧠 OpenAI · 🔮 Anthropic · ⚙️ Workflows · 🛠️ Tools

Ready to debug your AI apps?

Install @orka-js/devtools and start visualizing your LLM traces in minutes.

$ npm install @orka-js/devtools
Read the docs