v1.0.1 - Production Ready

Automatic Tracing for LLM Applications

Add observability to your AI applications with one line of code. Tracium automatically traces OpenAI, Anthropic, Google AI, LangChain, and more.

Quick Install

pip install tracium

One-Line Setup

Add tracing to your entire application with a single function call:

app.py
import tracium

# Initialize with auto-instrumentation
tracium.trace(api_key="sk_live_...")

# That's it! All LLM calls are now traced
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)
# ^ This call is automatically traced

Supported Integrations

Tracium automatically detects and instruments these libraries:

OpenAI
Anthropic
Google AI
LangChain
LangGraph
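Auto-instrumentation of this kind generally works by wrapping the SDK's call methods at setup time, so every call is recorded without code changes. Below is a minimal, stdlib-only sketch of that pattern; `FakeClient`, `instrument`, and `captured` are hypothetical illustrations, not Tracium's actual internals.

```python
import functools

class FakeClient:
    """Stand-in for an LLM SDK client (hypothetical, for illustration)."""
    def create(self, model, messages):
        return {"model": model, "reply": "Hello back!"}

captured = []

def instrument(cls, method_name, recorder):
    """Wrap cls.method_name so every call is recorded before returning."""
    original = getattr(cls, method_name)

    @functools.wraps(original)
    def wrapper(self, *args, **kwargs):
        result = original(self, *args, **kwargs)
        recorder.append({"method": method_name, "kwargs": kwargs, "result": result})
        return result

    setattr(cls, method_name, wrapper)

# Patch the class once; every FakeClient instance is now traced.
instrument(FakeClient, "create", captured)

client = FakeClient()
client.create(model="gpt-4", messages=[{"role": "user", "content": "Hello!"}])
```

Because the wrapper is installed on the class, existing and future client instances are both covered, which is why a single setup call at import time is enough.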

Manual Tracing

For more control, use explicit traces and spans:

manual_tracing.py
import tracium

client = tracium.init(api_key="sk_live_...")

with client.agent_trace(agent_name="support-bot") as trace:
    # Record a planning span
    with trace.span(span_type="plan", name="analyze_request") as span:
        query = "How do I reset my password?"
        span.record_input({"query": query})
        plan = analyze_user_request(query)
        span.record_output({"plan": plan})

    # Record an LLM call span
    with trace.span(span_type="llm", name="generate_response") as span:
        span.record_input({"prompt": plan})
        response = generate_response(plan)
        span.record_output({"response": response})
        span.set_token_usage(input_tokens=150, output_tokens=200)
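The example above assumes two application-level helpers, analyze_user_request and generate_response, which are not part of Tracium. A minimal stub sketch of what they might look like, purely for illustration:

```python
def analyze_user_request(query: str) -> str:
    """Hypothetical planner stub: pick a plan based on the user's query."""
    if "password" in query.lower():
        return "walk the user through the password-reset flow"
    return "route the request to a human agent"

def generate_response(plan: str) -> str:
    """Hypothetical responder stub: turn the plan into a user-facing reply."""
    return f"Sure! Here is what we'll do: {plan}."

plan = analyze_user_request("How do I reset my password?")
reply = generate_response(plan)
```

In a real application these would call your planning prompt and your LLM; the spans record whatever inputs and outputs you pass them, so the helpers can return any JSON-serializable data.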

Ready to get started?

Set up Tracium in your project in under 5 minutes.

View Installation Guide