Anthropic Integration
Automatic tracing for Anthropic Claude API calls.
Quick Start
```python
import tracium
import anthropic

# Enable auto-instrumentation
tracium.trace()

# Use Anthropic normally - all calls are traced
client = anthropic.Anthropic()

message = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude!"}
    ]
)
```

What Gets Captured
- Input messages - The messages array sent to Claude
- System prompt - If provided
- Model - claude-3-opus, claude-3-sonnet, etc.
- Output - The response content
- Token usage - Input and output tokens
- Latency - API call duration
- Stop reason - Why generation stopped
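Most of these values map directly onto fields of the Anthropic `Message` response, which makes it easy to cross-check a trace against the raw API output. A minimal sketch, reusing the `message` object from the Quick Start example above:

```python
# Response fields that correspond to the captured trace data
print(message.model)                # model, e.g. "claude-3-opus-20240229"
print(message.usage.input_tokens)   # input token count
print(message.usage.output_tokens)  # output token count
print(message.stop_reason)          # e.g. "end_turn", "max_tokens", "tool_use"
print(message.content[0].text)      # generated text of the first content block
```

Latency is the exception: it is not part of the response, so it is recorded as the duration of the API call itself.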
Streaming
```python
import tracium
import anthropic

tracium.trace()
client = anthropic.Anthropic()

# Streaming is fully supported
with client.messages.stream(
    model="claude-3-sonnet-20240229",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a story."}]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```
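If you also need the assembled response after streaming (for example to inspect token usage yourself), the Anthropic SDK's stream helper exposes `get_final_message()`. A minimal sketch, assuming the same setup as above; once the stream completes, usage and stop reason are available just as in the non-streaming case:

```python
with client.messages.stream(
    model="claude-3-sonnet-20240229",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a story."}]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
    # The fully assembled Message, including usage and stop_reason
    final_message = stream.get_final_message()

print(final_message.usage.output_tokens)
```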
Async Support

```python
import tracium
import anthropic
import asyncio

tracium.trace()
client = anthropic.AsyncAnthropic()

async def chat(prompt: str) -> str:
    message = await client.messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}]
    )
    return message.content[0].text

result = asyncio.run(chat("Explain quantum computing."))
```
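Concurrent requests work as well: each call made inside `asyncio.gather` is traced like any other. A short sketch reusing the `chat` helper defined above; the prompt list is purely illustrative:

```python
async def main() -> list[str]:
    prompts = ["Explain recursion.", "Explain list comprehensions."]
    # Each concurrent request is traced as its own call
    return await asyncio.gather(*(chat(p) for p in prompts))

results = asyncio.run(main())
```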
Tool Use

```python
import tracium
import anthropic

tracium.trace()
client = anthropic.Anthropic()

# Tool definitions
tools = [
    {
        "name": "get_weather",
        "description": "Get the current weather",
        "input_schema": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City name"
                }
            },
            "required": ["location"]
        }
    }
]

# Tool use is captured in the trace
message = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1024,
    tools=tools,
    messages=[
        {"role": "user", "content": "What's the weather in Paris?"}
    ]
)
```
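The example above only defines and sends the tools. In a full tool-use loop you would check the stop reason, execute the tool yourself, and send a `tool_result` back to Claude; the follow-up call goes through the same instrumented client, so it is traced as well. A minimal sketch, where the weather string is a hardcoded placeholder standing in for your real tool implementation:

```python
if message.stop_reason == "tool_use":
    tool_use = next(b for b in message.content if b.type == "tool_use")

    # Placeholder result - in practice, call your real get_weather implementation
    tool_output = f"Sunny, 22°C in {tool_use.input['location']}"

    follow_up = client.messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=1024,
        tools=tools,
        messages=[
            {"role": "user", "content": "What's the weather in Paris?"},
            {"role": "assistant", "content": message.content},
            {
                "role": "user",
                "content": [
                    {
                        "type": "tool_result",
                        "tool_use_id": tool_use.id,
                        "content": tool_output,
                    }
                ],
            },
        ],
    )
```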
Multi-Turn Conversations

```python
import tracium
import anthropic

client = tracium.init()
anthropic_client = anthropic.Anthropic()

with client.agent_trace(agent_name="claude-chat") as trace:
    messages = []

    # First turn
    messages.append({"role": "user", "content": "Hi, I'm learning Python."})
    response = anthropic_client.messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=1024,
        messages=messages
    )
    messages.append({"role": "assistant", "content": response.content[0].text})

    # Second turn - all calls are linked in the same trace
    messages.append({"role": "user", "content": "Can you show me a loop example?"})
    response = anthropic_client.messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=1024,
        messages=messages
    )
```