The ZeroEval TypeScript SDK provides automatic tracing for popular AI libraries through the wrap() function.

OpenAI

Wrap your OpenAI client to automatically trace all API calls:
import { OpenAI } from 'openai';
import * as ze from 'zeroeval';

ze.init();
const openai = ze.wrap(new OpenAI());

// Chat completions are automatically traced
const completion = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }]
});

// Streaming is also automatically traced
const stream = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
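When consuming a stream like this, you often want the full response text as well as the incremental output. A minimal sketch, in plain TypeScript with no ZeroEval- or OpenAI-specific imports (the chunk shape simply mirrors the streaming format used above), that accumulates deltas into one string:

```typescript
// Accumulate streamed chat-completion deltas into the full response text.
// Any async iterable of chunks in this shape works, so the helper can be
// exercised without a live API call.
interface StreamChunk {
  choices: { delta?: { content?: string } }[];
}

async function collectStream(
  stream: AsyncIterable<StreamChunk>
): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content ?? '';
  }
  return text;
}

// A hand-rolled stream standing in for an API response:
async function* fakeStream(): AsyncIterable<StreamChunk> {
  for (const piece of ['Once ', 'upon ', 'a time.']) {
    yield { choices: [{ delta: { content: piece } }] };
  }
}
```

Because the wrapped client already traces the stream, a helper like this changes nothing about what ZeroEval records; it only makes the consumed text easier to reuse.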

Supported Methods

The OpenAI integration automatically traces:
  • chat.completions.create() (streaming and non-streaming)
  • embeddings.create()
  • images.generate(), images.edit(), images.createVariation()
  • audio.transcriptions.create(), audio.translations.create()

Vercel AI SDK

Wrap the Vercel AI SDK module to trace all AI operations:
import * as ai from 'ai';
import { openai } from '@ai-sdk/openai';
import * as ze from 'zeroeval';

ze.init();
const wrappedAI = ze.wrap(ai);

// Text generation
const { text } = await wrappedAI.generateText({
  model: openai('gpt-4'),
  prompt: 'Write a haiku about coding'
});

// Streaming
const { textStream } = await wrappedAI.streamText({
  model: openai('gpt-4'),
  messages: [{ role: 'user', content: 'Hello!' }]
});

for await (const delta of textStream) {
  process.stdout.write(delta);
}

// Structured output
import { z } from 'zod';

const { object } = await wrappedAI.generateObject({
  model: openai('gpt-4'),
  schema: z.object({
    name: z.string(),
    age: z.number()
  }),
  prompt: 'Generate a random person'
});

Supported Methods

The Vercel AI SDK integration automatically traces:
  • generateText(), streamText()
  • generateObject(), streamObject()
  • embed(), embedMany()
  • generateImage()
  • transcribe()
  • generateSpeech()

LangChain / LangGraph

Use the callback handler for LangChain and LangGraph applications:
import {
  ZeroEvalCallbackHandler,
  setGlobalCallbackHandler
} from 'zeroeval/langchain';

// Option 1: Set globally (recommended)
setGlobalCallbackHandler(new ZeroEvalCallbackHandler());

// All chain invocations are now automatically traced
const result = await chain.invoke({ topic: 'AI' });

import { ZeroEvalCallbackHandler } from 'zeroeval/langchain';

// Option 2: Per-invocation
const handler = new ZeroEvalCallbackHandler();
const result = await chain.invoke(
  { topic: 'AI' },
  { callbacks: [handler] }
);

Claude Agent SDK

Anthropic’s @anthropic-ai/claude-agent-sdk is natively traced. Each agent query() call becomes its own ZeroEval trace, and all turns in the same Claude conversation are grouped into one ZeroEval session via the claude_session_id attribute.
npm install @anthropic-ai/claude-agent-sdk
import * as ze from 'zeroeval';

ze.init();

const claudeAgentSdk = await import('@anthropic-ai/claude-agent-sdk');
const sdk = ze.wrapClaudeAgentSdk(claudeAgentSdk);

for await (const message of sdk.query({
  prompt: 'What files are in this directory?',
  options: { allowedTools: ['Bash', 'Glob'] },
})) {
  if ('result' in message && message.type === 'result') {
    console.log('Result:', message.result);
  }
}

Wrap only the query function

If you only need to trace query() calls, you can wrap the function directly:
import * as ze from 'zeroeval';
import { query } from '@anthropic-ai/claude-agent-sdk';

ze.init();

const tracedQuery = ze.wrapClaudeAgentQuery(query);

for await (const message of tracedQuery({
  prompt: 'List all TypeScript files',
  options: { allowedTools: ['Glob'] },
})) {
  if ('result' in message && message.type === 'result') {
    console.log('Result:', message.result);
  }
}
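If you only care about the final answer, the message loop can be factored into a small helper. A sketch in plain TypeScript: the message shape below is only what the snippets above rely on (`{ type: 'result', result: string }`); real Claude Agent SDK messages carry more fields.

```typescript
// Pull the final result string out of an agent message stream.
// The type here is deliberately minimal; it models only the fields
// the examples above inspect.
type AgentMessage = { type: string; result?: string };

async function finalResult(
  messages: AsyncIterable<AgentMessage>
): Promise<string | undefined> {
  let result: string | undefined;
  for await (const message of messages) {
    if (message.type === 'result' && 'result' in message) {
      result = message.result;
    }
  }
  return result;
}
```

Note that the loop still drains the whole stream, which is what drives the agent forward; short-circuiting on the first message would stop the query early.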

What gets traced

The Claude Agent SDK integration automatically captures:
  • Assistant output text and tool use summaries
  • Token usage, total cost, and stop reason
  • Permission decisions from canUseTool callbacks
  • Hook metadata
  • Stream event counts when partial messages are enabled
  • Session grouping across multi-turn conversations via claude_session_id
Only public Claude Agent SDK APIs are instrumented — private transport and protocol internals are not patched.
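To make the session-grouping behavior concrete, here is an illustrative sketch of the idea: per-turn traces that share a claude_session_id attribute end up in the same session. The record shape is hypothetical and not part of the SDK's API.

```typescript
// Illustrative only: group per-turn traces into sessions keyed by the
// claude_session_id attribute, mirroring how multi-turn conversations
// are grouped into one ZeroEval session.
interface TurnTrace {
  traceId: string;
  attributes: { claude_session_id: string };
}

function groupBySession(traces: TurnTrace[]): Map<string, TurnTrace[]> {
  const sessions = new Map<string, TurnTrace[]>();
  for (const t of traces) {
    const key = t.attributes.claude_session_id;
    const bucket = sessions.get(key) ?? [];
    bucket.push(t);
    sessions.set(key, bucket);
  }
  return sessions;
}
```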

Auto-Detection

The wrap() function automatically detects which client you’re wrapping:
import { OpenAI } from 'openai';
import * as ai from 'ai';
import * as ze from 'zeroeval';

ze.init();

// Automatically detected as OpenAI client
const openai = ze.wrap(new OpenAI());

// Automatically detected as Vercel AI SDK
const wrappedAI = ze.wrap(ai);

// Automatically detected as Claude Agent SDK
const claudeAgentSdk = await import('@anthropic-ai/claude-agent-sdk');
const sdk = ze.wrap(claudeAgentSdk);
If ze.init() hasn’t been called and ZEROEVAL_API_KEY is set in your environment, the SDK will automatically initialize when you first use wrap().
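The SDK's actual detection logic isn't documented here, but a plausible duck-typing sketch looks like the following. The property checks are assumptions based on each library's public surface, not ZeroEval's real implementation:

```typescript
// Hypothetical sketch of duck-typed client detection; the real wrap()
// may differ. Each branch probes for a distinguishing export of the
// target object or module.
type WrapTarget = Record<string, unknown>;

function detectTarget(
  target: WrapTarget
): 'openai' | 'vercel-ai' | 'claude-agent' | 'unknown' {
  if (typeof (target as any)?.chat?.completions?.create === 'function') {
    return 'openai';       // OpenAI clients expose chat.completions.create
  }
  if (typeof (target as any)?.generateText === 'function') {
    return 'vercel-ai';    // the `ai` module exports generateText
  }
  if (typeof (target as any)?.query === 'function') {
    return 'claude-agent'; // the Claude Agent SDK module exports query
  }
  return 'unknown';
}
```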

Using with Prompts

The integrations automatically extract ZeroEval metadata from prompts created with ze.prompt():
import { OpenAI } from 'openai';
import * as ze from 'zeroeval';

ze.init();
const openai = ze.wrap(new OpenAI());

// Create a version-tracked prompt
const systemPrompt = await ze.prompt({
  name: 'customer-support',
  content: 'You are a helpful customer support agent for {{company}}.',
  variables: { company: 'TechCorp' }
});

// The integration automatically:
// 1. Extracts the prompt metadata
// 2. Links the completion to the prompt version
// 3. Patches the model if one is bound to the prompt version
const response = await openai.chat.completions.create({
  model: 'gpt-4',  // May be replaced by bound model
  messages: [
    { role: 'system', content: systemPrompt },
    { role: 'user', content: 'I need help with my order' }
  ]
});
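For intuition, the {{company}} substitution above can be approximated with a small interpolation helper. This is an illustration of the templating behavior, not part of the ZeroEval SDK; ze.prompt() additionally handles versioning and metadata that a plain string helper cannot.

```typescript
// Illustrative {{variable}} interpolation, approximating what ze.prompt()
// does with its `variables` option. Unknown placeholders are left intact.
function renderTemplate(
  content: string,
  variables: Record<string, string>
): string {
  return content.replace(/\{\{(\w+)\}\}/g, (match: string, name: string) =>
    name in variables ? variables[name] : match
  );
}
```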
Need help? Check out our GitHub examples or reach out on Discord.