
Installation

npm install zeroeval

Basic Setup

Replace hardcoded prompt strings with ze.prompt(). Your existing text becomes the fallback content that’s used until an optimized version is available.
import * as ze from "zeroeval";
import { OpenAI } from "openai";

ze.init();
const client = ze.wrap(new OpenAI());

const systemPrompt = await ze.prompt({
  name: "support-bot",
  content: "You are a helpful customer support agent for {{company}}.",
  variables: { company: "TechCorp" },
});

const response = await client.chat.completions.create({
  model: "gpt-4",
  messages: [
    { role: "system", content: systemPrompt },
    { role: "user", content: "How do I reset my password?" },
  ],
});
Every call to ze.prompt() is tracked, versioned, and linked to the completions it produces. You’ll see production traces at ZeroEval → Prompts.
When you provide content, ZeroEval automatically uses the latest optimized version from your dashboard if one exists. The content parameter serves as a fallback for when no optimized versions are available yet.
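To illustrate how the variables parameter fills in {{var}} tokens, here is a minimal sketch of template interpolation. The helper name is hypothetical; this is not the SDK's actual implementation.

```typescript
// Hypothetical sketch of {{var}} interpolation for the `variables` parameter.
// Unknown tokens are left intact rather than replaced with an empty string.
function interpolate(template: string, variables: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match: string, name: string) =>
    name in variables ? variables[name] : match
  );
}

const rendered = interpolate(
  "You are a helpful customer support agent for {{company}}.",
  { company: "TechCorp" }
);
// rendered === "You are a helpful customer support agent for TechCorp."
```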

Version Control

Auto-optimization (default)

const prompt = await ze.prompt({
  name: "customer-support",
  content: "You are a helpful assistant.",
});
Uses the latest optimized version if one exists, otherwise falls back to the provided content.

Explicit mode

const prompt = await ze.prompt({
  name: "customer-support",
  from: "explicit",
  content: "You are a helpful assistant.",
});
Always uses the provided content. Useful for debugging or A/B testing a specific version.

Latest mode

const prompt = await ze.prompt({
  name: "customer-support",
  from: "latest",
});
Requires an optimized version to exist. Fails with PromptRequestError if none is found.

Pin to a specific version

const prompt = await ze.prompt({
  name: "customer-support",
  from: "a1b2c3d4...", // 64-char SHA-256 hash
});
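Taken together, the four modes above resolve roughly as follows. This is a hypothetical sketch of the selection logic, not the SDK source; latestOptimized and versions stand in for dashboard state that the real SDK fetches server-side.

```typescript
// Hypothetical sketch of how the `from` modes might resolve.
type ResolveArgs = {
  content?: string;          // fallback or explicit content
  from?: string;             // "latest", "explicit", or a 64-char SHA-256 hash
  latestOptimized?: string;  // latest optimized version, if one exists
  versions?: Record<string, string>; // hash -> pinned version content
};

function resolvePrompt({ content, from, latestOptimized, versions = {} }: ResolveArgs): string {
  if (from === "explicit") {
    if (content === undefined) throw new Error("explicit mode requires content");
    return content; // always use the provided content
  }
  if (from === "latest") {
    if (latestOptimized === undefined) throw new Error("PromptRequestError: no versions exist");
    return latestOptimized;
  }
  if (from !== undefined) {
    const pinned = versions[from]; // pin to a specific 64-char hash
    if (pinned === undefined) throw new Error("PromptNotFoundError: unknown hash");
    return pinned;
  }
  // Auto-optimization (default): the latest optimized version wins,
  // otherwise fall back to the provided content.
  if (latestOptimized !== undefined) return latestOptimized;
  if (content !== undefined) return content;
  throw new Error("provide content or from");
}
```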

Parameters

| Parameter | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| name | string | Yes | | Task name for this prompt |
| content | string | No | undefined | Prompt content (fallback or explicit) |
| from | string | No | undefined | "latest", "explicit", or a 64-char SHA-256 hash |
| variables | Record<string, string> | No | undefined | Template variables for {{var}} tokens |

Return value

Returns Promise<string> — a decorated prompt string with metadata that integrations use to link completions to prompt versions and auto-patch models.

Errors

| Error | When |
| --- | --- |
| Error | Both content and from provided (except from: "explicit"), or neither |
| PromptRequestError | from: "latest" but no versions exist |
| PromptNotFoundError | from is a hash that doesn't exist |
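The first row's argument check can be sketched as a small guard. This is an illustrative reconstruction of the rule in the table, with a hypothetical function name; it is not the SDK's internal code.

```typescript
// Hypothetical sketch of the content/from argument check described above.
function validatePromptArgs(content?: string, from?: string): void {
  const hasContent = content !== undefined;
  const hasFrom = from !== undefined;
  if (!hasContent && !hasFrom) {
    throw new Error("Error: provide content or from");
  }
  // Both are allowed only when from is "explicit".
  if (hasContent && hasFrom && from !== "explicit") {
    throw new Error('Error: content conflicts with from unless from is "explicit"');
  }
}
```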

Model Deployments

When you deploy a model to a prompt version in the dashboard, the SDK automatically patches the model parameter in your LLM calls:
const systemPrompt = await ze.prompt({
  name: "support-bot",
  content: "You are a helpful customer support agent.",
});

const response = await client.chat.completions.create({
  model: "gpt-4", // Gets replaced with the deployed model
  messages: [
    { role: "system", content: systemPrompt },
    { role: "user", content: "Hello" },
  ],
});
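Conceptually, the patching step is a thin rewrite of the request parameters before the call goes out. The sketch below is hypothetical (the real SDK does this inside ze.wrap()); deployedModels stands in for the dashboard's deployment mapping.

```typescript
// Hypothetical sketch of model patching as performed inside ze.wrap().
type ChatParams = { model: string; messages: unknown[] };

function patchModel(
  params: ChatParams,
  deployedModels: Record<string, string>, // prompt name -> deployed model
  promptName: string
): ChatParams {
  const deployed = deployedModels[promptName];
  // If a model is deployed for this prompt, it overrides params.model;
  // otherwise the original request is passed through unchanged.
  return deployed ? { ...params, model: deployed } : params;
}
```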

Sending Feedback

Attach feedback to completions to power prompt optimization:
await ze.sendFeedback({
  promptSlug: "support-bot",
  completionId: response.id,
  thumbsUp: true,
  reason: "Clear and concise response",
});
| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| promptSlug | string | Yes | Prompt name (same as used in ze.prompt()) |
| completionId | string | Yes | UUID of the completion |
| thumbsUp | boolean | Yes | Positive or negative feedback |
| reason | string | No | Explanation of the feedback |
| expectedOutput | string | No | What the output should have been |
| metadata | Record<string, unknown> | No | Additional metadata |
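A negative-feedback payload that exercises the optional fields from the table might look like the following. The interface is an illustrative typing of the parameters above, and the completionId is an example UUID, not a real one.

```typescript
// Illustrative typing of the sendFeedback parameters listed above.
interface FeedbackPayload {
  promptSlug: string;
  completionId: string;
  thumbsUp: boolean;
  reason?: string;
  expectedOutput?: string;
  metadata?: Record<string, unknown>;
}

// Negative feedback with the optional fields filled in.
const feedback: FeedbackPayload = {
  promptSlug: "support-bot",
  completionId: "123e4567-e89b-12d3-a456-426614174000", // example UUID
  thumbsUp: false,
  reason: "Response was too vague",
  expectedOutput: "A step-by-step password reset guide",
  metadata: { channel: "email" },
};
```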