`ze.prompt` creates or fetches versioned prompts from the Prompt Library and returns decorated content for downstream LLM calls.
Parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `name` | string | yes | — | Task name associated with the prompt in the library |
| `content` | string | no | None | Raw prompt content; ensures/creates a version by content |
| `from_` | string | no | None | Either `"latest"` or a 64-char lowercase SHA-256 content hash to fetch a specific version |
| `from` | string (alias) | no | None | Keyword-only alias for `from_` |
| `variables` | dict | no | None | Template variables to render `{{variable}}` tokens in the content |
- Exactly one of `content` or `from_`/`from` must be provided. `from="latest"` fetches the latest version bound to the task; otherwise `from_` must be a 64-char lowercase hex SHA-256 hash.
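The argument rule above can be sketched as a small validator (a minimal illustration of the documented constraints; the SDK's actual checks may differ):

```python
import re

# 64-char lowercase hex SHA-256, per the parameter table.
_SHA256_RE = re.compile(r"^[0-9a-f]{64}$")

def validate_args(content=None, from_=None):
    """Enforce the exactly-one-of rule for ze.prompt-style arguments."""
    if (content is None) == (from_ is None):
        # Both provided, or neither provided.
        raise ValueError("provide exactly one of content or from_")
    if from_ is not None and from_ != "latest" and not _SHA256_RE.fullmatch(from_):
        raise ValueError("from_ must be 'latest' or a 64-char lowercase SHA-256 hex hash")
```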
Behavior
- `content` provided: computes a normalized SHA-256 hash of the content, ensures a prompt version exists for `name`, and returns decorated content.
- `from="latest"`: fetches the latest version for `name` and returns decorated content.
- `from=<hash>`: fetches the version matching the content hash for `name` and returns decorated content.

The decorated content carries metadata: `task`, `prompt_slug`, `prompt_version`, `prompt_version_id`, `variables`, and (when created by content) `content_hash`. When `prompt_version_id` is present, the SDK automatically patches the model parameter of downstream calls to the model bound to that prompt version.
Return Value
`str`: Decorated prompt content, ready to pass to LLM clients.
Errors
| Error | When |
|---|---|
| `ValueError` | Both `content` and `from_` provided, or neither; invalid `from_` (not `"latest"` or 64-char hex) |
| `PromptRequestError` | `from_="latest"` but no versions exist for `name` |
| `PromptNotFoundError` | `from_` is a hash that does not exist for `name` |
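The fetch-path errors in the table can be illustrated with a small resolver. The exception classes below are local stand-ins named after the SDK's error types, and the in-memory `store` shape is invented for the sketch:

```python
import re

# Local stand-ins for the SDK's exception types (illustrative only).
class PromptRequestError(Exception):
    pass

class PromptNotFoundError(Exception):
    pass

_SHA256_RE = re.compile(r"^[0-9a-f]{64}$")

def resolve(name: str, store: dict, from_: str) -> str:
    """Sketch of the fetch paths from the Errors table.

    `store` maps task name -> list of (hash, content) versions, oldest first.
    """
    if from_ == "latest":
        versions = store.get(name) or []
        if not versions:
            raise PromptRequestError(f"no versions exist for {name!r}")
        return versions[-1][1]
    if not _SHA256_RE.fullmatch(from_):
        raise ValueError("from_ must be 'latest' or a 64-char lowercase hex hash")
    for version_hash, content in store.get(name, []):
        if version_hash == from_:
            return content
    raise PromptNotFoundError(f"hash {from_} does not exist for {name!r}")
```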