Parameters and configuration for ze.prompt
`ze.prompt` creates or fetches versioned prompts from the Prompt Library and returns decorated content for downstream LLM calls.
Parameter | Type | Required | Default | Description |
---|---|---|---|---|
name | string | yes | — | Task name associated with the prompt in the library |
content | string | no | None | Raw prompt content to ensure/create a version by content |
from_ | string | no | None | Either "latest" or a 64‑char lowercase SHA‑256 content hash to fetch a specific version |
from | string (alias) | no | None | Alias for from_ (keyword‑only) |
variables | dict | no | None | Template variables to render {{variable}} tokens in content |
Exactly one of `content` or `from_`/`from` must be provided. `from="latest"` fetches the latest version bound to the task; otherwise `from_` must be a 64‑char lowercase hex SHA‑256 hash.

Behavior by argument:

- `content`: Ensures a version exists by content for `name` (creating it if needed), and returns decorated content.
- `from_="latest"`: Fetches the latest version for `name` and returns decorated content.
- `from_=<hash>`: Fetches by content hash for `name` and returns decorated content.

The returned content is decorated with metadata: `task`, `prompt_slug`, `prompt_version`, `prompt_version_id`, `variables`, and (when created by content) `content_hash`. When `prompt_version_id` is present, the SDK will automatically patch the `model` parameter to the model bound to that prompt version.
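The accepted forms of `from_` can be checked client-side before calling the SDK. This is a hedged sketch mirroring the documented rule ("latest" or a 64‑char lowercase hex SHA‑256 hash); `validate_from` is a hypothetical helper, not an SDK function:

```python
import re

# Matches a 64-character lowercase hexadecimal SHA-256 digest.
_HASH_RE = re.compile(r"[0-9a-f]{64}")

def validate_from(from_: str) -> str:
    """Return from_ unchanged if it is a valid selector, else raise.

    Hypothetical pre-flight check; ze.prompt performs its own
    validation and raises ValueError on invalid input.
    """
    if from_ == "latest" or _HASH_RE.fullmatch(from_):
        return from_
    raise ValueError(
        "from_ must be 'latest' or a 64-char lowercase hex SHA-256 hash"
    )
```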
Returns `str`: Decorated prompt content ready to pass into LLM clients.

Error | When |
---|---|
ValueError | Both content and from_ provided, or neither; invalid from_ (not "latest" or 64‑char hex) |
PromptRequestError | from_="latest" but no versions exist for name |
PromptNotFoundError | from_ is a hash that does not exist for name |
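A version created by content can later be re-fetched by its hash. Assuming the content hash is SHA‑256 over the UTF‑8 bytes of the raw prompt content (an assumption; the library's exact canonicalization is not documented here), the hash could be derived like this:

```python
import hashlib

def content_hash(content: str) -> str:
    # Assumption: SHA-256 over the UTF-8 bytes of the raw, unrendered
    # content; the Prompt Library may canonicalize differently.
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

# 64-char lowercase hex string of the form expected by from_
h = content_hash("Summarize {{doc}}.")
```

If the hash scheme matches, a version created with `ze.prompt(name=..., content=...)` could then be fetched again with `ze.prompt(name=..., from_=h)`; a mismatched or unknown hash would raise `PromptNotFoundError` per the table above.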