

defineTool

defineTool declares an action the agent can take. The LLM reads the description to decide when to call it, and the parameters schema to decide with what arguments. Your handler runs in the function’s Lambda and returns the result.
import { defineTool } from "@zavu/functions"

defineTool({
  name: "check_availability",
  description:
    "Check available reservation slots for a date and party size. " +
    "Call before create_reservation.",
  parameters: {
    type: "object",
    properties: {
      date: {
        type: "string",
        description: "YYYY-MM-DD or 'today', 'tomorrow', 'friday'.",
      },
      partySize: {
        type: "number",
        description: "Number of people.",
      },
    },
    required: ["date", "partySize"],
  },
  handler: async ({ date, partySize }, ctx) => {
    // Look up your DB / POS / external API.
    return {
      available: true,
      slots: ["19:00", "21:00"],
      date,
      partySize,
    }
  },
})

How the agent decides

The agent’s LLM picks tools based on three signals:
  1. description — primary signal. Write it as the answer to “when should the model call this?”. Be specific about preconditions.
  2. parameters — JSON Schema. Each property’s description helps the LLM fill in the right value.
  3. name — secondary signal. Use snake_case action verbs: check_availability, create_reservation, cancel_reservation.
Bad description:
description: "Booking stuff"
Good description:
description:
  "Book a confirmed reservation. ONLY call AFTER check_availability has " +
  "returned the requested slot. Requires customer name."

Required fields

| Field | Type | Notes |
|---|---|---|
| name | string | Unique within the agent. snake_case. |
| description | string | Tells the LLM when to call this. ≤ 500 chars in practice. |
| parameters | JSON Schema (object) | { type: "object", properties, required }. |
| handler | async function | (args, ctx) => result. Return any JSON-serializable value. |

Optional fields

| Field | Default | Description |
|---|---|---|
| agent | inferred | Only needed if your file defines multiple defineAgents. |
| enabled | true | Set false to deploy the tool but keep it disabled. |

Parameters schema

Use JSON Schema. The agent’s LLM is trained to fill in any standard schema — keep it simple.

Primitive types

parameters: {
  type: "object",
  properties: {
    orderId:    { type: "string",  description: "Order ID like ORD-12345." },
    quantity:   { type: "number",  description: "How many units." },
    expedited:  { type: "boolean", description: "Pay extra for next-day shipping." },
    tags:       { type: "array", items: { type: "string" } },
  },
  required: ["orderId"],
}

Enums

{
  type: "string",
  enum: ["pending", "shipped", "delivered", "cancelled"],
  description: "Filter by status.",
}

Nested objects

{
  type: "object",
  properties: {
    address: {
      type: "object",
      properties: {
        street: { type: "string" },
        city:   { type: "string" },
        zip:    { type: "string" },
      },
      required: ["street", "city", "zip"],
    },
  },
}

Free-form metadata

When you want to accept arbitrary key/value pairs:
{
  type: "object",
  description: "Extra metadata, arbitrary keys.",
  additionalProperties: { type: "string" },
}
The LLM follows the schema strictly. If a field is required, the LLM will ask the user follow-up questions until it has the value. Use this — it removes a lot of validation from your handler.

Handler signature

handler: async (args, ctx) => {
  // args: typed by your schema (use `as Type` for safety)
  // ctx: invocation context — see below
  return { /* anything JSON-serializable */ }
}
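Since args arrives as parsed JSON, one way to get type safety is a cast through an interface that mirrors your schema. A minimal sketch (the interface name and shape are illustrative, not part of the API):

```typescript
// Sketch: type handler args by casting through an interface.
// CheckAvailabilityArgs mirrors the JSON Schema; keeping the two in sync
// is on you — the cast performs no runtime validation.
interface CheckAvailabilityArgs {
  date: string
  partySize: number
}

const handler = async (args: unknown) => {
  const { date, partySize } = args as CheckAvailabilityArgs
  return { received: true, date, partySize }
}
```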

ctx properties

| Property | Type | When set |
|---|---|---|
| projectId | string | Always. |
| functionId | string | Always. |
| slug | string | Always. |
| messageId | string | When invoked by the agent (not on manual HTTP call). |
| contactPhone | string | The customer’s phone number (E.164). |
| sessionId | string \| null | If the agent is in a flow session. |
| log | function | Equivalent to console.log but tagged for the dashboard logs panel. |

Return value

The handler’s return value goes back to the LLM as the tool result. The LLM includes it in the next message it composes for the customer.
return {
  confirmed: true,
  reservationId: "RES-LXXY",
  summary: "Marco, party of 2, friday at 21:00",
}
Becomes (paraphrased) “Listo Marco, reserva RES-LXXY confirmada para 2 personas el viernes a las 21:00.” (“Done, Marco: reservation RES-LXXY confirmed for 2 people on Friday at 21:00.”)
The LLM reads field names. Return semantically named fields (confirmed, summary, eta_minutes) rather than IDs and codes only. The natural-language answer it generates is better when the structure is self-documenting.
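As a contrast, compare an opaque return shape with a self-documenting one (both objects are hypothetical examples, not API shapes):

```typescript
// Opaque: the LLM has to guess what c and r mean.
const opaque = { c: 1, r: "RES-LXXY" }

// Self-documenting: every field name tells the LLM what to say.
const semantic = {
  confirmed: true,
  reservationId: "RES-LXXY",
  summary: "Marco, party of 2, friday at 21:00",
}
```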

Error handling

Throwing from a handler sends success: false to the LLM, with the exception message as the error. The LLM usually translates this into “Sorry, that didn’t work because…” for the user.
defineTool({
  name: "cancel_reservation",
  ...,
  handler: async ({ reservationId }) => {
    const r = await db.reservations.findById(reservationId)
    if (!r) {
      throw new Error("Reservation not found. Double-check the ID.")
    }
    if (r.status === "completed") {
      throw new Error("That reservation already happened — nothing to cancel.")
    }
    await db.reservations.cancel(reservationId)
    return { cancelled: true }
  },
})
Prefer returning a structured error over throwing when the user can recover:
return {
  error: "slot_taken",
  message: "21:00 was just booked by someone else.",
  alternatives: ["19:30", "22:30"],
}
The LLM sees the alternatives and offers them naturally.
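The two error styles can be sketched side by side in one function (bookSlot and its arguments are hypothetical, not part of the API):

```typescript
// Sketch: throw for unrecoverable input, return a structured error when
// the user can pick an alternative, and a plain success object otherwise.
async function bookSlot(requested: string, openSlots: string[]) {
  if (!/^\d{2}:\d{2}$/.test(requested)) {
    // Unrecoverable: the LLM should apologize and re-ask.
    throw new Error("Time must look like HH:MM, e.g. 21:00.")
  }
  if (!openSlots.includes(requested)) {
    // Recoverable: hand the LLM alternatives to offer naturally.
    return {
      error: "slot_taken",
      message: `${requested} is not available.`,
      alternatives: openSlots.slice(0, 3),
    }
  }
  return { confirmed: true, slot: requested }
}
```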

Calling Zavu’s own API from a tool

Every function has a ZAVU_API_KEY env var injected automatically. Use it to call your Zavu account:
import Zavudev from "@zavudev/sdk"
import { defineTool } from "@zavu/functions"

const zavu = new Zavudev({
  apiKey: process.env.ZAVU_API_KEY!,
  baseURL: process.env.ZAVU_API_BASE_URL,
})

defineTool({
  name: "send_thankyou",
  description: "Send a follow-up WhatsApp after a reservation is created.",
  parameters: {
    type: "object",
    properties: {
      phone: { type: "string" },
      customerName: { type: "string" },
    },
    required: ["phone", "customerName"],
  },
  handler: async ({ phone, customerName }) => {
    await zavu.messages.send({
      to: phone,
      channel: "whatsapp",
      text: `Gracias por reservar con nosotros, ${customerName}! 🍕`,
    })
    return { sent: true }
  },
})
The auto-injected key has the messages:send, messages:read, and contacts:read scopes. For other operations, create a project-scoped API key in the dashboard and inject it as a secret.
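A defensive way to read such an injected secret (requireSecret and the PROJECT_API_KEY name are illustrative, not built-ins):

```typescript
// Sketch: fail fast at cold start if a dashboard-injected secret is
// missing, instead of failing mid-conversation.
function requireSecret(name: string): string {
  const value = process.env[name]
  if (!value) throw new Error(`Missing secret: ${name}`)
  return value
}

// e.g. const projectKey = requireSecret("PROJECT_API_KEY")
```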

Calling external services

Standard fetch. The Lambda has unrestricted egress (today).
defineTool({
  name: "lookup_order",
  description: "Get the latest status of an order from our POS.",
  parameters: { type: "object", properties: { orderId: { type: "string" } }, required: ["orderId"] },
  handler: async ({ orderId }) => {
    const res = await fetch(`https://pos.example.com/orders/${orderId}`, {
      headers: { authorization: `Bearer ${process.env.POS_API_KEY}` },
    })
    if (!res.ok) throw new Error(`POS returned ${res.status}`)
    return await res.json()
  },
})

Adding npm dependencies

Edit package.json:
{
  "dependencies": {
    "openai": "^4.20.0",
    "zod": "^3.22.0"
  }
}
zavu deploy installs them server-side during the bundle step (no local npm install required). Lambda gets a self-contained zip with the deps.
Keep dependencies tight. Each unused package adds cold-start latency and zip size. The runtime layer already ships @zavudev/sdk, hono, dayjs, zod, and a few others — declaring them again is unnecessary.

Testing locally

zavu fn invoke --event message.inbound \
  --from +14155551234 --text "menú vegano"
This runs your defineFunction handler with a synthetic message event, no AWS round-trip. Tool calls invoked by the LLM aren’t simulated in local invoke — for that, use the deployed function and zavu fn logs --tail.

Common patterns

Don’t return huge arrays — the LLM has limited context. Filter and trim server-side:
handler: async ({ filter }) => {
  const all = await db.menu.findAll()
  const filtered = filter === "vegan" ? all.filter(m => m.vegan) : all
  return {
    items: filtered.slice(0, 20).map(m => ({
      name: m.name, price: `$${m.price}`, vegan: m.vegan
    })),
    total: filtered.length,
  }
}
Lambda has a 30s budget by default (configurable up to 15min). For longer work, return immediately with a job ID and let the LLM follow up:
defineTool({
  name: "start_export",
  handler: async () => {
    const jobId = await queue.enqueue({ kind: "export" })
    return {
      started: true,
      jobId,
      message: "Started. Use check_export(jobId) to see progress."
    }
  },
})
Tools can be called multiple times (LLM retries, user repeats request). Use the tool’s natural keys to dedupe:
handler: async ({ orderId, action }) => {
  const result = await db.actions.upsert({
    key: `${orderId}:${action}`,
    runOnce: () => doTheThing(orderId),
  })
  return result
}
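The core dedupe idea can be sketched in memory (runOnce here is illustrative; a real handler needs a durable store, since Lambda instances don’t share memory):

```typescript
// Sketch: run fn at most once per key, returning the cached result on
// repeat calls. Map state is per-instance only — not durable.
const completed = new Map<string, unknown>()

async function runOnce<T>(key: string, fn: () => Promise<T>): Promise<T> {
  if (completed.has(key)) return completed.get(key) as T
  const result = await fn()
  completed.set(key, result)
  return result
}
```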
For destructive operations, return a confirmRequired payload first:
defineTool({
  name: "cancel_reservation",
  handler: async ({ reservationId, confirm }) => {
    if (!confirm) {
      const r = await db.reservations.findById(reservationId)
      return {
        confirmRequired: true,
        summary: `Cancel reservation for ${r.customerName} on ${r.date}?`,
      }
    }
    await db.reservations.cancel(reservationId)
    return { cancelled: true }
  },
})
The LLM will summarize the cancellation and wait for the user to confirm before calling again with confirm: true.

Next

Secrets

Inject DB credentials and API keys.

Restaurant example

Full booking agent walked through.