Use case
A support ticket comes in containing an email, a phone number, and a complaint that spans billing and a product bug. A triage agent classifies it and hands off to specialists, and each specialist runs tools in the same sandbox, so filesystem state persists across the handoff.

Template
`base` is the smallest template and boots the fastest. All this recipe
needs is a shell and `/workspace`.
Run it
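A plausible invocation, assuming the script path listed under "Full source" below; the exact dependencies and any required API keys are not shown here.

```shell
# Assumed entry point for the recipe; install its dependencies first.
python cookbook/openai_agents_customer_support.py
```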
Security policy
`rehydrate_response=True` is the key knob for support workflows.
Walk-through:
- The user ticket contains `alice.johnson@example.com` and `+1-415-555-0182`.
- The triage agent needs to send the ticket to the LLM. The outbound request is scanned: the email and phone become `REDACTED_EMAIL_ADDRESS_1` and `REDACTED_PHONE_NUMBER_1`.
- The LLM reasons about the redacted ticket. Its response echoes the tokens back in the reply draft.
- The edge proxy rehydrates the tokens before the sandbox receives the response. The agent's drafted reply now contains the real email and phone.
- The specialist writes the final reply to `/workspace/reply.txt` with the real PII: correct for the customer, while the LLM never saw the real values.
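The walk-through above can be sketched as a minimal redact/rehydrate round trip. This is a stand-in for the edge proxy, not the product's implementation: in the real recipe the scanning happens outside the sandbox, and the regexes and token names here are simplified assumptions.

```python
import re

# Simplified PII patterns; a real scanner uses far more robust detection.
PATTERNS = {
    "REDACTED_EMAIL_ADDRESS": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "REDACTED_PHONE_NUMBER": re.compile(r"\+\d[\d\- ]{7,}\d"),
}

def redact(text: str) -> tuple[str, dict[str, str]]:
    """Replace PII with numbered tokens and remember the mapping."""
    mapping: dict[str, str] = {}
    for label, pattern in PATTERNS.items():
        count = 0
        def replace(match: re.Match) -> str:
            nonlocal count
            count += 1
            token = f"{label}_{count}"
            mapping[token] = match.group(0)
            return token
        text = pattern.sub(replace, text)
    return text, mapping

def rehydrate(text: str, mapping: dict[str, str]) -> str:
    """Swap tokens back for the real values before the agent sees the reply."""
    for token, value in mapping.items():
        text = text.replace(token, value)
    return text

ticket = "Contact alice.johnson@example.com or +1-415-555-0182 about the bill."
redacted, mapping = redact(ticket)
# Only `redacted` ever reaches the LLM; its reply echoes the tokens back.
reply = "Dear customer, we will email REDACTED_EMAIL_ADDRESS_1 shortly."
print(rehydrate(reply, mapping))
```

The LLM operates entirely on tokens; the mapping lives at the proxy, so the rehydrated reply is correct for the customer without the model ever seeing real values.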
Multi-agent handoff
All agents share one sandbox session. Triage writes
`/workspace/ticket.txt`; billing reads it; billing writes
`/workspace/billing.log`; technical appends to
`/workspace/reply.txt`. That would not work if each agent got its
own sandbox.
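The file flow above can be sketched with plain Python. A local temp directory stands in for the shared `/workspace`; in the real recipe each function would be a tool call made by a different agent inside the same sandbox session.

```python
import tempfile
from pathlib import Path

# Stand-in for /workspace: one directory shared by all three agents.
workspace = Path(tempfile.mkdtemp())

def triage(ticket: str) -> None:
    (workspace / "ticket.txt").write_text(ticket)

def billing() -> None:
    ticket = (workspace / "ticket.txt").read_text()  # reads triage's file
    (workspace / "billing.log").write_text(f"reviewed: {ticket}\n")
    (workspace / "reply.txt").write_text("Billing: refund issued.\n")

def technical() -> None:
    # Appends to the reply the billing agent started.
    with (workspace / "reply.txt").open("a") as f:
        f.write("Technical: bug fix shipping in the next release.\n")

triage("Double charge + app crash")
billing()
technical()
print((workspace / "reply.txt").read_text())
```

If each agent were given a fresh directory instead, `billing()` would fail to find `ticket.txt` and `technical()` would overwrite rather than extend the reply; that is the failure mode the shared session avoids.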
Env isolation
Each agent sees its own environment variables and can read them with
`printenv`, so tools branch on tier (gold/silver/bronze) or agent
version without the dispatcher needing to pass them in the prompt.
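A minimal sketch of that branching, from the tool's side. The variable name `CUSTOMER_TIER` is an assumption for illustration; the tier names come from the text above.

```python
import os

# Hypothetical per-agent variable injected into the sandbox environment,
# rather than passed through the prompt by the dispatcher.
tier = os.environ.get("CUSTOMER_TIER", "bronze")

# Response-time targets per tier (illustrative values).
SLA_HOURS = {"gold": 4, "silver": 24, "bronze": 72}
hours = SLA_HOURS.get(tier, SLA_HOURS["bronze"])
print(f"tier={tier}: respond within {hours}h")
```

Because the value lives in the environment, swapping a customer from bronze to gold changes the agent's behavior without any change to the prompt.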
Full source
See `cookbook/openai_agents_customer_support.py` in the repo.