OpenAI Responses + Function Calling
Adapter for OpenAI Responses (and the older Chat Completions tool path). Each function invocation is scope-checked against the active OC Agent delegation, then emits a signed agent-action envelope with the canonical input hash.
What it does
OpenAI's function-calling (Chat Completions) and Responses APIs both produce tool calls that name a function and carry JSON-encoded arguments. The adapter sits between the OpenAI SDK and your function handlers:
- Pre-call: parse the function name and arguments, derive the exercised scope, and verify that it is admissible under the active delegation, refusing the call if it is not.
- Canonicalize the call, hash it, and BIP-322-sign it as the agent address; emit an agent-action envelope (kind 30084, oc-agent-act d-tag).
- Return the function result to OpenAI's tool-result block, with the envelope id available for downstream replay.
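The pre-call check and canonical input hash above can be sketched as follows. This is a minimal illustration, not the package's implementation: the `Delegation` and `ToolCall` shapes, the `fn:` scope naming, and SHA-256 as the hash are all assumptions for the sketch (the real scope logic and canonicalization live in @orangecheck/agent-core).

```typescript
import { createHash } from "node:crypto";

// Hypothetical shapes; the real types come from @orangecheck/agent-core.
interface Delegation { allowedScopes: string[] }
interface ToolCall { name: string; arguments: Record<string, unknown> }

// Illustrative scope derivation: one scope per function name.
function exercisedScope(call: ToolCall): string {
  return `fn:${call.name}`;
}

// Pre-call check: refuse the call if its scope is not admissible
// under the active delegation.
function checkScope(call: ToolCall, delegation: Delegation): void {
  const scope = exercisedScope(call);
  if (!delegation.allowedScopes.includes(scope)) {
    throw new Error(`scope ${scope} not admissible under delegation`);
  }
}

// Canonicalize with a stable key order, so the hash does not depend on
// the order in which OpenAI happens to emit argument fields.
function canonicalize(value: unknown): string {
  if (Array.isArray(value)) return `[${value.map(canonicalize).join(",")}]`;
  if (value !== null && typeof value === "object") {
    const entries = Object.entries(value as Record<string, unknown>)
      .sort(([a], [b]) => (a < b ? -1 : 1))
      .map(([k, v]) => `${JSON.stringify(k)}:${canonicalize(v)}`);
    return `{${entries.join(",")}}`;
  }
  return JSON.stringify(value);
}

// The canonical input hash that goes into the signed envelope.
function inputHash(call: ToolCall): string {
  return createHash("sha256")
    .update(canonicalize({ args: call.arguments, name: call.name }))
    .digest("hex");
}
```

Canonicalizing before hashing is what makes the envelope replayable: two auditors who reconstruct the same call from the transcript arrive at the same hash regardless of field order.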
Compatibility
Single adapter handles both code paths:
- Responses API — newer, streaming-first, structured tool deltas. The adapter listens to the response stream and emits envelopes inline.
- Chat Completions tool calls — legacy but widely deployed. Adapter wraps the synchronous response.
Both paths target the same agent-action envelope shape, so audit exports do not branch on which OpenAI surface produced them.
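The no-branching claim rests on normalizing both surfaces to one call shape before envelope emission. A minimal sketch, assuming the standard payload layouts of the two OpenAI surfaces (Chat Completions nests the call under a `function` object with arguments as a JSON string; the Responses API emits `function_call` output items with name and arguments at the top level); the `NormalizedCall` shape and function names are illustrative:

```typescript
// One shape for downstream envelope code, regardless of source API.
interface NormalizedCall { name: string; arguments: Record<string, unknown> }

// Chat Completions: message.tool_calls[] entries nest the call under
// `function`, with arguments serialized as a JSON string.
function fromChatCompletions(tc: {
  function: { name: string; arguments: string };
}): NormalizedCall {
  return {
    name: tc.function.name,
    arguments: JSON.parse(tc.function.arguments),
  };
}

// Responses API: output items of type "function_call" carry name and
// the JSON-string arguments at the top level.
function fromResponses(item: {
  type: "function_call"; name: string; arguments: string;
}): NormalizedCall {
  return { name: item.name, arguments: JSON.parse(item.arguments) };
}
```

Everything after this point (scope check, hashing, signing, envelope emission) sees only `NormalizedCall`, which is why audit exports never need to record which OpenAI surface produced a call.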
Status
Adapter package: @orangecheck/agent-openai@0.2.0 is on npm; source lives in oc-packages/agent-openai (MIT). It ships stampFunctionCall, postActionToFleet, and invokeWithStampAndPost. Sub-scope checking and canonicalization come from @orangecheck/agent-core; this package is the OpenAI-specific glue.
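How the three exports compose can be sketched with local stubs. These are NOT the package's real signatures: the stub bodies, the `Envelope` shape, and the return type of `invokeWithStampAndPost` are assumptions made only to show the call order (stamp before execution, post after).

```typescript
// Hypothetical envelope shape; kind 30084 is the agent-action kind
// mentioned above, the rest is illustrative.
type Envelope = { id: string; kind: 30084 };

// Stub: the real stampFunctionCall canonicalizes, hashes, scope-checks,
// and BIP-322-signs. Here it just mints an id.
async function stampFunctionCall(name: string, _args: unknown): Promise<Envelope> {
  return { id: `env-${name}`, kind: 30084 };
}

// Stub: the real postActionToFleet publishes the envelope to Fleet.
async function postActionToFleet(_env: Envelope): Promise<void> {}

// Stamp first (an out-of-scope call is refused before the handler runs),
// then execute, then post, so every executed call has an envelope.
async function invokeWithStampAndPost<T>(
  name: string,
  args: unknown,
  handler: (args: unknown) => Promise<T>,
): Promise<{ result: T; envelopeId: string }> {
  const envelope = await stampFunctionCall(name, args);
  const result = await handler(args);
  await postActionToFleet(envelope);
  return { result, envelopeId: envelope.id };
}
```

The ordering is the point of the wrapper: stamping before execution means a refused scope check stops the handler from ever running, and posting after means the envelope records a call that actually happened.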
What Fleet adds on top: the managed signing pipeline, OC Stamp anchoring, Nostr publishing, audit-bundle assembly, and the operator dashboard.
Already using the package?
Fleet layers the managed pipeline described above on top of the open package. Design partners pair with an OrangeCheck engineer for the integration.