bea27964 feat: native streaming structured output for openai, grok, groq
Adds `structuredOutputStream` to `@tanstack/ai-openai`,
`@tanstack/ai-grok`, and `@tanstack/ai-groq`, mirroring the openrouter
reference: a single request with `stream: true` and
`response_format: json_schema` (Chat Completions for grok/groq) or
`text.format: json_schema` (Responses API for openai), no tools, raw
JSON deltas emitted as `TEXT_MESSAGE_CONTENT` events, and a terminal
`CUSTOM` `structured-output.complete` event carrying `{ object, raw }`.
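
The delta-then-terminal sequence can be sketched as a consumer loop. Event
type names come from this commit; the `StreamEvent` shape and the
`collectStructuredOutput` helper below are illustrative assumptions, not the
package's actual API:

```typescript
// Hypothetical event shapes, modeled on the names in this change.
type StreamEvent =
  | { type: "TEXT_MESSAGE_CONTENT"; delta: string }
  | {
      type: "CUSTOM";
      name: "structured-output.complete";
      data: { object: unknown; raw: string };
    }
  | { type: "RUN_ERROR"; code: "empty-response" | "parse-error" | "aborted" };

// Accumulate raw JSON deltas; the terminal CUSTOM event carries the
// authoritative parsed object. RUN_ERROR is terminal.
function collectStructuredOutput(
  events: StreamEvent[],
): { object: unknown; raw: string } {
  let raw = "";
  for (const ev of events) {
    if (ev.type === "TEXT_MESSAGE_CONTENT") raw += ev.delta;
    else if (ev.type === "CUSTOM") return ev.data;
    else throw new Error(ev.code);
  }
  // Upstream closed without a terminal event: finalize from what we have
  // so a truncated stream never hangs the consumer (parse may still throw).
  return { object: JSON.parse(raw), raw };
}
```

A real consumer would read these events off SSE rather than an array, but the accumulate-then-snap logic is the same.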
- Always-finalize on upstream close so truncated streams never hang
consumers
- Typed `RUN_ERROR` paths: `empty-response`, `parse-error`, `aborted`,
  plus mid-stream provider errors (terminal: no `RUN_FINISHED` follows)
- `transformNullsToUndefined` applied on parse for parity with the
non-streaming `structuredOutput`
- E2E feature-support matrix: openai/grok/groq join openrouter for
`structured-output-stream`; the existing parameterized spec now runs
against all four
- ts-react-chat example: `api.structured-output.ts` and the matching
page gain a provider selector (openai/grok/groq/openrouter) and a
Stream toggle that consumes SSE, renders deltas live, and snaps to
the parsed object on the terminal CUSTOM event
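
For reference, the null-to-undefined normalization mentioned above can be
sketched as a recursive walk. This is a minimal illustration of the behavior
the commit describes; the repo's actual `transformNullsToUndefined` may
differ in detail:

```typescript
// Sketch: recursively replace every null with undefined, preserving
// arrays and nested objects. Assumed shape, not the repo's exact helper.
function transformNullsToUndefined(value: unknown): unknown {
  if (value === null) return undefined;
  if (Array.isArray(value)) return value.map(transformNullsToUndefined);
  if (typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([k, v]) => [
        k,
        transformNullsToUndefined(v),
      ]),
    );
  }
  return value;
}
```

This matters because JSON has no `undefined`: providers emit `null` for omitted optional fields, while idiomatic TypeScript consumers expect `undefined`.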
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>