TanStack ai
Latest CI Pipeline Executions
[Succeeded] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
3c0092eb revert(ai): drop chat() messages type widening from this PR
Pulls the @tanstack/ai changes back out; the chat() messages-option widening to accept UIMessage[] is a separate concern from the stream() server-function feature this PR is about. Restores the example's `as any` cast with a comment, drops the @tanstack/ai minor bump from the changeset, and reverts chat/index.ts to its pre-PR state. Also bumps the example adapter to gpt-5.2.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
[Succeeded] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
3c0092eb Merge e773e88fcb2025871d3bbd3e28ff2dd0e1b10ba4 into ff338557d9ea54c960617f52c89c488140d60f85
[Succeeded] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
b821f2ed feat(ai-client,ai): narrow stream() callback to UIMessage[] and accept UIMessage[] in chat()
Drops the `as any` / `as Array<UIMessage>` casts previously needed when wiring useChat through a TanStack Start server function into chat(). The stream() factory now declares Array<UIMessage> (with a runtime assert matching the ChatClient invariant), and chat()'s messages option accepts UIMessage[] directly alongside ConstrainedModelMessage[]; the runtime already normalised both via convertMessagesToModelMessages.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
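The widening this commit describes can be sketched in isolation. Everything below is a hedged stand-in modelled on the commit message: `UIMessage`, `ConstrainedModelMessage`, and `convertMessagesToModelMessages` are simplified local types, not the actual @tanstack/ai definitions.

```typescript
// Simplified local stand-ins; the real @tanstack/ai types differ in detail.
interface UIMessage {
  id: string;
  role: "user" | "assistant";
  parts: Array<{ type: "text"; text: string }>;
}

interface ConstrainedModelMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

// Before the change, chat()'s messages option accepted only
// ConstrainedModelMessage[], forcing an `as any` cast when a server
// function handed over UIMessage[]. Widening the parameter to a union
// works because the runtime already normalises both shapes:
function convertMessagesToModelMessages(
  messages: Array<UIMessage | ConstrainedModelMessage>,
): Array<ConstrainedModelMessage> {
  return messages.map((m) =>
    "parts" in m
      ? { role: m.role, content: m.parts.map((p) => p.text).join("") }
      : m,
  );
}

const uiMessages: Array<UIMessage> = [
  { id: "1", role: "user", parts: [{ type: "text", text: "hi" }] },
];

// Accepted directly, no cast needed.
const normalised = convertMessagesToModelMessages(uiMessages);
// normalised[0] is { role: "user", content: "hi" }
```

The design point is that the union only widens the accepted input; the output type stays narrow, so downstream callers are unaffected.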
[Succeeded] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
b821f2ed Merge cb39b842aa3db6adf7d896b362391ab798d5daa3 into ff338557d9ea54c960617f52c89c488140d60f85
[Succeeded] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
87dba45b fix(ai-client): filter SSE non-payload fields and drop unterminated trailing buffer
Refine the SSE parser so the throw-on-parse-failure behavior doesn't regress on legitimate SSE traffic:
- parseSSEChunks now skips SSE comment lines (`:`) and non-data fields (`event:`, `id:`, `retry:`), which proxies and CDNs commonly inject as keepalives. Previously these would have flowed into JSON.parse and thrown, killing otherwise-healthy streams behind any infrastructure that injects SSE control lines.
- readStreamLines no longer yields the unterminated trailing buffer at stream end. A non-empty buffer means the connection was cut mid-line, so the content is partial by definition; yielding it would feed truncated JSON to the parser and surface a misleading RUN_ERROR for what is really a transport-layer issue. Warn and discard instead.
Bare-line JSON (legacy/raw mode) is still accepted to preserve the existing public behavior covered by the `should handle SSE format without data: prefix` test.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
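The two parsing rules this commit describes can be sketched as a self-contained pair of functions. The names mirror the commit message, but this is a minimal illustration of the stated behavior, not the actual @tanstack/ai-client implementation.

```typescript
// Split incoming transport chunks into complete lines; a non-empty
// buffer at stream end means the connection was cut mid-line, so the
// partial line is warned about and discarded, never parsed.
function readStreamLines(chunksIn: Array<string>): Array<string> {
  let buffer = "";
  const lines: Array<string> = [];
  for (const chunk of chunksIn) {
    buffer += chunk;
    const parts = buffer.split("\n");
    buffer = parts.pop() ?? ""; // last piece may be incomplete
    lines.push(...parts);
  }
  if (buffer !== "") {
    console.warn("discarding unterminated trailing buffer:", buffer);
  }
  return lines;
}

// Parse only payload lines: skip blank separators, SSE comments (`:`),
// and non-data fields (`event:`, `id:`, `retry:`). Both `data:` payloads
// and bare-line JSON (legacy/raw mode) are accepted.
function parseSSEChunks(lines: Array<string>): Array<unknown> {
  const chunks: Array<unknown> = [];
  for (const line of lines) {
    if (line === "") continue; // event separator
    if (line.startsWith(":")) continue; // comment / proxy keepalive
    if (/^(event|id|retry):/.test(line)) continue; // non-data SSE fields
    const payload = line.startsWith("data:") ? line.slice(5).trim() : line;
    chunks.push(JSON.parse(payload));
  }
  return chunks;
}

const parsed = parseSSEChunks(
  readStreamLines([
    'data: {"type":"TEXT_DELTA","delta":"hi"}\n: keepalive\n',
    '{"type":"RUN_FINISHED"}\n{"type":"TR', // stream cut mid-line
  ]),
);
// parsed holds only the TEXT_DELTA and RUN_FINISHED payloads; the
// keepalive comment is filtered and the truncated tail is discarded.
```

Without the trailing-buffer rule, the truncated `{"type":"TR` fragment would reach JSON.parse and throw, surfacing a misleading parse error for a transport-level disconnect.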
[Succeeded] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
ad3d414f fix(ai-client): use EventType enum in server-function stream tests
Type-check was failing on CI because the new server-function and RPC async-iterable test fixtures yielded raw string literals for chunk type, which don't satisfy the EventType enum required by StreamChunk. Switch to the enum and add the required threadId to RUN_FINISHED.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
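The failure mode this commit fixes is a general TypeScript rule: a raw string literal is not assignable to a string-enum type, even when the values match. A hedged sketch with local stand-in types (`EventType` and `StreamChunk` here are simplified, not the real @tanstack/ai definitions):

```typescript
// Simplified stand-ins modelled on the commit message.
enum EventType {
  TEXT_DELTA = "TEXT_DELTA",
  RUN_FINISHED = "RUN_FINISHED",
}

type StreamChunk =
  | { type: EventType.TEXT_DELTA; delta: string }
  | { type: EventType.RUN_FINISHED; threadId: string };

// A fixture yielding raw literals fails type-check:
//   yield { type: "RUN_FINISHED" };
// errors because "RUN_FINISHED" is not assignable to the enum, and the
// required threadId is missing. Yielding enum members fixes both:
async function* fixture(): AsyncGenerator<StreamChunk> {
  yield { type: EventType.TEXT_DELTA, delta: "hi" };
  yield { type: EventType.RUN_FINISHED, threadId: "thread-1" };
}
```

String enums are deliberately nominal in TypeScript, which is why the fixtures had to construct chunks from the enum rather than matching string values.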
[Succeeded] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
ad3d414f Merge e0d0b9c7249a48dc5ef5a927cde5b8f736e4bad2 into ff338557d9ea54c960617f52c89c488140d60f85
[Failed] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
9b0c7a60 Merge ca99cc91a091a40e1f7a20d82b26412304017eda into ff338557d9ea54c960617f52c89c488140d60f85
[Succeeded] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
9b0c7a60 fix: revert ai-fal package to upstream state
The previous merge commit accidentally included stale references in @tanstack/ai-fal that this PR shouldn't have touched.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
[Failed] claude/add-usechat-server-functions-pBsj5 (by autofix-ci...)
4a5aecb6 ci: apply automated fixes
[Failed] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
4a5aecb6 Merge fdbd45ae5bfe16203d8826150ba7a8852a999eab into ff338557d9ea54c960617f52c89c488140d60f85
[Succeeded] claude/add-usechat-server-functions-pBsj5 (by Tom Beckenham)
eb4bbfc5 docs(ai-client): reframe stream() server-function description
Lead with what stream() does (typed RPC into useChat), instead of calling a server function "just a fancy/async API endpoint." Same edits applied to the changeset and the stream() JSDoc.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>