
Create migration guide from AI SDK to TanStack#179

Merged
tannerlinsley merged 9 commits into main from
claude/migration-guide-versace-tanstack-NqpkX
Apr 20, 2026

Conversation

@tannerlinsley
Member

@tannerlinsley tannerlinsley commented Dec 23, 2025

Add comprehensive migration guide covering:

  • Package installation differences
  • Server-side API migration (streamText -> chat)
  • Client-side useChat hook differences
  • Isomorphic tool system migration
  • Provider adapter changes (OpenAI, Anthropic, Gemini)
  • Streaming response formats
  • Multimodal content handling
  • Type safety enhancements
  • Complete before/after code examples
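As a rough flavor of the server-side mapping the guide covers, here is a hypothetical option translation. The field names (`adapter`, `systemPrompts`, `modelOptions`) follow the discussion later in this thread, and the string `adapter` is a stand-in for a real adapter object like `openaiText('gpt-4o')`:

```typescript
// Hypothetical option shapes distilled from the mapping discussed in this PR;
// field names are assumptions, not the published API.
interface VercelStreamTextOptions {
  model: string;
  system?: string;
  messages: Array<{ role: string; content: string }>;
  maxOutputTokens?: number;
}

interface TanStackChatOptions {
  adapter: string; // stand-in; the real SDK takes e.g. openaiText('gpt-4o')
  systemPrompts?: string[]; // root-level array, not a prepended system message
  messages: Array<{ role: string; content: string }>;
  modelOptions?: { maxTokens?: number };
}

// Sketch of the translation the guide's mapping tables encode.
function translateOptions(v: VercelStreamTextOptions): TanStackChatOptions {
  return {
    adapter: v.model,
    systemPrompts: v.system !== undefined ? [v.system] : undefined,
    messages: v.messages,
    modelOptions:
      v.maxOutputTokens !== undefined
        ? { maxTokens: v.maxOutputTokens }
        : undefined,
  };
}
```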

🎯 Changes

✅ Checklist

  • I have followed the steps in the Contributing guide.
  • I have tested this code locally with pnpm run test:pr.

🚀 Release Impact

  • This change affects published code, and I have generated a changeset.
  • This change is docs/CI/dev-only (no release).

Summary by CodeRabbit

  • Documentation
    • Added a comprehensive migration guide for moving from Vercel AI SDK to TanStack AI, covering installation, API/config mappings, server & client migration patterns, streaming/structured outputs, tool/function-calling, middleware, observability, typed usage, and examples.
    • Guide is now listed under the Migration section in the docs navigation.

Add comprehensive migration guide covering:
- Package installation differences
- Server-side API migration (streamText -> chat)
- Client-side useChat hook differences
- Isomorphic tool system migration
- Provider adapter changes (OpenAI, Anthropic, Gemini)
- Streaming response formats
- Multimodal content handling
- Type safety enhancements
- Complete before/after code examples
@coderabbitai
Contributor

coderabbitai Bot commented Dec 23, 2025

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

Use the checkboxes below for quick actions:

  • ▶️ Resume reviews
  • 🔍 Trigger review

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 510da8a9-e95e-48da-b444-3c4922b6c30a

📥 Commits

Reviewing files that changed from the base of the PR and between 711dc15 and 45fe37b.

📒 Files selected for processing (1)
  • docs/migration/migration-from-vercel-ai.md

📝 Walkthrough

Walkthrough

Added a docs navigation entry and a new, extensive migration guide documenting how to port projects from the Vercel AI SDK (ai, @ai-sdk/*) to TanStack AI, covering server/client APIs, tools, streaming, structured outputs, agents, middleware, and provider adapters.

Changes

| Cohort / File(s) | Summary |
|---|---|
| **Documentation navigation**<br>`docs/config.json` | Added a "From Vercel AI SDK" child under the Migration section pointing to the new migration guide. |
| **Migration guide**<br>`docs/migration/migration-from-vercel-ai.md` | Added a comprehensive migration document mapping Vercel AI SDK concepts to TanStack AI: installation, server/client API translations, streaming and SSE/HTTP adapters, structured outputs, tool/function-calling patterns, agent loop strategies, middleware lifecycle, provider adapters, cancellation, multimodal notes, examples, and links. |

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 I hopped from Vercel’s latest trail,

To TanStack fields where docs prevail.
With mappings, adapters, and a stream so bright,
I stitched the paths from night to light.
Happy migrations — hop along! 🥕

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

| Check name | Status | Explanation | Resolution |
|---|---|---|---|
| Description check | ⚠️ Warning | The description is largely incomplete, with most checklist items unchecked and the template section left as placeholder comments. | Complete the template by filling in the 'Changes' section with a detailed explanation and mark relevant checklist items as completed based on the actual work done. |

✅ Passed checks (2 passed)

| Check name | Status | Explanation |
|---|---|---|
| Title check | ✅ Passed | The title clearly and concisely describes the main change: adding a migration guide from the Vercel AI SDK to TanStack AI. |
| Docstring Coverage | ✅ Passed | No functions found in the changed files to evaluate docstring coverage; skipping the check. |

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing Touches
🧪 Generate unit tests (beta)
  • Create PR with unit tests
  • Commit unit tests in branch claude/migration-guide-versace-tanstack-NqpkX

Comment @coderabbitai help to get the list of available commands and usage tips.

Contributor

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

📜 Review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 29466c1 and 60e69d9.

📒 Files selected for processing (1)
  • docs/guides/migration-from-vercel-ai.md
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Implement framework integrations using the headless `tanstack/ai-client` for state management with framework-specific hooks (useChat) on top
📚 Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/index.ts : Export tree-shakeable adapters with clear subpath exports in package.json (e.g., `tanstack/ai/adapters`, `tanstack/ai-openai/adapters`) to minimize bundle size

Applied to files:

  • docs/guides/migration-from-vercel-ai.md
📚 Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Implement framework integrations using the headless `tanstack/ai-client` for state management with framework-specific hooks (useChat) on top

Applied to files:

  • docs/guides/migration-from-vercel-ai.md
🪛 markdownlint-cli2 (0.18.1)
docs/guides/migration-from-vercel-ai.md

614-614: Heading levels should only increment by one level at a time
Expected: h3; Actual: h4

(MD001, heading-increment)

🔇 Additional comments (7)
docs/guides/migration-from-vercel-ai.md (7)

1-928: Comprehensive migration guide is well-structured and thorough.

The document provides excellent coverage of the migration path with:

  • Clear before/after patterns for nearly every API surface
  • Consistent terminology and organization
  • Helpful rationale for architectural differences
  • Real-world examples (tools, multimodal, streaming)
  • Acknowledgment of removed features with workarounds

The main work is verifying the TanStack AI API surface matches what's documented. Once the above verification items are confirmed, this will be a high-quality migration resource.


403-425: The client tools API usage is correct. clientTools is properly exported from @tanstack/ai-client, accepts individual tool instances created with .client(), and integrates correctly with useChat via the tools parameter.


345-374: Tool definition API is correctly documented.

The toolDefinition, .server(), and .client() exports and patterns shown in the example match the actual TanStack AI implementation. No corrections needed.
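A rough sketch of the isomorphic pattern this comment confirms. The factory below is a local stand-in, not the `@tanstack/ai` export; only the `toolDefinition(...).server()` / `.client()` shape is taken from the review:

```typescript
// Local stand-in for the isomorphic toolDefinition pattern; real signatures
// live in @tanstack/ai. The description field is modeled as required, matching
// the later review note referencing tool-definition.ts:31.
function toolDefinition<In, Out>(opts: { name: string; description: string }) {
  return {
    ...opts,
    // .server() binds the server-side implementation
    server(handler: (input: In) => Out) {
      return { ...opts, side: 'server' as const, run: handler };
    },
    // .client() binds a browser-side implementation (for clientTools())
    client(handler: (input: In) => Out) {
      return { ...opts, side: 'client' as const, run: handler };
    },
  };
}

const getWeather = toolDefinition<{ city: string }, { tempC: number }>({
  name: 'getWeather',
  description: 'Look up the current temperature for a city',
});

// Toy handler: returns the city name length as a fake temperature.
const serverTool = getWeather.server(({ city }) => ({ tempC: city.length }));
```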


559-559: Remove streamToText() from the utilities list—this function does not exist in the TanStack AI SDK.

The utilities toStreamResponse, toServerSentEventsStream, and toHttpStream are properly exported from @tanstack/ai. Both fetchServerSentEvents and fetchHttpStream are available in @tanstack/ai-client. However, streamToText() does not appear in any official TanStack AI documentation or API references. Verify that this function is actually used in the migration guide, or correct the reference to use an appropriate alternative utility if needed.


451-466: The tool approval API structure is correct. The approval-requested state is valid, and tool-call parts with approval-requested state contain an approval.id property that should be passed to addToolApprovalResponse({ id, approved }).
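A small sketch of the approval flow described here, with locally defined types standing in for the real message-part shapes (only the `approval-requested` state and the `approval.id` → `addToolApprovalResponse({ id, approved })` hand-off are taken from the comment):

```typescript
// Shapes assumed from the review comment above; not the published types.
interface ToolCallPartSketch {
  type: 'tool-call';
  state: 'approval-requested' | 'input-complete';
  approval?: { id: string };
}

// Collect the payloads a client would pass to addToolApprovalResponse():
// only parts awaiting approval contribute, keyed by approval.id.
function approvalResponsesFor(parts: ToolCallPartSketch[], approved: boolean) {
  return parts
    .filter((p) => p.state === 'approval-requested' && p.approval !== undefined)
    .map((p) => ({ id: p.approval!.id, approved }));
}
```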


74-74: The migration guide uses correct adapter export names and signatures. Verification confirms:

  • openaiText(), openaiImage(), openaiSpeech(), anthropicText(), geminiText() are the canonical function names (not method syntax)
  • All are individual, tree-shakeable function exports from separate adapter files
  • All follow consistent function call patterns: adapterName('model-id')
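Illustratively, the adapter-factory shape can be mimicked with a local stand-in (the real `openaiText` export lives in `@tanstack/ai-openai`; the returned object's fields here are invented for the sketch):

```typescript
// Stand-in illustrating the tree-shakeable adapter shape the review confirms:
// one function per provider/modality, called as adapterName('model-id').
type AdapterSketch = { provider: string; modality: string; model: string };

const makeAdapter =
  (provider: string, modality: string) =>
  (model: string): AdapterSketch => ({ provider, modality, model });

// Real exports: openaiText from @tanstack/ai-openai, anthropicText from
// @tanstack/ai-anthropic, etc. These are toy equivalents.
const openaiText = makeAdapter('openai', 'text');
const anthropicText = makeAdapter('anthropic', 'text');
```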

934-937: All referenced documentation links are valid and correct.

Verification confirms that all four documentation files referenced in the help section exist at their specified relative paths:

  • `../getting-started/quick-start` → docs/getting-started/quick-start.md
  • `./tools` → docs/guides/tools.md
  • `./connection-adapters` → docs/guides/connection-adapters.md
  • `../api/ai` → docs/api/ai.md

No fixes needed.

Comment thread docs/migration/migration-from-vercel-ai.md
Comment thread docs/guides/migration-from-vercel-ai.md Outdated
Comment thread docs/migration/migration-from-vercel-ai.md
@tannerlinsley changed the title from "Create migration guide from Versace to TanStack" to "Create migration guide from AI SDK to TanStack" on Dec 23, 2025
Comment thread docs/guides/migration-from-vercel-ai.md Outdated
Comment thread docs/guides/migration-from-vercel-ai.md Outdated
```typescript
const stream = chat({
adapter: openaiText('gpt-4o'),
messages: [
```
Contributor


This is not correct; we pass it at the root as well. I think it's called `systemPrompts`.

Member Author


@claude fix this

Contributor


Fixed in ebd689c. You were right — chat() takes systemPrompts: string[] at the root. Verified against packages/typescript/ai/src/types.ts:572 and the chat options at packages/typescript/ai/src/activities/chat/index.ts. Also updated the final complete example (CodeRabbit flagged the same issue at line 928). The System Messages section now shows the multi-prompt form too, since systemPrompts is an array.
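For readers skimming the thread, a minimal sketch of the corrected shape. The types below are local stand-ins, not the published `chat()` options; only the root-level `systemPrompts: string[]` is taken from the fix:

```typescript
// Local stand-in mirroring the fix: systemPrompts is a string[] at the root,
// not a system-role message prepended to the messages array.
interface ChatOptionsSketch {
  systemPrompts?: string[];
  messages: Array<{ role: 'user' | 'assistant'; content: string }>;
}

const options: ChatOptionsSketch = {
  systemPrompts: [
    'You are a helpful assistant.',
    'Answer concisely.', // multiple prompts compose because it's an array
  ],
  messages: [{ role: 'user', content: 'Hi!' }],
};
```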

Comment thread docs/guides/migration-from-vercel-ai.md Outdated
| Vercel AI SDK | TanStack AI | Notes |
|--------------|-------------|-------|
| `api: '/api/chat'` | `connection: fetchServerSentEvents('/api/chat')` | Explicit connection adapter |
| `input`, `handleInputChange` | Manage state yourself | More control, less magic |
Contributor


This seems hallucinated?

Member Author


@claude What are you up to here?

Contributor


Fixed in ebd689c. You were right to flag this — the useChat API comparison was mixing Vercel v4 (input/handleInputChange/handleSubmit) with the TanStack headless-state philosophy, which made the claim "Same pattern" for append/reload/stop/setMessages misleading. Vercel AI SDK v5+ already moved to headless state (sendMessage, status, regenerate, DefaultChatTransport), so I rewrote the whole table against the current v5+ surface and the actually-exported useChat return from packages/typescript/ai-react/src/use-chat.ts (sendMessage, append, reload, stop, isLoading, setMessages, addToolResult, addToolApprovalResponse). The basic useChat example above the table was updated to match.
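Since TanStack's `useChat` leaves input state to the caller, the "manage state yourself" row can be sketched framework-free. Here `sendMessage` is any function with the shape the hook returns; the controller itself is illustrative, not part of either SDK:

```typescript
// Sketch of the headless input pattern replacing Vercel v4's
// input/handleInputChange/handleSubmit trio. The real useChat return surface
// is in packages/typescript/ai-react/src/use-chat.ts.
type SendMessage = (text: string) => void;

function makeInputController(sendMessage: SendMessage) {
  let input = '';
  return {
    value: () => input,
    // wire this to your framework's onChange / oninput
    onChange: (next: string) => {
      input = next;
    },
    // wire this to your form's submit handler
    submit: () => {
      if (input.trim()) sendMessage(input);
      input = '';
    },
  };
}
```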

@jherr
Contributor

jherr commented Jan 7, 2026

What's the status on this one? We have some Claude comments but Claude didn't do anything about it. :( Bad Claude, bad!

- Installation: add @ai-sdk/react to v5+ deps and update quick-reference
  table to show the v5 framework package names
- System prompts: use root-level systemPrompts: [...] instead of
  prepending a system message to the messages array (verified against
  packages/typescript/ai/src/types.ts)
- useChat API table: rewrite against current Vercel AI SDK v5+ API
  (sendMessage, status, regenerate, DefaultChatTransport) so the
  comparison is accurate rather than mixing v4/v5
- MessagePart: expand to full discriminated union with real field names
  (arguments/input/approval on tool-call, content on tool-result) and
  real ToolCallState values
- Fix nonexistent toStreamResponse references -> toServerSentEventsResponse
  (and add toHttpResponse where appropriate)
- Fix AbortController section heading (h4 -> h3, resolves MD001)
- Update tool schema section to note parameters -> inputSchema rename
  in AI SDK v5
- Tighten tool approval example with optional chaining and a note on
  arguments vs parsed input
@nx-cloud

nx-cloud Bot commented Apr 20, 2026

View your CI Pipeline Execution ↗ for commit 45fe37b

| Command | Status | Duration | Result |
|---|---|---|---|
| `nx run-many --targets=build --exclude=examples/**` | ✅ Succeeded | 3s | View ↗ |

☁️ Nx Cloud last updated this comment at 2026-04-20 17:39:11 UTC

@github-actions
Contributor

github-actions Bot commented Apr 20, 2026

🚀 Changeset Version Preview

No changeset entries found. Merging this PR will not cause a version bump for any packages.

@pkg-pr-new

pkg-pr-new Bot commented Apr 20, 2026

Open in StackBlitz

@tanstack/ai

npm i https://pkg.pr.new/@tanstack/ai@179

@tanstack/ai-anthropic

npm i https://pkg.pr.new/@tanstack/ai-anthropic@179

@tanstack/ai-client

npm i https://pkg.pr.new/@tanstack/ai-client@179

@tanstack/ai-code-mode

npm i https://pkg.pr.new/@tanstack/ai-code-mode@179

@tanstack/ai-code-mode-skills

npm i https://pkg.pr.new/@tanstack/ai-code-mode-skills@179

@tanstack/ai-devtools-core

npm i https://pkg.pr.new/@tanstack/ai-devtools-core@179

@tanstack/ai-elevenlabs

npm i https://pkg.pr.new/@tanstack/ai-elevenlabs@179

@tanstack/ai-event-client

npm i https://pkg.pr.new/@tanstack/ai-event-client@179

@tanstack/ai-fal

npm i https://pkg.pr.new/@tanstack/ai-fal@179

@tanstack/ai-gemini

npm i https://pkg.pr.new/@tanstack/ai-gemini@179

@tanstack/ai-grok

npm i https://pkg.pr.new/@tanstack/ai-grok@179

@tanstack/ai-groq

npm i https://pkg.pr.new/@tanstack/ai-groq@179

@tanstack/ai-isolate-cloudflare

npm i https://pkg.pr.new/@tanstack/ai-isolate-cloudflare@179

@tanstack/ai-isolate-node

npm i https://pkg.pr.new/@tanstack/ai-isolate-node@179

@tanstack/ai-isolate-quickjs

npm i https://pkg.pr.new/@tanstack/ai-isolate-quickjs@179

@tanstack/ai-ollama

npm i https://pkg.pr.new/@tanstack/ai-ollama@179

@tanstack/ai-openai

npm i https://pkg.pr.new/@tanstack/ai-openai@179

@tanstack/ai-openrouter

npm i https://pkg.pr.new/@tanstack/ai-openrouter@179

@tanstack/ai-preact

npm i https://pkg.pr.new/@tanstack/ai-preact@179

@tanstack/ai-react

npm i https://pkg.pr.new/@tanstack/ai-react@179

@tanstack/ai-react-ui

npm i https://pkg.pr.new/@tanstack/ai-react-ui@179

@tanstack/ai-solid

npm i https://pkg.pr.new/@tanstack/ai-solid@179

@tanstack/ai-solid-ui

npm i https://pkg.pr.new/@tanstack/ai-solid-ui@179

@tanstack/ai-svelte

npm i https://pkg.pr.new/@tanstack/ai-svelte@179

@tanstack/ai-vue

npm i https://pkg.pr.new/@tanstack/ai-vue@179

@tanstack/ai-vue-ui

npm i https://pkg.pr.new/@tanstack/ai-vue-ui@179

@tanstack/preact-ai-devtools

npm i https://pkg.pr.new/@tanstack/preact-ai-devtools@179

@tanstack/react-ai-devtools

npm i https://pkg.pr.new/@tanstack/react-ai-devtools@179

@tanstack/solid-ai-devtools

npm i https://pkg.pr.new/@tanstack/solid-ai-devtools@179

commit: 45fe37b

@AlemTuzlak
Contributor

Addressed the review feedback in ebd689c. Summary of what changed and why, after verifying every API claim against the current TanStack AI source and the Vercel AI SDK v5+ docs:

Human feedback (Alem + Tanner)

  • Line 39 — ai/react missing. Added @ai-sdk/react to the v5+ install command. The Quick Reference table rows for framework hooks now point at the actual v5+ packages (@ai-sdk/react, @ai-sdk/vue, @ai-sdk/solid, @ai-sdk/svelte) with a note about the v4 → v5 subpath move.
  • Line 150 — System prompt wrong. chat() takes systemPrompts: string[] at the root (verified in packages/typescript/ai/src/types.ts:572). Replaced the prepended-system-message pattern in both the System Messages section and the Complete Example. Added an example of multi-prompt composition since it's an array.
  • Line 231 — "This seems hallucinated?". The useChat API table was comparing Vercel v4 magic (input/handleInputChange/handleSubmit) with TanStack's headless model, then claiming append/reload/stop/setMessages were "Same pattern". Vercel v5+ already went headless, so the table is rewritten against the v5+ surface and the actual TanStack useChat return from packages/typescript/ai-react/src/use-chat.ts (sendMessage, append, reload, stop, isLoading, setMessages, addToolResult, addToolApprovalResponse).

CodeRabbit feedback

  • MessagePart type incomplete (line 266). Expanded to the real discriminated union from packages/typescript/ai/src/types.ts:248-287 and packages/typescript/ai-client/src/types.ts:32-132: text, tool-call (with id, name, arguments JSON string, optional parsed input, state, approval, output), tool-result (with toolCallId, content, state, error), and thinking. Documented real ToolCallState values (awaiting-input | input-streaming | input-complete | approval-requested | approval-responded) and ToolResultState values. Added a note clarifying TanStack does not have separate reasoning/source-url/source-document/file part types.
  • Heading level jump (line 614, MD001). AbortController subsections now use ### like every other section.
  • systemPrompts in final example (line 928). Fixed in the same pass as the System Messages section.
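Sketched as TypeScript, the discriminated union described above might look like the following. Field names follow the summary in this comment and should be checked against `packages/typescript/ai/src/types.ts` before relying on them:

```typescript
// Sketch of the MessagePart union from the review summary; treat field names
// as assumptions until verified against the source.
type ToolCallState =
  | 'awaiting-input'
  | 'input-streaming'
  | 'input-complete'
  | 'approval-requested'
  | 'approval-responded';

type MessagePart =
  | { type: 'text'; content: string }
  | {
      type: 'tool-call';
      id: string;
      name: string;
      arguments: string; // raw (possibly streaming) JSON string
      input?: unknown; // parsed form; ai-client projection only
      state: ToolCallState;
      approval?: { id: string };
    }
  | { type: 'tool-result'; toolCallId: string; content: unknown; error?: string }
  | { type: 'thinking'; content: string };

// Type guard narrowing to tool calls awaiting approval.
function isApprovalRequested(
  p: MessagePart,
): p is Extract<MessagePart, { type: 'tool-call' }> {
  return p.type === 'tool-call' && p.state === 'approval-requested';
}
```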

Other corrections found while verifying

  • toStreamResponse does not exist in @tanstack/ai. The real exports are streamToText, toServerSentEventsStream/toServerSentEventsResponse, toHttpStream/toHttpResponse (see packages/typescript/ai/src/stream-to-response.ts and packages/typescript/ai/src/index.ts:56-60). Replaced every toStreamResponse reference, including in the Streaming Responses section, the Complete Example, and the Basic Text Generation example. Also added toHttpResponse as a documented option.
  • Before-examples for Vercel now reflect v5+ (@ai-sdk/react, DefaultChatTransport, sendMessage, status, maxOutputTokens, providerOptions, inputSchema on tool()) with a v4 note where naming changed.
  • Tool approval example tightened: uses optional chaining, narrows with part.approval, and notes the difference between raw arguments (streaming JSON string) and parsed input.

I already pushed the commit; the PR is green for re-review.
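For intuition, the wire format that SSE response helpers like `toServerSentEventsResponse` produce can be sketched with a toy framer. The real helpers live in `@tanstack/ai` (stream-to-response.ts); this stand-in only shows the `data:` framing, and the `{ delta }` payload shape is invented:

```typescript
// Toy SSE framer: each chunk becomes one "data: <json>\n\n" event.
// This is NOT the @tanstack/ai implementation, just the framing idea.
function toServerSentEventsBody(chunks: string[]): string {
  return chunks.map((c) => `data: ${JSON.stringify({ delta: c })}\n\n`).join('');
}
```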

# Conflicts:
#	docs/tools/migration-from-vercel-ai.md
…rage

- Add exhaustive streamText -> chat() option mapping table covering
  every AI SDK v6 parameter (tools, toolChoice, activeTools, stopWhen,
  prepareStep, experimental_transform/context/telemetry/repairToolCall,
  all sampling controls, abort, headers, providerOptions -> modelOptions)
- Add streamText result -> TanStack equivalent table (textStream,
  fullStream, text, usage, finishReason, steps, toUIMessageStreamResponse,
  pipeTextStreamToResponse, consumeStream, etc.)
- Expand Generation Options with topK/presence/frequency/seed/stop under
  modelOptions, clarify flat typed modelOptions vs provider-keyed
  providerOptions
- New section: Structured Output (generateObject / streamObject / v6
  Output.object) -> outputSchema on chat(); notes on Standard Schema
  libraries, provider strategies, and the current gap for partial
  object streaming
- New section: Agent Loop Control — stopWhen / hasToolCall / stepCountIs
  mapped to maxIterations / untilFinishReason / combineStrategies, and
  prepareStep mapped to middleware onConfig/onIteration
- New section: Middleware — wrapLanguageModel + experimental_transform
  mapped to a single ChatMiddleware array; full hook inventory;
  toolCacheMiddleware usage; common-pattern mapping table
- New section: Observability — where to plug logging/metrics/tracing
- Update generateText coverage to chat({ stream: false }) returning a
  real Promise<string> (not just streamToText)
- Update Tool Approval "Before" to show AI SDK v6's native needsApproval
  + sendAutomaticallyWhen flow; the two APIs are now symmetric
- Reframe "Removed Features" -> "Features Not Yet Covered" and scope
  it to embeddings, partial-object streaming, built-in retries/timeouts
- Update frontmatter for the docs/migration/ location (order, description,
  keywords); fix cross-links to the new directory layout
  (../advanced/middleware, ../chat/structured-outputs, etc.)
Factual corrections verified against source:
- Multimodal image source shape uses { type: 'url'|'data', value, mimeType }
  not { url, base64, mediaType } (types.ts:142-183)
- toolCacheMiddleware is exported from @tanstack/ai/middlewares, not the
  root (packages/typescript/ai/src/middlewares/index.ts)
- toolDefinition({ description }) is required; add it to the two doc
  examples that were missing it (tool-definition.ts:31)
- stream() connection adapter factory is (messages, data?) with no
  signal arg; rewrite custom-adapter example (connection-adapters.ts:441)

AI SDK v6 accuracy:
- addToolResult -> addToolOutput (v6 rename)
- experimental_output -> output (de-experimentalized)
- Soften "replaced" claim about generateObject/streamObject — they are
  deprecated, not removed
- Vercel addToolApprovalResponse row: v6 has this; replace "N/A"
- First Basic Text Generation Before example now uses v5+ API
  (convertToModelMessages + toUIMessageStreamResponse) with a v4
  toDataStreamResponse callout

Consistency:
- Agent-loop tables reconciled: only one truly built-in strategy
  (maxIterations / untilFinishReason / combineStrategies); hasToolCall
  requires a custom AgentLoopStrategy. Both tables now agree.
- prepareStep Before/After actually demonstrates equivalent behavior:
  Before shows step-level config tweak, After uses onConfig;
  mid-loop model switching split into its own subsection with the
  two-chat pattern the prose describes
- Message Structure section qualifies that ToolCallPart.input is the
  ai-client projection (server-side reads arguments directly)
- toHttpStream/Response comment in client connection example clarified
- Complete Example clarifies why convertToModelMessages disappears in
  the After (chat() accepts UI messages directly)
- clientTools() auto-execution comment expanded to state that no
  onToolCall/addToolOutput call is needed
- Anchor slug for Structured Output simplified to #structured-output

Rot hygiene:
- "current releases" removed from v5/v6 note
- "Every option" softened to "Options accepted ... as of AI SDK v6"
- "now expose" / "AI SDK v6 offers" / "v6 consolidated" reworded to
  avoid tense decay across future releases
@AlemTuzlak
Contributor

Ran the CR loop on the expanded guide (3 specialized review agents, 2 rounds).

Round 1 → 15 actionable findings, all fixed in 711dc15

Factual corrections against the TanStack AI source:

  • Multimodal image source shape is { type: 'url'|'data', value, mimeType }, not { url/base64/mediaType } (verified at packages/typescript/ai/src/types.ts:142-183).
  • toolCacheMiddleware is exported from @tanstack/ai/middlewares, not @tanstack/ai (subpath export in packages/typescript/ai/package.json).
  • toolDefinition({ description }) is required, not optional (tool-definition.ts:31) — two examples were missing it.
  • stream() connection-adapter factory takes (messages, data?), no signal (connection-adapters.ts:441).
  • ToolCallPart.input (parsed) only exists on the ai-client projection; core reads arguments (raw JSON). Message Structure section now scoped to ai-client.

AI SDK v6 accuracy:

  • `addToolResult` → `addToolOutput` (v6 rename, verified at chatbot-tool-usage).
  • `experimental_output` → `output` (de-experimentalized; verified at the generate-text reference).
  • Softened "replaced" claim about generateObject/streamObject — they're deprecated but still present in v6.
  • Vercel addToolApprovalResponse row was marked N/A in the useChat table but used in the Tool Approval Before example; table fixed.
  • First Basic Text Generation Before now uses v5+ API (convertToModelMessages + toUIMessageStreamResponse()) with a v4 note.

Internal consistency:

  • Agent-loop tables reconciled: maxIterations / untilFinishReason / combineStrategies are the built-in strategies; hasToolCall needs a one-line custom strategy. Both tables now say the same thing.
  • prepareStep Before/After rewritten so the two snippets are actually equivalent. Mid-loop model switching split into a subsection with the two-chat pattern the prose describes.
  • Complete Example explains why convertToModelMessages disappears in the After.
  • clientTools() "// Auto-executed" comment expanded to make explicit that no onToolCall/addToolOutput handler is needed.
  • toHttpStream/toHttpResponse pairing clarified on the client side.
  • Anchor slug for Structured Output simplified to `#structured-output` (avoids slugifier ambiguity with punctuation-heavy heading).

Rot hygiene:

  • "current releases" / "now expose" / "AI SDK v6 offers" / "v6 consolidated" reworded so the guide doesn't silently decay across future AI SDK releases.

False positive from the review pass: one agent flagged "Embeddings: Not Yet Covered" as factually wrong because CLAUDE.md mentioned openaiEmbed / embedding(). I verified nothing with those names exists in packages/typescript/ai/src/ or packages/typescript/ai-openai/src/ today. Left the doc claim as-is.

Round 2 → 0 findings. Each round-1 fix verified against the source; fresh scan found no regressions. Ready for re-review.

Both helpers accept ResponseInit & { abortController }, so custom headers,
status, and cancellation flow through the helpers directly. Drop the
hand-rolled `new Response(toServerSentEventsStream(...), { headers: {...} })`
example and keep the raw stream helpers only for the genuine "pipe elsewhere"
case.
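The helper-options shape this commit describes can be sketched with local stand-in types (the real helpers accept `ResponseInit & { abortController }`; the merge logic below is illustrative only):

```typescript
// Local stand-ins: the real helpers take ResponseInit & { abortController }.
type SseInitSketch = {
  status?: number;
  headers?: Record<string, string>;
  abortController?: { signal: unknown };
};

// Sketch of how an SSE response helper might fold caller options into the
// final init; the real helper would also wire abortController.signal into
// stream teardown.
function buildSseInit(init: SseInitSketch) {
  const { abortController, ...rest } = init;
  return {
    status: rest.status ?? 200,
    headers: { 'content-type': 'text/event-stream', ...rest.headers },
  };
}
```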
@tannerlinsley tannerlinsley merged commit 3681c9e into main Apr 20, 2026
8 checks passed
@tannerlinsley tannerlinsley deleted the claude/migration-guide-versace-tanstack-NqpkX branch April 20, 2026 18:30