TanStack AI version
0.8.1
Framework/Library version
TanStack Start v1.154.7, Node.js
Describe the bug and the steps to reproduce it
The `@tanstack/ai-ollama` adapter's `mapCommonOptionsToOllama()` method does not forward the `systemPrompts` field to the Ollama API. The core `TextEngine.streamModelResponse()` correctly passes `systemPrompts` in the options object to `adapter.chatStream()`, but the Ollama adapter only maps `messages`, `model`, `options`, and `tools`; `options.systemPrompts` is silently discarded.
This means calling `chat()` with `systemPrompts` has no effect when using the Ollama adapter:
```ts
import { chat } from '@tanstack/ai';
import { createOllamaChat } from '@tanstack/ai-ollama';

const stream = chat({
  adapter: createOllamaChat('llama3', 'http://localhost:11434'),
  messages: [{ role: 'user', content: 'Who are you?' }],
  systemPrompts: ['You are a helpful pirate assistant. Always respond in pirate speak.'],
});

// Model responds normally; the system prompt is never delivered to Ollama.
```
**Expected:** The model responds in pirate speak, because the system prompt was forwarded.
**Actual:** The model responds normally, with no awareness of the system prompt.
Root cause
In `packages/typescript/ai-ollama/src/adapters/text.ts`, the `mapCommonOptionsToOllama()` method constructs the `ChatRequest` without including `systemPrompts`. Other adapters handle this correctly; for example, the Anthropic adapter forwards system prompts via `system: options.systemPrompts?.join('\n')`.
The Ollama `ChatRequest` type supports a `system` string field, so the fix is a one-line addition:
```ts
...(options.systemPrompts?.length
  ? { system: options.systemPrompts.join('\n') }
  : {}),
```
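For context, here is a rough sketch of where the addition lands. The surrounding fields paraphrase the mapping described above (`messages`, `model`, `options`, `tools`); the actual method signature and field handling in the adapter may differ:

```ts
// Sketch only: everything except the `system` spread paraphrases the
// existing mapping; refer to the adapter source for the real shape.
function mapCommonOptionsToOllama(options: {
  model: string;
  messages: Array<{ role: string; content: string }>;
  options?: Record<string, unknown>;
  tools?: Array<unknown>;
  systemPrompts?: Array<string>;
}) {
  return {
    model: options.model,
    messages: options.messages,
    options: options.options,
    tools: options.tools,
    // Proposed fix: join systemPrompts into Ollama's `system` field.
    ...(options.systemPrompts?.length
      ? { system: options.systemPrompts.join('\n') }
      : {}),
  };
}
```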
I will be submitting a PR with this fix.
Workaround
Prepend the system prompt as a `{ role: 'system' }` message in the `messages` array instead of using `systemPrompts`. This works because Ollama's API treats system-role messages as system instructions, but it bypasses the TanStack AI `systemPrompts` contract.
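For example, reusing the reproduction from above (same `createOllamaChat` call; only the message array changes):

```ts
import { chat } from '@tanstack/ai';
import { createOllamaChat } from '@tanstack/ai-ollama';

const stream = chat({
  adapter: createOllamaChat('llama3', 'http://localhost:11434'),
  messages: [
    // Delivered verbatim; Ollama honors system-role messages directly.
    { role: 'system', content: 'You are a helpful pirate assistant. Always respond in pirate speak.' },
    { role: 'user', content: 'Who are you?' },
  ],
});
```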
Your Minimal, Reproducible Example - (Sandbox Highly Recommended)
The bug is visible by reading `mapCommonOptionsToOllama()` in `packages/typescript/ai-ollama/src/adapters/text.ts`; it never accesses `options.systemPrompts`. No external reproduction is needed.
Screenshots or Videos (Optional)
No response
Do you intend to try to help solve this bug with your own PR?
Yes, I am also opening a PR that solves the problem alongside this issue
Terms & Code of Conduct