feat: Complete DeepSeek API adapter implementation #11130

dixoxib wants to merge 44 commits into continuedev:main from
Conversation
…model provider updates
- Add DeepSeek model provider implementation with proper FIM support
- Implement DeepSeek API adapter with OpenAI-compatible interface
- Add tool call support and thinking mode integration
- Update model provider configuration and onboarding
- Add comprehensive type definitions and validation
- Update documentation for DeepSeek model capabilities
- Fix file rename from Deepseek.ts to DeepSeek.ts for consistency
… DeepSeek integration
… comprehensive tests
- Enhance DeepSeek provider with improved FIM support
- Update onboarding configuration for DeepSeek
- Refactor token counting and autodetection logic
- Improve system tool call interception
- Streamline chat response streaming
- Update UI and extension components for better integration
- Refactor token counting logic
- Enhance DeepSeek provider capabilities
- Update chat and edit templates
- Improve system tool call interception
- Streamline API adapters and converters
- Add unit test enhancements
Integrate latest upstream changes including:
- zAI provider support
- Background job service
- CLI tool improvements
- Updated model configurations
- Maintain DeepSeek integration
- Rewrite setupProviderConfig to use direct model config instead of uses/with syntax
- Add roles property to model config schema
- Improve DeepSeek message conversion with toChatBody
- Update conversationCompaction imports
- Add roles field to config schema for VS Code
- Update DeepSeek provider description
- Add debug logging to DeepSeekApi adapter

- Simplify DeepSeek onboarding config to use direct model objects
- Improve token counting for DeepSeek models (reserve full maxTokens)
- Fix FIM support tests for DeepSeek models
- Adjust maxTokens for DeepSeek Reasoner to 32k (API limit)
- Update DeepSeek provider info with correct context lengths
- Enhance DeepSeek converters to handle max_completion_tokens
- Clean up imports and schema definitions
- Update documentation with accurate DeepSeek capabilities
- Remove debug console logs from DeepSeek adapter
I have read the CLA Document and I hereby sign the CLA. 1 out of 2 committers have signed the CLA.
6 issues found across 35 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="core/config/yaml/models.ts">
<violation number="1" location="core/config/yaml/models.ts:55">
P1: Forcing `contextLength` to `DEFAULT_CONTEXT_LENGTH` in YAML model conversion can override provider runtime autodetection (e.g., Ollama), causing incorrect context windows for unknown/autodetected models.</violation>
</file>
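The fix this finding points toward can be sketched as follows. This is a hypothetical TypeScript sketch, not the repository's actual code: `resolveContextLength` and its parameters are invented names, and `DEFAULT_CONTEXT_LENGTH` is assumed from the review comment. The idea is to apply the default only as a last resort, so runtime autodetection can still win.

```typescript
// Assumed default from the review; the real constant lives in core config.
const DEFAULT_CONTEXT_LENGTH = 4096;

// Only honor an explicit YAML value; otherwise defer to provider
// autodetection (e.g. Ollama), falling back to the default last.
function resolveContextLength(
  yamlValue?: number,
  autodetected?: number,
): number {
  return yamlValue ?? autodetected ?? DEFAULT_CONTEXT_LENGTH;
}
```

With this ordering, forcing the default at YAML-conversion time can no longer mask a provider-reported context window.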
<file name="packages/openai-adapters/src/types.ts">
<violation number="1" location="packages/openai-adapters/src/types.ts:63">
P1: `deepseek` was added to `OpenAIConfigSchema` even though `DeepseekConfigSchema` already handles that discriminator in `LLMConfigSchema`, creating a duplicate discriminator value in the same `z.discriminatedUnion`.</violation>
</file>
<file name="packages/openai-adapters/src/apis/DeepSeek.ts">
<violation number="1" location="packages/openai-adapters/src/apis/DeepSeek.ts:317">
P2: FIM completions are beta-only and require the beta base URL. Building the endpoint as `/completions` against the default base (`https://api.deepseek.com/`) no longer targets the beta API, so FIM will fail unless callers override apiBase manually.</violation>
</file>
<file name="packages/openai-adapters/src/util/deepseek-converters.ts">
<violation number="1" location="packages/openai-adapters/src/util/deepseek-converters.ts:327">
P2: `top_logprobs` is included even when `logprobs` is false/undefined, but DeepSeek requires `logprobs=true` whenever `top_logprobs` is used. This can generate invalid requests that the API rejects.</violation>
</file>
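The constraint the reviewer describes (per the review, `top_logprobs` is only valid when `logprobs` is true) can be enforced with a small guard when building the request body. This is an illustrative sketch; `logprobFields` and `LogprobParams` are hypothetical names, not the converter's actual API.

```typescript
interface LogprobParams {
  logprobs?: boolean;
  top_logprobs?: number;
}

// Build only the logprob-related request fields; top_logprobs is
// forwarded solely when logprobs is explicitly enabled.
function logprobFields(params: LogprobParams): Record<string, unknown> {
  const fields: Record<string, unknown> = {};
  if (params.logprobs) {
    fields.logprobs = true;
    if (params.top_logprobs !== undefined) {
      fields.top_logprobs = params.top_logprobs;
    }
  }
  return fields;
}
```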
<file name="core/llm/llms/DeepSeek.ts">
<violation number="1" location="core/llm/llms/DeepSeek.ts:73">
P2: Non-streaming chat ignores top-level options.model, unlike streamChat/streamFim, causing inconsistent model selection.</violation>
<violation number="2" location="core/llm/llms/DeepSeek.ts:217">
P2: supportsPrefill() reports true unconditionally even though DeepSeek prefill is beta-only. This misreports capability on non‑beta API bases and can enable beta-only prompt paths against non‑beta endpoints.</violation>
</file>
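Both findings in this file suggest small, localized fixes. The sketch below is an assumption about how they could look, not the actual implementation: `resolveModel` is an invented helper, and treating only the official `api.deepseek.com` base as beta-capable is carried over from the review's reasoning.

```typescript
// Honor a top-level options.model, matching streamChat/streamFim behavior.
function resolveModel(
  optionsModel: string | undefined,
  instanceModel: string,
): string {
  return optionsModel ?? instanceModel;
}

// Prefill is beta-only per the review, so report support only when the
// configured base can reach the beta endpoints.
function supportsPrefill(apiBase: string): boolean {
  return apiBase.includes("api.deepseek.com");
}
```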
Since this is your first cubic review, here's how it works:
- cubic automatically reviews your code and comments on bugs and improvements
- Teach cubic by replying to its comments. cubic learns from your replies and gets better over time
- Add one-off context when rerunning by tagging @cubic-dev-ai with guidance or docs links (including llms.txt)
- Ask questions if you need clarification on any suggestion
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
…plicate deepseek discriminator
I have read the CLA Document and I hereby sign the CLA.
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
…roviders
- Remove trailing slash normalization in DeepSeekApi constructor
- Update test expectations to match actual behavior
- All tests pass, including main integration test
2 issues found across 2 files (changes from recent commits).
<file name="packages/openai-adapters/src/apis/DeepSeek.ts">
<violation number="1" location="packages/openai-adapters/src/apis/DeepSeek.ts:50">
P2: apiBase trailing-slash normalization was removed; relative URL construction can drop path segments for custom bases like https://host/v1, misrouting requests.</violation>
<violation number="2" location="packages/openai-adapters/src/apis/DeepSeek.ts:316">
P2: FIM is a beta feature that requires the beta base URL; building the endpoint as `new URL("completions", this.apiBase)` sends FIM requests to the non‑beta `/completions` path when using the default base URL, which breaks FIM. Restore beta-aware routing or force the beta base URL for FIM calls.</violation>
</file>
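The trailing-slash finding above comes down to WHATWG URL relative resolution: when the base path lacks a trailing slash, its last segment is treated as file-like and replaced rather than extended. A quick standalone demonstration:

```typescript
// Without a trailing slash, "v1" is dropped during relative resolution.
const withoutSlash = new URL("completions", "https://host/v1").toString();

// With a trailing slash, the relative path is appended under /v1/.
const withSlash = new URL("completions", "https://host/v1/").toString();

console.log(withoutSlash); // "https://host/completions"
console.log(withSlash); // "https://host/v1/completions"
```

This is why removing the adapter's apiBase normalization can silently misroute requests for custom bases like `https://host/v1`.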
- Update test expectations to match actual regex logic
- deepseek-chat matches /deepseek/ and /r1|reasoner|-chat/ regex combo
- Keep deepseek-coder as non-recommended

- Fix code style issues in DeepSeek.ts and DeepSeekApi.test.ts
- Ensure consistent formatting before PR merge
…licit type annotation from static defaultOptions to ensure proper inheritance
- Remove overridden chat() method that bypassed adapter disable in tests
- Tests now pass: default API base correctly detected and chat requests use mocked fetch
- Move applyToolOverrides import to avoid potential circular dependencies
- Filter chat() accumulation to only assistant messages (ignore thinking messages)
- Maintains backward compatibility with existing functionality
1 issue found across 2 files (changes from recent commits).
<file name="packages/openai-adapters/src/apis/DeepSeek.ts">
<violation number="1" location="packages/openai-adapters/src/apis/DeepSeek.ts:123">
P2: Chat endpoint base normalization is applied to all chat requests, causing `/beta/`-scoped `apiBase` values to be stripped for normal chat and potentially breaking custom `/beta/` deployments.</violation>
</file>
…for custom "/beta/" deployments.
2 issues found across 5 files (changes from recent commits).
<file name="packages/openai-adapters/src/apis/DeepSeek.ts">
<violation number="1" location="packages/openai-adapters/src/apis/DeepSeek.ts:123">
P1: Removed `/beta/` base normalization causes malformed or incorrect chat endpoints for clients configured with `apiBase` ending in `/beta/`.</violation>
<violation number="2" location="packages/openai-adapters/src/apis/DeepSeek.ts:321">
P1: FIM endpoint now double-prefixes `beta` when `apiBase` already includes `/beta/`, breaking beta-configured clients.</violation>
</file>
const endpoint = new URL(
  isPrefixCompletion ? "beta/chat/completions" : "chat/completions",
  this.apiBase,
P1: Removed /beta/ base normalization causes malformed or incorrect chat endpoints for clients configured with apiBase ending in /beta/.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At packages/openai-adapters/src/apis/DeepSeek.ts, line 123:
<comment>Removed `/beta/` base normalization causes malformed or incorrect chat endpoints for clients configured with `apiBase` ending in `/beta/`.</comment>
<file context>
@@ -120,7 +120,7 @@ export class DeepSeekApi extends OpenAIApi {
const endpoint = new URL(
isPrefixCompletion ? "beta/chat/completions" : "chat/completions",
- this.apiBase.endsWith("/beta/") ? this.apiBase.slice(0, -5) : this.apiBase,
+ this.apiBase,
);
</file context>
Suggested change:
- this.apiBase,
+ this.apiBase.endsWith("/beta/") ? this.apiBase.slice(0, -5) : this.apiBase,
"/beta/…" is an unique prefill endpoint of the DeepSeek API. If you want to remove it for custom deployments, an apiBase provider check could be added
const apiBaseBeta = apiBase.contains("api.deepseek.com") ? apiBase + "beta/" : apiBase;
…
new URL( isPrefixCompletion ? apiBaseBeta : apiBase, …
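A runnable version of that suggestion might look like the sketch below. This is a hypothetical helper, not the PR's code: `chatEndpoint` is an invented name, and restricting the beta prefix to the official `api.deepseek.com` host is the assumption carried over from the comment.

```typescript
// Route prefix-completion (prefill) chats through the beta path only on
// the official API host; custom deployments keep their own base untouched.
function chatEndpoint(apiBase: string, isPrefixCompletion: boolean): URL {
  const useBeta = isPrefixCompletion && apiBase.includes("api.deepseek.com");
  return new URL(
    useBeta ? "beta/chat/completions" : "chat/completions",
    apiBase,
  );
}
```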
- ): AsyncGenerator<ChatCompletionChunk, any, unknown> {
+ ): AsyncGenerator<ChatCompletionChunk> {
    const warnings: string[] = [];
    const endpoint = new URL("beta/completions", this.apiBase);
P1: FIM endpoint now double-prefixes beta when apiBase already includes /beta/, breaking beta-configured clients.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At packages/openai-adapters/src/apis/DeepSeek.ts, line 321:
<comment>FIM endpoint now double-prefixes `beta` when `apiBase` already includes `/beta/`, breaking beta-configured clients.</comment>
<file context>
@@ -318,10 +318,7 @@ export class DeepSeekApi extends OpenAIApi {
- this.apiBase.endsWith("/beta/") ? "completions" : "beta/completions",
- this.apiBase,
- );
+ const endpoint = new URL("beta/completions", this.apiBase);
const deepSeekBody = convertToFimDeepSeekRequestBody(body, warnings);
</file context>
Suggested change:
- const endpoint = new URL("beta/completions", this.apiBase);
+ const endpoint = new URL(
+   this.apiBase.endsWith("/beta/") ? "completions" : "beta/completions",
+   this.apiBase,
+ );
"/beta/completion" is an unique endpoint of the DeepSeek FIM API. If you want to remove it for custom deployments, an apiBase provider check could be added
const apiBaseBeta = apiBase.contains("api.deepseek.com") ? apiBase + "beta/" : apiBase; // or even "beta/completion" to allow custom "fim" paths
…
new URL( apiBaseBeta, …
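Combining the reviewer's double-prefix fix with the custom-deployment check above could look like this sketch. `fimEndpoint` is a hypothetical helper under the same assumption as before (only `api.deepseek.com` is treated as beta-capable); it is not the adapter's actual code.

```typescript
// Build the FIM endpoint without double-prefixing "beta/" for bases that
// already end in /beta/, and without forcing the beta path on custom hosts.
function fimEndpoint(apiBase: string): URL {
  if (!apiBase.includes("api.deepseek.com")) {
    // Custom deployments choose their own completion path.
    return new URL("completions", apiBase);
  }
  return new URL(
    apiBase.endsWith("/beta/") ? "completions" : "beta/completions",
    apiBase,
  );
}
```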
1 issue found across 2 files (changes from recent commits).
<file name="core/llm/countTokens.ts">
<violation number="1" location="core/llm/countTokens.ts:464">
P1: Prefill token budgeting double-counts `msgsCopy` as both non-negotiable and prunable history, causing incorrect pruning and possible premature context overflow errors.</violation>
</file>
- Fix double-counting of tokens in prefill scenarios that caused incorrect pruning
- Treat last assistant message as non-negotiable instead of entire conversation
- Prevents 'Error parsing chat history: no user/tool message found' during edits
- Update DeepSeek token multiplier from 1.00 to 1.05

Resolves issue where edit requests with DeepSeek and other prefill-enabled models would fail due to incorrect token budgeting in compileChatMessages.
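The budgeting fix described in that commit can be illustrated with a self-contained sketch. Everything here is hypothetical: `budgetMessages` is an invented stand-in for the real compileChatMessages logic, and `countTokens` is a crude character-based tokenizer used only for the demonstration. The key point is that the trailing assistant (prefill) message is counted once, against the fixed budget, and never again as prunable history.

```typescript
type Msg = { role: "user" | "assistant" | "system"; content: string };

// Stand-in tokenizer for illustration only (~4 chars per token).
const countTokens = (s: string): number => Math.ceil(s.length / 4);

function budgetMessages(msgs: Msg[], maxTokens: number): Msg[] {
  // A trailing assistant message is a prefill: non-negotiable.
  const last = msgs[msgs.length - 1];
  const prefill = last?.role === "assistant" ? [last] : [];
  const history = prefill.length ? msgs.slice(0, -1) : msgs.slice();

  // Deduct the prefill cost exactly once; the bug was counting it in
  // both the fixed and the prunable budgets.
  let budget =
    maxTokens - prefill.reduce((n, m) => n + countTokens(m.content), 0);

  // Keep the newest history messages that still fit.
  const kept: Msg[] = [];
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = countTokens(history[i].content);
    if (cost > budget) break;
    kept.unshift(history[i]);
    budget -= cost;
  }
  return [...kept, ...prefill];
}
```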
@cubic-dev-ai something went wrong, re-run a review.

@dixoxib I have started the AI code review. It will take a few minutes to complete.
Description
Enhanced DeepSeek API adapter from a minimal 73‑line implementation to a complete, production‑ready adapter with all features expected from a modern LLM provider integration following official docs.
Key improvements:
- /models endpoint
- chat() (function seems not fully implemented yet)

Files changed: 39 files, including core implementation, tests, documentation, and configuration updates, small fixes.
AI Code Review
@continue-review

Checklist
Screen recording or screenshot
Key demonstrations:
- reasoning_content in responses

sv.mp4
Tests
Added/updated tests:
- packages/openai-adapters/src/test/deepseek-converters.test.ts – Comprehensive unit tests for converter functions
- core/llm/llms/DeepSeek.unit.test.ts – Unit tests for DeepSeek provider class
- core/llm/llms/DeepSeek.vitest.ts – Integration tests
- packages/openai-adapters/src/test/DeepSeekApi.test.ts – API adapter test
- core/llm/llms/DeepSeek.tools.test.ts – Thinking tool chain test

The enhancement addresses limitations in the current minimal implementation and enables full DeepSeek functionality including Agent mode with tool calling for the Reasoner model.
I have read the CLA Document and I hereby sign the CLA
Summary by cubic
Complete DeepSeek provider with chat, reasoning, tools, and FIM (beta) support, plus onboarding/docs and UI updates. Improves assistant-only streaming, usage/cost reporting (cache hit/miss), token counting with a prefill assistant-last fix and 1.05 multiplier, and preserves apiBase with correct FIM /beta endpoint handling.

New Features
- reasoning_content, thinking→tool call pairing, tool choice, model autodetect, and OS-models edit template.
- FIM via deepseek-fim-beta only; supportsFim true for FIM beta; FIM uses /beta/completions at request time (no /beta in apiBase).
- maxTokens for DeepSeek, 1.05 token multiplier, improved pruning, streamed usage stats, DeepSeek pricing with cache hit/miss; fixed token counting in prefill (assistant-last) edits to prevent errors.
- apiBase preserved exactly and defaulted; added llm-info entries, onboarding/UI and VS Code updates; packages/openai-adapters DeepSeek adapter, converters, and tests.

Migration
- Use provider: deepseek with models: deepseek-chat, deepseek-reasoner, deepseek-fim-beta (for FIM).
- apiBase defaults to https://api.deepseek.com/; do not include /beta (the FIM endpoint adds it automatically). Set maxTokens (chat: 8k, reasoner: 32k).
- Replace deepseek-coder with the new DeepSeek class and models; provider ID remains deepseek.

Written for commit 129334f. Summary will update on new commits.