Fork of anomalyco/opencode with patches for LiteLLM proxy compatibility.
**Problem:** When using Gemini models through a LiteLLM proxy (e.g., Letsur Gateway), the agent loop terminates after the first response instead of continuing with tool calls.
**Cause:** In streaming mode, LiteLLM returns `finish_reason: "stop"` instead of `"tool_calls"` (LiteLLM #12240).
**Fix:** Detect tool calls by checking the message parts, regardless of `finish_reason`.
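The idea, as a minimal TypeScript sketch (the part shapes and the `shouldContinueWithTools` helper are illustrative assumptions, not the actual patched opencode code): treat the streamed message parts as the source of truth and keep the agent loop running whenever a tool call is present, even if the proxy reported `finish_reason: "stop"`.

```typescript
// Sketch only: these types approximate a streamed assistant message;
// the real opencode types and the patched file differ.
type ToolCallPart = { type: "tool-call"; toolName: string; args: unknown };
type TextPart = { type: "text"; text: string };
type MessagePart = ToolCallPart | TextPart;

type FinishReason = "stop" | "tool_calls" | "length" | "error";

/**
 * Decide whether the agent loop should continue with tool execution.
 * LiteLLM may report finish_reason "stop" even when tool calls were
 * streamed, so the parts themselves take precedence.
 */
function shouldContinueWithTools(
  parts: MessagePart[],
  finishReason: FinishReason,
): boolean {
  const hasToolCalls = parts.some((p) => p.type === "tool-call");
  // Trust the parts over the reported finish reason.
  return hasToolCalls || finishReason === "tool_calls";
}

// Example: a Gemini response proxied through LiteLLM.
const parts: MessagePart[] = [
  { type: "text", text: "Let me check the file." },
  { type: "tool-call", toolName: "read", args: { path: "README.md" } },
];

// Without the parts check this would stop; with it the loop continues.
console.log(shouldContinueWithTools(parts, "stop")); // true
```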
```bash
git clone git@github.com:seilk/opencode.git opencode-custom
cd opencode-custom
./update-and-build.sh
```

This will:
- Fetch the latest tag from upstream
- Skip if already on the latest version
- Create a patched branch `fix-gemini-finish-reason-vX.X.X`
- Apply the patch
- Build and install to `~/.opencode/bin/`
- Auto-commit all changes to the new branch
Add to PATH:
```bash
export PATH="$HOME/.opencode/bin:$PATH"
```

Files:
- `update-and-build.sh` - Build script
- `patches/fix-gemini-finish-reason.patch` - The patch file
Branches:
- `main` - Scripts and patches only (this branch)
- `fix-gemini-finish-reason-vX.X.X` - Patched builds based on upstream tags