Description
Problem (one or two sentences)
Azure-hosted GPT-5 returns an OpenAI completion error: 400.
Context (who is affected and when)
Roo Code users using an Azure-hosted GPT-5 model through an OpenAI Compatible provider.
Reproduction steps
- Configure the VS Code extension with the OpenAI Compatible API provider
- Send a request in Roo Chat
Expected result
A normal GPT-5 response.
Actual result
OpenAI completion error: 400 litellm.UnsupportedParamsError: azure does not support parameters: ['tool_choice'], for model=responses/gpt-5-chat. To drop these, set litellm.drop_params=True or for proxy: litellm_settings: drop_params: true . If you want to use these params dynamically send allowed_openai_params=['tool_choice'] in your request.. Received Model Group=gpt-5-chat Available Model Group Fallbacks=None
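A minimal sketch of the workaround the error message itself suggests, assuming the request goes through LiteLLM's Python SDK rather than the proxy (the Azure deployment name and tool schema below are hypothetical, for illustration only):

```python
import litellm

# Per the error text, drop parameters the Azure endpoint rejects (e.g. tool_choice)
# instead of raising UnsupportedParamsError.
litellm.drop_params = True

# Assumes AZURE_API_KEY / AZURE_API_BASE / AZURE_API_VERSION are set in the environment.
response = litellm.completion(
    model="azure/gpt-5-chat",  # hypothetical deployment name
    messages=[{"role": "user", "content": "Hello"}],
    tools=[{
        "type": "function",
        "function": {"name": "read_file", "parameters": {"type": "object", "properties": {}}},
    }],
    tool_choice="auto",  # the parameter that currently triggers the 400
)
print(response.choices[0].message.content)
```

On a LiteLLM proxy deployment, the error text points to the equivalent setting `litellm_settings: drop_params: true` in the proxy config, or to sending `allowed_openai_params=['tool_choice']` with the request.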
Variations tried (optional)
Tried our GPT-4.1 model; it works fine there.
App Version
v3.39.3
API Provider (optional)
OpenAI Compatible
Model Used (optional)
GPT-5
Roo Code Task Links (optional)
No response
Relevant logs or errors (optional)