AnthropicLlm.generate_content_async passes system=None to Anthropic API, causing 400 during event compaction #5318

@PhilippeMoussalli

Description

Describe the Bug:
AnthropicLlm.generate_content_async passes system=None to the Anthropic API when no system instruction is set. The Anthropic API rejects None — it
expects a str or a list of content blocks. This causes event compaction to crash every time it fires when using a Claude model.

Steps to Reproduce:

  1. Configure an ADK app with a Claude model (via Vertex AI) and EventsCompactionConfig
  2. Run enough user invocations to trigger compaction (e.g. compaction_interval=20)
  3. When compaction fires, LlmEventSummarizer creates an LlmRequest with no system instruction (config.system_instruction defaults to None)
  4. AnthropicLlm.generate_content_async passes system=None directly to the Anthropic API → 400 Bad Request

Expected Behavior:
When system_instruction is None, the system parameter should be omitted from the Anthropic API call (use NOT_GIVEN), not passed as null.
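The distinction between "send null" and "omit the field" is the crux. A minimal sketch of the sentinel-omission pattern (the names `NOT_GIVEN` and `build_request` here are illustrative stand-ins, not the Anthropic SDK's real internals, though the SDK's own NOT_GIVEN sentinel works analogously):

```python
class _NotGiven:
    """Sentinel distinguishing 'omit this field' from 'send null'."""

    def __repr__(self):
        return "NOT_GIVEN"


NOT_GIVEN = _NotGiven()


def build_request(**kwargs):
    # Drop NOT_GIVEN values entirely; keep everything else, including None.
    return {k: v for k, v in kwargs.items() if not isinstance(v, _NotGiven)}


# system=None stays in the payload and serialises as JSON null -> rejected
assert build_request(model="m", system=None) == {"model": "m", "system": None}

# system=NOT_GIVEN is dropped before serialisation -> accepted
assert build_request(model="m", system=NOT_GIVEN) == {"model": "m"}
```

This is why passing the sentinel (rather than None) fixes the 400: the key never reaches the wire.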

Observed Behavior:
anthropic.BadRequestError: Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'system: Input should be a valid list'}}
Full traceback:
File ".../google/adk/apps/compaction.py", line 564, in _run_compaction_for_sliding_window
compaction_event = await config.summarizer.maybe_summarize_events(events=events_to_compact)
File ".../google/adk/apps/llm_event_summarizer.py", line 107, in maybe_summarize_events
async for llm_response in self._llm.generate_content_async(llm_request, stream=False):
File ".../google/adk/models/anthropic_llm.py", line 309, in generate_content_async
message = await self._anthropic_client.messages.create(
anthropic.BadRequestError: Error code: 400 - {'type': 'invalid_request_error', 'message': 'system: Input should be a valid list'}

Environment Details:

  • ADK Library Version: 1.26.0 (confirmed present in 1.30.0)
  • Desktop OS: macOS
  • Python Version: 3.13

Model Information:

  • Are you using LiteLLM: No
  • Which model: claude-sonnet-4-6 via Vertex AI (Claude subclass of AnthropicLlm)

Regression: N/A — unable to determine when introduced.

Logs: See stack trace above.

Minimal Reproduction Code:
from google.adk.apps import App, ResumabilityConfig
from google.adk.apps.app import EventsCompactionConfig
from google.adk.agents import LlmAgent

agent = LlmAgent(
    name="my_agent",
    model="claude-sonnet-4-6",
    instruction="You are a helpful assistant.",
)

app = App(
    name="my_app",
    root_agent=agent,
    resumability_config=ResumabilityConfig(is_resumable=True),
    events_compaction_config=EventsCompactionConfig(
        compaction_interval=5,  # low value to trigger quickly
        overlap_size=2,
    ),
)

Run 5+ user invocations; compaction fires and crashes with the 400 shown above.

Root cause: LlmEventSummarizer.maybe_summarize_events (llm_event_summarizer.py:102) creates an LlmRequest without setting config, so
config.system_instruction is None. AnthropicLlm.generate_content_async (anthropic_llm.py:311) passes this directly as system=None to the Anthropic SDK,
which serialises it as JSON null. The API rejects null because it expects str | list.
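The serialisation difference can be demonstrated with the standard json module (illustrative; the real SDK uses its own serialisation layer):

```python
import json

# A key present with value None serialises as JSON null, which the API
# rejects for `system`; omitting the key entirely is what the API expects.
with_null = json.dumps({"model": "claude", "system": None})
omitted = json.dumps({"model": "claude"})

assert with_null == '{"model": "claude", "system": null}'
assert "system" not in omitted
```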

Suggested fix:

anthropic_llm.py

from anthropic import NOT_GIVEN

system = llm_request.config.system_instruction
if system is None:
    system = NOT_GIVEN

message = await self._anthropic_client.messages.create(
    model=llm_request.model,
    system=system,
    ...
)

How often: Always (100%) — every time compaction fires with a Claude model.

Metadata

Labels: models [Component] Issues related to model support