
feat(providers/anthropic): add MessageSerializationCache to avoid O(N²) message serialization#25

Draft
johnstcn wants to merge 1 commit into main from cian/message-serialization-cache

Conversation

@johnstcn
Member

Adds MessageSerializationCache support to the Anthropic provider so callers can avoid O(N²) message serialization across agentic loop steps.

  • provider_options.go: MessageSerializationCache interface, MessageCacheKey constant, GetMessageCache() extractor
  • anthropic.go: prepareParams extracts cache from call.ProviderOptions; toPrompt gains cache parameter + blockHasCacheControl tracking; new applySerialisationCache() handles hit/miss/skip per output MessageParam via param.SetJSON
  • anthropic_test.go: existing toPrompt call sites updated to pass nil (no behavior change)

Consumer PR: coder/coder#24407
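
To make the shape of the change concrete, here is a minimal sketch of what a MessageSerializationCache interface and GetMessageCache() extractor could look like. The method set, the MessageCacheKey value, and the map-backed implementation are illustrative assumptions based on the bullet points above, not the PR's actual code:

```go
package main

import "fmt"

// MessageCacheKey is the ProviderOptions key under which callers pass the
// cache. The constant's exact value is an assumption for this sketch.
const MessageCacheKey = "anthropic:message_serialization_cache"

// MessageSerializationCache stores a message's serialized JSON so later
// agentic-loop steps can skip re-running MarshalJSON. A minimal Get/Set
// shape is assumed here; the real interface may differ.
type MessageSerializationCache interface {
	Get(key string) ([]byte, bool)
	Set(key string, data []byte)
}

// mapCache is a trivial in-memory implementation for illustration.
type mapCache struct{ m map[string][]byte }

func newMapCache() *mapCache { return &mapCache{m: make(map[string][]byte)} }

func (c *mapCache) Get(key string) ([]byte, bool) { d, ok := c.m[key]; return d, ok }
func (c *mapCache) Set(key string, data []byte)   { c.m[key] = data }

// GetMessageCache extracts the cache from a ProviderOptions-style map,
// returning nil when the key is absent or holds the wrong type.
func GetMessageCache(opts map[string]any) MessageSerializationCache {
	if opts == nil {
		return nil
	}
	if c, ok := opts[MessageCacheKey].(MessageSerializationCache); ok {
		return c
	}
	return nil
}

func main() {
	cache := newMapCache()
	opts := map[string]any{MessageCacheKey: cache}
	got := GetMessageCache(opts)
	got.Set("msg-0", []byte(`{"role":"user"}`))
	data, ok := got.Get("msg-0")
	fmt.Println(ok, string(data)) // prints: true {"role":"user"}
}
```

A caller would construct one cache per conversation and pass it through ProviderOptions on every step, so entries written on step N are visible on step N+1.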


feat(providers/anthropic): add MessageSerializationCache to avoid O(N²) message serialization

Each agentic loop step re-serializes the entire message history via
MarshalJSON. This adds a MessageSerializationCache interface that
callers can pass through ProviderOptions. When present, toPrompt
caches each output MessageParam's serialized JSON and applies
param.SetJSON on subsequent steps to short-circuit MarshalJSON.

Messages with cache_control breakpoints are excluded since their
annotations change between steps.
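
The hit/miss/skip decision described above can be sketched as follows. The stand-in messageParam type, the index-based key scheme, and the rawJSON field (mimicking the effect of param.SetJSON pinning pre-serialized bytes) are assumptions for illustration; the PR's actual code operates on anthropic.MessageParam and its own key derivation:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// messageParam stands in for anthropic.MessageParam. rawJSON mimics the
// effect of param.SetJSON: once set, marshal returns the pinned bytes and
// MarshalJSON is short-circuited.
type messageParam struct {
	Role    string `json:"role"`
	Content string `json:"content"`
	rawJSON []byte
}

func (m *messageParam) marshal() ([]byte, error) {
	if m.rawJSON != nil {
		return m.rawJSON, nil // cache hit path: no re-serialization
	}
	return json.Marshal(m)
}

// applySerialisationCache sketches the per-message hit/miss/skip logic:
//   - skip: messages with cache_control breakpoints, whose annotations
//     change between steps, are never cached
//   - hit:  bytes serialized on a previous step are pinned via rawJSON
//   - miss: serialize once and remember for the next loop step
func applySerialisationCache(cache map[string][]byte, msgs []*messageParam, hasCacheControl []bool) {
	for i, m := range msgs {
		if hasCacheControl[i] {
			continue // skip: annotations are step-dependent
		}
		key := fmt.Sprintf("msg-%d", i) // illustrative key; real keying unknown
		if data, ok := cache[key]; ok {
			m.rawJSON = data // hit: reuse last step's bytes
			continue
		}
		if data, err := json.Marshal(m); err == nil {
			cache[key] = data // miss: serialize and store
		}
	}
}

func main() {
	cache := map[string][]byte{}
	msgs := []*messageParam{
		{Role: "user", Content: "hi"},
		{Role: "assistant", Content: "hello"},
	}
	applySerialisationCache(cache, msgs, []bool{false, true}) // step 1: miss + skip
	applySerialisationCache(cache, msgs, []bool{false, true}) // step 2: hit + skip
	fmt.Println(len(cache), msgs[0].rawJSON != nil, msgs[1].rawJSON != nil) // prints: 1 true false
}
```

Because only the unchanged prefix of the history is reused, each step serializes only its new messages, turning the O(N²) total serialization work across N steps into O(N).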
@johnstcn johnstcn self-assigned this Apr 15, 2026
