feat(anthropic): add support for prompt caching with fix #4100


Closed

Conversation

@Claudio-code (Contributor) commented Aug 10, 2025

Claudio-code and others added 6 commits (April 8, 2025, 11:35)
Implements Anthropic's prompt caching feature to improve token efficiency.

- Adds cache control support in AnthropicApi and AnthropicChatModel
- Creates AnthropicCacheType enum with EPHEMERAL cache type
- Extends AbstractMessage and UserMessage to support cache parameters
- Updates Usage tracking to include cache-related token metrics
- Adds an integration test to verify prompt caching functionality

This implementation follows Anthropic's prompt caching API (beta, prompt-caching-2024-07-31), which allows
for more efficient token usage by caching frequently reused prompt prefixes. A sketch of the intended surface is shown below.
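
For context, here is a minimal, self-contained sketch of what the AnthropicCacheType enum and the cache_control block it produces might look like. Only the enum name and the EPHEMERAL constant come from the bullet list above; the cacheControl() method and the example content block are hypothetical names used for illustration, not the code from this PR.

```java
import java.util.Map;

// Hypothetical sketch of the AnthropicCacheType enum described in this PR;
// names and structure are assumptions, not the merged implementation.
public enum AnthropicCacheType {

	EPHEMERAL("ephemeral");

	private final String type;

	AnthropicCacheType(String type) {
		this.type = type;
	}

	// Renders the cache_control block that Anthropic's Messages API expects
	// on a cacheable content block: {"type": "ephemeral"}.
	public Map<String, String> cacheControl() {
		return Map.of("type", this.type);
	}

	public static void main(String[] args) {
		// Example content block for a long, reusable system prompt.
		Map<String, Object> contentBlock = Map.of(
				"type", "text",
				"text", "You are a helpful assistant. <long reusable instructions>",
				"cache_control", AnthropicCacheType.EPHEMERAL.cacheControl());
		System.out.println(contentBlock);
	}
}
```

On the wire, requests that mark content blocks this way also need Anthropic's beta header (anthropic-beta: prompt-caching-2024-07-31), and the cache-related token metrics mentioned above presumably map to the cache_creation_input_tokens and cache_read_input_tokens fields of the API's usage response.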
@sobychacko (Contributor)

Could you take care of the DCO signing on the commit? https://spring.io/blog/2025/01/06/hello-dco-goodbye-cla-simplifying-contributions-to-spring. Thanks!

@sobychacko (Contributor)

@Claudio-code Checking in again - could you please update your commit with DCO signing?
