Looking at changelogs for recent LibreChat versions, I've seen that prompt caching has been implemented when calling Anthropic models directly via their API (and I can see there's a toggle for it in the UI). I was wondering if it's also enabled when calling e.g. Sonnet 4 via Bedrock? This AWS docs page implies that it may be on by default if LibreChat calls the InvokeModel API — though I'm not sure whether LibreChat uses Converse or InvokeModel: https://docs.aws.amazon.com/bedrock/latest/userguide/prompt-caching.html
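For context on the Converse side: as I understand the AWS docs, prompt caching with Converse is expressed by adding a `cachePoint` content block after the part of the prompt you want cached (so it isn't simply "on by default" — the caller has to place the checkpoint). A minimal sketch of what such a request body could look like — the model ID and prompt text are placeholders, and this just builds the payload rather than calling Bedrock:

```python
import json

def build_converse_request(system_prompt: str, user_message: str) -> dict:
    """Build a Converse-style request body with a cache checkpoint
    placed after the large, reusable system prompt."""
    return {
        "modelId": "anthropic.claude-sonnet-4-20250514-v1:0",  # placeholder ID
        "system": [
            {"text": system_prompt},
            # Marks everything above this block as cacheable
            {"cachePoint": {"type": "default"}},
        ],
        "messages": [
            {"role": "user", "content": [{"text": user_message}]},
        ],
    }

request = build_converse_request("You are a helpful assistant...", "Hello!")
print(json.dumps(request["system"], indent=2))
```

With InvokeModel and the native Anthropic request format, the equivalent would be Anthropic's `cache_control` markers instead — which is why it matters which API LibreChat uses under the hood.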
There's currently an open PR for that: #8271