Replies: 1 comment 1 reply
Hi @TIAcode,
Great plugin!
I think a lot of users of this plugin tend to have quite a lot of context and often ask follow-up questions, so some savings could be had from using Prompt Caching (cached tokens cost something like 10% of the normal input price). It also looks like ephemeral caching doesn't really need much, if any, state saved client-side (I could be wrong). Obviously it would be nice if big context prompts could be cached across sessions too, but I'm not really sure how that would work UI-wise.