b4bf73136a49805acead73cfa171f7dc6f3bc0ce
Add an optional CacheHints field to provider.Request

Carries cache-breakpoint placement directives from the public llm package down to individual provider implementations. Anthropic will consume these in a follow-up commit; OpenAI and Google ignore them.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Description
An abstraction-layer interface over various similar LLM services.

Languages
Go 100%