AI Codex

Prompt Caching

Also: context caching

Prompt caching saves a portion of your prompt — typically a large system prompt or document — so Claude doesn't need to re-process it on every request. The first call includes the full context; subsequent calls reference the cached version. The practical effect: faster responses and lower API costs when you're repeatedly sending the same long context, which is common in production applications.
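A minimal sketch of how a request using prompt caching can be structured, assuming the Anthropic Messages API convention of marking a system block with `cache_control` so the prefix up to that point is cached. The model name and document text are placeholders, and the request is only built here (not sent), to show that the cacheable prefix is byte-identical across calls:

```python
# Stand-in for a long reference document that would be expensive
# to re-process on every request.
LARGE_DOCUMENT = "Reference manual text... " * 1000

def build_request(user_question: str) -> dict:
    """Build a Messages API request body. The system block is identical
    on every call; marking it with cache_control lets the API cache it,
    so only the first request pays to process the full document."""
    return {
        "model": "claude-sonnet-4-5",  # placeholder model name
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": LARGE_DOCUMENT,
                # Marks this prefix as cacheable on the server side.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": user_question}],
    }

# Two different questions share the same cacheable prefix:
first = build_request("Summarize the introduction.")
second = build_request("What are the rate limits?")
assert first["system"] == second["system"]  # identical prefix -> cache hit
```

The key design point is that only the unchanging prefix (here, the system block) is cached; the per-request user message varies freely without invalidating the cache.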
