Context Compaction

How growing conversations are compressed to stay within context limits

When a conversation outgrows the context window, Claude Code runs a multi-stage compaction pipeline: proactively preserving important information, pruning old tool results, and generating structured summaries that retain user intent, decisions, and progress while dramatically reducing token count.

6 steps · Tags: context, compaction, long-sessions
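The overall pipeline can be sketched roughly as follows. Everything here is an illustrative assumption, not Claude Code's actual implementation: the function names, the token limit, and the 4-characters-per-token estimate are all hypothetical.

```python
# Hypothetical sketch of a multi-stage compaction pipeline.
# Stage 1 is cheap pruning of old tool results; stage 2 is a full
# LLM-generated summary (stubbed out here).

def estimate_tokens(messages):
    # Crude estimate: roughly 4 characters per token.
    return sum(len(m["content"]) for m in messages) // 4

def prune_tool_results(messages, keep_recent=5):
    """Replace tool results outside the recent window with placeholders."""
    cutoff = len(messages) - keep_recent
    return [
        {**m, "content": "[old tool result cleared]"}
        if m["role"] == "tool" and i < cutoff else m
        for i, m in enumerate(messages)
    ]

def summarize(messages):
    # In a real system this would be an LLM call; here, a stub.
    return [{"role": "user", "content": "[summary of prior conversation]"}]

def compact(messages, limit=100_000):
    """Run pruning, then summarization, until the conversation fits."""
    if estimate_tokens(messages) <= limit:
        return messages                      # nothing to do
    messages = prune_tool_results(messages)  # stage 1: cheap pruning
    if estimate_tokens(messages) <= limit:
        return messages
    return summarize(messages)               # stage 2: full summary
```

The key design point is ordering: cheap, lossless-ish pruning runs first, and the expensive lossy summary is only generated when pruning alone cannot get under the limit.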

Step-by-step breakdown

1. ⚙️ Preserve key info early – Summarize Tool Results Reminder

Before tool results are pruned, the model is instructed to proactively write down critical information, preventing knowledge loss when older results are cleared.

Prompt excerpt: "write down any important information you might need later in your response"

Techniques: behavioral-constraints · meta-prompting · context-injection
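A minimal sketch of this kind of context injection: when usage nears the pruning threshold, append a system reminder so the model records what it needs before old results disappear. The threshold, message shape, and reminder wording are assumptions for illustration.

```python
# Hypothetical reminder injection ahead of tool-result pruning.
REMINDER = (
    "Older tool results will soon be cleared from context. "
    "Write down any important information you might need later "
    "in your response."
)

def inject_preserve_reminder(messages, used_fraction, threshold=0.8):
    """Append a system reminder once context usage nears the pruning threshold."""
    if used_fraction >= threshold:
        return messages + [{"role": "system", "content": REMINDER}]
    return messages
```

Injecting the instruction as a just-in-time system message (rather than baking it into the base prompt) means the model only spends effort on preservation when pruning is actually imminent.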
2. ⚙️ Automatic result pruning – Function Result Clearing (FRC)
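Function Result Clearing might look something like the sketch below: old tool results are replaced with a short placeholder while the surrounding messages (and the most recent results) stay intact. The placeholder text and `keep_recent` window are illustrative assumptions.

```python
# Hypothetical FRC pass: clear all but the most recent tool results.
CLEARED = "[Function results cleared to save context]"

def clear_function_results(messages, keep_recent=3):
    """Swap older tool-result contents for a placeholder, keeping roles/order."""
    tool_idxs = [i for i, m in enumerate(messages) if m["role"] == "tool"]
    # Guard keep_recent=0: tool_idxs[:-0] would be empty, not "all".
    to_clear = set(tool_idxs[:-keep_recent]) if keep_recent else set(tool_idxs)
    return [
        {**m, "content": CLEARED} if i in to_clear else m
        for i, m in enumerate(messages)
    ]
```

Because only the `content` field is rewritten, the conversation's turn structure is preserved and tool-call/result pairing stays valid.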
3. 🔌 Full conversation summary – Compaction: Base Prompt
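A full-conversation summary is generated by sending the transcript to the model with a compaction prompt. The sketch below shows one plausible way to assemble that request; the prompt wording and `<conversation>` wrapper are assumptions, not the actual base prompt.

```python
# Hypothetical assembly of a full-summary compaction request.
BASE_PROMPT = (
    "Summarize the conversation below. Preserve the user's goals, "
    "key decisions made, and current progress. Be concise."
)

def build_compaction_request(messages):
    """Flatten the transcript and wrap it with the summarization prompt."""
    transcript = "\n".join(f"{m['role']}: {m['content']}" for m in messages)
    content = f"{BASE_PROMPT}\n\n<conversation>\n{transcript}\n</conversation>"
    # A single-turn request: the whole history becomes data, not context.
    return [{"role": "user", "content": content}]
```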
4. 🔌 Structured analysis phase – Compaction: Detailed Analysis (Base)
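A structured analysis phase typically forces the summary into fixed headings so nothing important is silently dropped. The section names below are hypothetical, chosen to match what the intro says a summary must retain (intent, decisions, progress).

```python
# Hypothetical structured-analysis prompt builder.
SECTIONS = [
    "User intent",
    "Key decisions",
    "Current state",
    "Next steps",
]

def analysis_prompt(sections=SECTIONS):
    """Ask for the analysis under fixed headings before the final summary."""
    headings = "\n".join(f"## {s}" for s in sections)
    return (
        "Before writing the final summary, analyze the conversation "
        "under each of these headings:\n" + headings
    )
```

Fixed headings make the output machine-checkable: a validator can verify every section is present before the summary replaces the original history.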
5. 🔌 Incremental partial compaction – Compaction: Partial Window Prompt
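Partial compaction summarizes only the older portion of the conversation while keeping a recent window verbatim. A sketch under assumed names (the window size and summary stub are illustrative):

```python
# Hypothetical partial-window compaction: summarize only the old prefix.
def partial_compact(messages, window=10):
    """Replace messages older than the recent window with one summary message."""
    if len(messages) <= window:
        return messages                    # nothing old enough to compact
    old, recent = messages[:-window], messages[-window:]
    # In a real system the summary comes from an LLM call over `old`.
    summary = {"role": "user",
               "content": f"[summary of {len(old)} earlier messages]"}
    return [summary] + recent
```

Keeping the recent window verbatim is what makes this incremental: the model's short-term working context is untouched, and only stale history pays the lossy-summary cost.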
6. 🔌 Text-only constraint – Compaction: No-Tools Preamble
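The no-tools constraint can be enforced in two complementary ways: a preamble telling the model to answer in plain text, and an API-level request that offers no tools. Both the preamble wording and the request shape below are assumptions for illustration.

```python
# Hypothetical text-only summarization request.
NO_TOOLS_PREAMBLE = (
    "Respond with plain text only. Do not call any tools in this turn; "
    "produce the summary directly as your message."
)

def compaction_request(summary_prompt):
    """Assemble the summarization call with tools disabled at both levels."""
    return {
        "messages": [{
            "role": "user",
            "content": f"{NO_TOOLS_PREAMBLE}\n\n{summary_prompt}",
        }],
        "tools": [],  # offer no tools, matching the prompt-level constraint
    }
```

Belt-and-suspenders enforcement matters here: a tool call emitted mid-summary would produce a truncated or malformed summary, so the request removes the option entirely rather than relying on the instruction alone.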