Long Context Summarization
I'm providing a {{document_type}} that is approximately {{page_count}} pages long. Summarize it at three levels of detail:

**Level 1 — Executive Summary (3 sentences max)**
The single most important takeaway, who should care, and what action to take.

**Level 2 — Section-by-Section (one paragraph each)**
For each major section or chapter, provide: key argument, supporting evidence, and how it connects to the overall thesis.

**Level 3 — Detailed Notes**
Bullet points of every specific claim, data point, quote, or recommendation worth remembering. Tag each bullet as [Fact], [Claim], [Quote], or [Recommendation].

Also flag:
- Any contradictions within the document
- Claims that lack supporting evidence
- The strongest and weakest arguments

Maintain the document's original terminology — do not paraphrase technical terms.
Variables to customize

- `{{document_type}}`: the kind of document being summarized (e.g. report, research paper, contract)
- `{{page_count}}`: the approximate length of the document in pages
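Filling the placeholders is plain string substitution; a minimal Python sketch (the template text is abbreviated here, and `build_prompt` is a hypothetical helper name):

```python
# Abbreviated copy of the prompt; the full text uses the same
# double-brace placeholder style.
PROMPT_TEMPLATE = (
    "I'm providing a {{document_type}} that is approximately "
    "{{page_count}} pages long. Summarize it at three levels of detail: "
    "..."  # remainder of the prompt omitted for brevity
)

def build_prompt(document_type: str, page_count: int) -> str:
    """Substitute the double-brace placeholders before sending the prompt."""
    return (
        PROMPT_TEMPLATE
        .replace("{{document_type}}", document_type)
        .replace("{{page_count}}", str(page_count))
    )

prompt = build_prompt("research paper", 40)
```

The resulting string can then be passed to any long-context model call.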
Why this prompt works
Leverages Gemini's 1M+ token context window for full-document analysis. The three-tier structure lets you skim or deep-dive. Tagging claims vs facts aids critical reading.
Related prompts
- Forcing the agent to plan before acting prevents premature execution and wasted steps. Explicit dependency mapping enables parallel execution and catches logical gaps early.
- **Tool Selection Agent**: The ReAct pattern (Reason + Act) creates an explicit reasoning trace that improves tool selection accuracy. The error-handling rule prevents infinite retry loops.
- **Prompt Compressor**: Explicitly requiring all functional requirements to be preserved prevents the model from over-compressing and losing critical instructions.
- **Memory Management Agent**: Explicit memory read/write instructions create agents that improve over time. Categorization keeps memories organized, and the deduplication rule prevents context bloat.