Data Extraction
Extract the key details from each email and return them in a consistent format.
Example 1:
Email: "Hi, I'd like to book a meeting room for 6 people on March 15th from 2-4pm. We need a projector. Thanks, Sarah"
Extracted:
- Requester: Sarah
- Date: March 15
- Time: 2:00 PM - 4:00 PM
- Attendees: 6
- Requirements: Projector
Example 2:
Email: "Can you reserve the large conference room next Tuesday morning for our quarterly review? About 12 people. Video conferencing setup needed. - Mike"
Extracted:
- Requester: Mike
- Date: Next Tuesday
- Time: Morning
- Attendees: 12
- Requirements: Video conferencing
Now extract from this email:
Email: "{{email_text}}"
Extracted:

Variables to customize
- {{email_text}}: the raw email message to extract details from
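To use this prompt programmatically, you substitute the email into the `{{email_text}}` slot and parse the model's `- Field: Value` lines back into structured data. The sketch below is a minimal, model-agnostic example; the helper names (`fill_template`, `parse_extracted`) are illustrative, and sending the filled prompt to an LLM API is left to you.

```python
import re

# Abbreviated version of the prompt above; in practice include both few-shot examples.
TEMPLATE = '''Extract the key details from each email and return them in a consistent format.

Now extract from this email:
Email: "{{email_text}}"
Extracted:'''

def fill_template(template: str, email_text: str) -> str:
    """Substitute the raw email into the {{email_text}} variable."""
    return template.replace("{{email_text}}", email_text)

def parse_extracted(response: str) -> dict:
    """Parse the model's '- Field: Value' lines into a dict."""
    fields = {}
    for line in response.splitlines():
        m = re.match(r"-\s*([^:]+):\s*(.+)", line.strip())
        if m:
            fields[m.group(1).strip()] = m.group(2).strip()
    return fields
```

Because the few-shot examples fix the output format, a simple line-based parser like this is usually sufficient; for stricter guarantees you could instead ask the model for JSON.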
Why this prompt works
The examples teach the model which fields to extract and how to normalize unstructured data into a consistent format, even when the source emails are written very differently.
Related prompts
- Requesting confidence and key phrases forces the model to justify its classification rather than guessing. The structured output format works zero-shot because sentiment analysis is well-understood by LLMs.
- Key Information Extraction: Listing the exact fields to extract removes guesswork. The 'Not specified' instruction prevents hallucination when information is missing -- a common failure mode without this guardrail.
- Question Answering with Source: Grounding the answer in source material and instructing the model to refuse when information is missing dramatically reduces hallucination -- the biggest risk in zero-shot Q&A.
- Math Word Problem Reasoning: Explicit numbered steps force the model to decompose the problem rather than guessing. The verification step catches arithmetic errors before the final answer.