Translation with Context
Translate the following text from {{source_language}} to {{target_language}}.

Context: This text is from {{context}} and should use {{formality}} language appropriate for {{audience}}.

Text: {{text}}

Translation:
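The template above can be filled programmatically before sending it to a model. A minimal sketch follows; the `render` helper and the example values are illustrative assumptions, not part of the original prompt.

```python
# Sketch of filling the translation prompt's {{placeholder}} variables.
# The template text mirrors the prompt above; render() is a hypothetical
# helper, and the example values are made up for illustration.

TEMPLATE = (
    "Translate the following text from {{source_language}} to {{target_language}}.\n\n"
    "Context: This text is from {{context}} and should use {{formality}} "
    "language appropriate for {{audience}}.\n\n"
    "Text: {{text}}\n\n"
    "Translation:"
)

def render(template: str, values: dict) -> str:
    """Substitute each {{name}} placeholder with its value."""
    for name, value in values.items():
        template = template.replace("{{" + name + "}}", value)
    return template

prompt = render(TEMPLATE, {
    "source_language": "English",
    "target_language": "German",
    "context": "a patient information leaflet",
    "formality": "formal",
    "audience": "patients and caregivers",
    "text": "Take one tablet twice daily with food.",
})
print(prompt)
```

A templating library such as `string.Template` or Jinja2 would serve the same purpose; plain string replacement is enough for six fixed placeholders.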
Variables to customize

- {{source_language}}: the language of the source text
- {{target_language}}: the language to translate into
- {{context}}: where the text comes from, e.g. a medical document or a marketing email
- {{formality}}: the desired formality level, e.g. formal or casual
- {{audience}}: who will read the translation
- {{text}}: the text to translate
Why this prompt works
Adding context, formality level, and audience prevents the model from choosing the wrong register. A medical document needs different vocabulary than a marketing email, even in the same language pair.