Open-Source Model Optimization Prompt
I'm running DeepSeek {{modelVariant}} locally with the following setup:

**Infrastructure:**
- Hardware: {{hardware}}
- Quantization: {{quantization}}
- Context length: {{contextLength}}
- Inference framework: {{inferenceFramework}}

**Problem:** {{performanceProblem}}

**Current configuration:** {{currentConfig}}

Analyze my setup and recommend optimizations:

1. **Memory optimization** — How to reduce VRAM usage without significant quality loss
2. **Speed optimization** — Inference latency improvements
3. **Quality optimization** — Sampling parameters for my use case: {{useCase}}
4. **Configuration changes** — Specific settings to adjust, with the expected before/after impact

Provide exact config values I can copy-paste, not general advice.
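To make the structure concrete, here is how the opening of the template might read once the placeholders are filled in. The model variant, hardware, and framework below are hypothetical examples, not recommendations.

```
I'm running DeepSeek R1-Distill-Qwen-32B locally with the following setup:

**Infrastructure:**
- Hardware: 1x RTX 4090 (24 GB VRAM), 64 GB RAM
- Quantization: Q4_K_M (GGUF)
- Context length: 16384
- Inference framework: llama.cpp

**Problem:** Generation speed drops below 10 tokens/s once the context passes roughly 8k tokens.
```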
Variables to customize
- {{modelVariant}}: the DeepSeek variant you are running (e.g., V3, R1, or a distilled model)
- {{hardware}}: GPU/CPU and available VRAM/RAM
- {{quantization}}: quantization format and bit width (e.g., Q4_K_M, AWQ, FP8)
- {{contextLength}}: the context window you have configured
- {{inferenceFramework}}: the serving stack (e.g., llama.cpp, vLLM, Ollama)
- {{performanceProblem}}: the symptom you want fixed
- {{currentConfig}}: your current settings, pasted verbatim
- {{useCase}}: what you primarily use the model for
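If you drive your local setup from scripts, a small helper can substitute these placeholders before the prompt is sent to your inference endpoint. This is a minimal sketch: the file name deepseek_optimization_prompt.txt and all the values are hypothetical examples to be replaced with your own.

```python
from pathlib import Path

# Load the prompt template text (the file name is an assumption; store it wherever you like).
template = Path("deepseek_optimization_prompt.txt").read_text(encoding="utf-8")

# Hypothetical example values; replace them with your actual setup.
values = {
    "modelVariant": "R1-Distill-Qwen-32B",
    "hardware": "1x RTX 4090 (24 GB VRAM), 64 GB RAM",
    "quantization": "Q4_K_M (GGUF)",
    "contextLength": "16384",
    "inferenceFramework": "llama.cpp",
    "performanceProblem": "Generation drops below 10 tokens/s past ~8k tokens of context.",
    "currentConfig": "n_gpu_layers=999, n_ctx=16384, n_batch=512",
    "useCase": "code review and refactoring suggestions",
}

# Substitute each {{placeholder}} with its value.
prompt = template
for key, value in values.items():
    prompt = prompt.replace("{{" + key + "}}", value)

print(prompt)
```

The resulting string can then be passed to whatever chat endpoint your inference framework exposes.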
Why this prompt works
Because DeepSeek is open source, many users run it locally. The prompt works well because DeepSeek is familiar with its own architecture, and asking for copy-paste config values rather than general advice produces output that is actionable for your specific hardware setup.