How to Talk to AI (and Actually Get What You Want)

3 min read
August 18, 2025

Every time you type something into ChatGPT, Claude, or LLaMA, you’re basically giving instructions to an extremely smart intern. The trick is simple: how you ask decides what you get.

That’s where prompting styles come in. Think of them as different ways to guide AI so it behaves the way you need. Use the right one, and suddenly the AI shifts from “generic answers” to “useful, sharp, and tailored responses.”

Here’s a breakdown of the most important styles, with examples you can apply right away.

Prompting Formats

Different models sometimes expect different wrappers around your input. These are like templates for instructions.

Alpaca Style

plain text
### Instruction:
Summarize Harry Potter in one sentence.

### Response:
A boy wizard discovers magic, friendship, and battles evil.

ChatML (used by OpenAI)

plain text
<|im_start|>user
Explain Kubernetes like I’m 12.
<|im_end|>
<|im_start|>assistant
Imagine you have lots of toy boats...
<|im_end|>

INST Format (LLaMA-style)

plain text
[INST] Write a motivational gym quote [/INST]

Why this matters: these formats are great for clarity and consistency, especially when building applications on top of large language models.
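
If you assemble prompts in code, these wrappers are nothing more than string templates. Here is a minimal sketch for the Alpaca format shown above; it only builds the string, and the actual model call is left out on purpose:

python
def to_alpaca(instruction: str) -> str:
    """Wrap a plain instruction in the Alpaca-style prompt format shown above."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

print(to_alpaca("Summarize Harry Potter in one sentence."))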

Zero-Shot Prompting: Just Ask

Direct question, no examples.

plain text
Explain recursion in one sentence.

AI: “Recursion is when a function calls itself until it stops.”

Fast and straightforward. But for complex tasks, the answers can be vague.

Few-Shot Prompting: Show, Don’t Tell

Give examples first, then ask for the next case.

plain text
Translate English to Pirate:
Hello → Ahoy
Friend → Matey

Now translate: Goodbye →

AI: “Farewell, ye scallywag.”

Best when you want structured outputs — translations, regex, SQL queries, or standard templates.
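
When you generate few-shot prompts programmatically, the pattern is simply "worked examples first, new case last." A minimal sketch using the pirate example above (again, prompt construction only; the model call is out of scope):

python
def few_shot_prompt(task: str, examples: list[tuple[str, str]], query: str) -> str:
    """Assemble a task line, worked examples, and the new case to complete."""
    lines = [task]
    lines += [f"{src} → {dst}" for src, dst in examples]
    lines.append(f"\nNow translate: {query} →")
    return "\n".join(lines)

print(few_shot_prompt(
    "Translate English to Pirate:",
    [("Hello", "Ahoy"), ("Friend", "Matey")],
    "Goodbye",
))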

Chain-of-Thought (CoT): Think Step by Step

Sometimes you need reasoning, not just an answer. Adding “think step by step” makes a huge difference.

Example (math):

plain text
Q: If I buy 3 pens at $2 each and pay with $10, how much change do I get?
Think step by step.

AI:

  • 3 pens Ă— $2 = $6
  • Paid $10
  • Change = $4

Answer: $4

A trick-question example:

plain text
Q: Emily’s mom has 3 daughters. The first is April, the second is May. What is the third daughter’s name?

Zero-shot AI answer: “June.”

Now with CoT:

  • The mother has 3 daughters.
  • Two names are April and May.
  • The question asks “What is the third daughter’s name?”
  • The trick is in the wording: it already says “Emily’s mom.”

Correct answer: Emily.

This is where CoT shines — it forces the AI to “reason through the trap” instead of jumping to the obvious but wrong guess.
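
In code, the change can be as small as appending the cue to the question. A minimal sketch below; extracting the final answer is left out, since it depends on how your model formats its reply:

python
def with_cot(question: str) -> str:
    """Append the chain-of-thought cue to a plain question."""
    return (
        f"{question}\n"
        "Think step by step, then give the final answer on its own line."
    )

print(with_cot("If I buy 3 pens at $2 each and pay with $10, how much change do I get?"))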

Self-Consistency: Multiple Paths, One Answer

Ask the AI to explore several answers, then compare them and pick the one that holds up.

plain text
Solve this riddle. Give 3 possible answers and then decide which is most likely.

This reduces errors because a slip in any single line of reasoning tends to get outvoted when the answers are compared.
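
Production setups often push this further: sample several completions at a nonzero temperature and keep the majority answer. Here is a minimal sketch of just the voting step, with hard-coded strings standing in for the sampled replies:

python
from collections import Counter

def majority_vote(answers: list[str]) -> str:
    """Keep the answer that appears most often across several sampled attempts."""
    return Counter(answers).most_common(1)[0][0]

# In practice these strings would come from several model calls at temperature > 0.
sampled = ["$4", "$4", "$6", "$4", "$4"]
print(majority_vote(sampled))  # -> "$4"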

Persona Prompting: Assign a Role

Give the AI a perspective.

plain text
You are a senior developer who dislikes meetings. Explain daily standups.

AI: “Standups are fifteen minutes where everyone repeats yesterday’s work while secretly planning lunch.”

Perfect for creative writing, roleplay, or tailoring tone.
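
When you call a chat-style API instead of typing into a chat box, the persona usually goes into a system message. A minimal sketch of the message list; the client call itself is left out because it varies by provider:

python
# The "system" message carries the persona; the "user" message carries the task.
# Pass this list to whichever chat client you use.
persona_messages = [
    {"role": "system", "content": "You are a senior developer who dislikes meetings."},
    {"role": "user", "content": "Explain daily standups."},
]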

Self-Reflection: Review Its Own Work

Have the AI refine or correct itself.

plain text
Explain quantum computing simply.
Now review your answer and make it even simpler for a beginner.

This is surprisingly effective for code debugging, refining writing, or simplifying technical content.
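
In code, self-reflection is just a second pass over the first answer. A minimal sketch, where call_model is a placeholder for whichever completion client you use:

python
def call_model(prompt: str) -> str:
    """Placeholder: swap in your provider's completion call."""
    raise NotImplementedError

def explain_then_simplify(topic: str) -> str:
    """First pass drafts an explanation; second pass reviews and simplifies it."""
    draft = call_model(f"Explain {topic} simply.")
    return call_model(
        "Review the explanation below and rewrite it so a complete beginner can follow it:\n\n"
        + draft
    )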

GIGO: Garbage In, Garbage Out

All of these techniques rely on one golden rule: if your prompt is sloppy, your answer will be sloppy.

Example of bad prompt:

plain text
Tell me about history.

AI: (overwhelmed, vague, generic)

Example of good prompt:

plain text
In 3 bullet points, explain the major impacts of the printing press on European society.

Good prompts are clear, specific, and scoped. If you put in garbage (unclear, broad, or contradictory instructions), you’ll get garbage out. Prompting is not magic — it’s structured communication.

Where You See These in the Real World

You might think prompting is just for hobbyists, but many modern AI products are built on these same techniques:

  • Vercel v0 – Uses few-shot and persona prompting to generate production-ready UI components in React, Next.js, and Tailwind.
  • Notion AI – Uses zero-shot and chain-of-thought for summarization, writing assistance, and brainstorming.
  • GitHub Copilot – Relies heavily on few-shot prompting (examples of code → next line of code).
  • Perplexity AI – Uses chain-of-thought and self-consistency to provide step-by-step answers with sources.
  • Jasper AI – Uses persona prompting to write in brand voice for marketing content.

These aren’t just academic tricks. They are the foundation of real-world AI systems millions of people use every day.

Wrapping Up

Prompting is about choosing the right style for the right problem:

  • Zero-shot for quick answers.
  • Few-shot for patterns.
  • Chain-of-thought for reasoning.
  • Self-consistency for reliability.
  • Persona for tone and perspective.
  • Self-reflection for accuracy.
  • And remember GIGO: unclear prompts = unclear answers.

And when the model supports specific formats (Alpaca, ChatML, INST), use them to add structure and context.

Prompting well doesn’t mean writing long, complicated instructions. It means asking with clarity and purpose. Do that, and AI stops being a random text generator and starts becoming a reliable teammate.