How to Use AI for Cold Email (Most People Have It Backwards)

AI cold emails fail because reps use AI to write, not to research. The real leverage is using AI for prospect intelligence first, then letting that drive the email.

In 1958, David Ogilvy sat down to write an advertisement for Rolls-Royce. He had been given the job weeks earlier. He spent most of those weeks reading Rolls-Royce engineering reports, not writing.

When he finally sat down to draft, the headline came from a single sentence buried in a technical document: at 60 miles an hour, the loudest noise in this new Rolls-Royce comes from the electric clock. That headline has been cited for decades. It remains one of the most effective pieces of automotive advertising ever written.

Ogilvy’s rule was simple. “The more you know about the product,” he wrote, “the more likely you are to come up with a big idea for selling it.” He spent ten times as long researching as writing. That ratio, he believed, was why most advertising failed. Not because the writers were bad. Because they started writing too soon.

The same mistake is being made right now by thousands of sales reps using AI to write cold emails.

The wrong question

The most common AI cold email workflow looks like this: open ChatGPT, type “write me a cold email to [prospect role] at [company type] about [product],” review the output, maybe tweak a sentence, send. The email is grammatically correct. It mentions pain points. It has a call to action. It sounds exactly like every other AI-generated cold email in the prospect’s inbox — which is to say, it sounds like nothing.

What’s interesting is that this isn’t a problem with AI. It’s a problem with where in the process AI gets asked to help.

Most reps use AI to do the part of cold email they find tedious: writing. But writing is not actually the hard part of effective cold email. Research is. Identifying a specific reason this particular person might care about what you’re selling — that’s where the work is. That’s the part most reps skip. That’s the part they’re still skipping when they use AI.

Ogilvy’s insight applies directly: the quality of the output is bounded by the quality of the input. A well-researched prompt produces a useful draft. A generic prompt produces a generic email. AI is a transformer, not an oracle. It can only work with what you give it.

Why personalization is the signal

There’s a reason personalization in cold email works that goes deeper than tactics.

When a cold email references something specific — a talk you gave last month, a job posting that signals a strategic shift, a recent customer win mentioned on your company LinkedIn — the reader’s brain registers it as evidence of effort. Effort signals interest. Interest signals that you might actually have something relevant to say. The reader gives you the next sentence.

This is not a trick. It’s how human attention works. We ignore messages that feel broadcast. We engage with messages that feel addressed to us specifically.

Joe Girard understood this before email existed. He sold 13,001 cars between 1963 and 1978, a feat Guinness World Records recognized as the record for retail sales. His system was methodical: index cards with every customer’s name, birthday, family details, car history. He sent personalized cards every month. Not form letters with names swapped. Cards that referenced specific things about specific people. Every month. To thousands of customers simultaneously.

What Girard automated was the logistics — the mailing system, the card printing, the tracking. What he never automated was the signal: that he knew who you were.

That’s the distinction that matters. Automate logistics. Never automate the signal.

Most AI cold email is automating the signal. That’s why it doesn’t work.

The research-first workflow

The workflow that actually produces results is the inverse of what most reps do.

AI for research first. Human judgment for the opening. AI assistance for the rest.

Here’s what that looks like in practice.

Step 1: Use AI to surface personalization hooks.

Before you write anything, give AI this prompt:

“I’m going to send a cold email to [name], [title] at [company]. Based on the following information — [paste their LinkedIn headline, a recent post or comment, any company news, recent job postings] — what is one specific challenge or priority this person is likely focused on right now that someone selling [your product/service] might address?”

This prompt takes 90 seconds to run. The output gives you the research insight Ogilvy spent weeks finding. It won’t always be right. But it gives you a specific thread to pull on, which is more than most reps have when they start writing.
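If you run this step for more than a handful of prospects, assembling the prompt from your notes is easy to script. A minimal sketch in Python — the field names and the `build_research_prompt` helper are illustrative, not a required format; the template text is the prompt from the article:

```python
# Build the Step 1 research prompt from collected prospect notes.
# Field names and this helper are illustrative; adapt to the data you gather.

RESEARCH_PROMPT = (
    "I'm going to send a cold email to {name}, {title} at {company}. "
    "Based on the following information — {context} — what is one "
    "specific challenge or priority this person is likely focused on "
    "right now that someone selling {product} might address?"
)

def build_research_prompt(name, title, company, context_notes, product):
    """Join raw research notes into one context string and fill the template."""
    context = "; ".join(note.strip() for note in context_notes if note.strip())
    return RESEARCH_PROMPT.format(
        name=name, title=title, company=company,
        context=context, product=product,
    )
```

Paste the result into whichever AI tool you use; the point is that the research inputs, not the wording of the template, are what vary per prospect.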

Step 2: Write the opening yourself.

Take the AI’s research output and write the first one or two sentences yourself, in your own voice. This sentence should reference something specific to this person and connect it — briefly, lightly — to why you’re reaching out. Do not ask AI to write this sentence. The opening is where the signal is. It needs to sound like a human who did homework, not a template that swapped a name.

Here’s the difference in practice:

Generic opener (AI writing without research): “I’m reaching out because I noticed [Company] is growing fast and I thought you might be interested in how we help sales teams hit quota.”

Research-based opener (human writing from AI research): “I saw you recently posted about building out your SDR team after the Series B — that stage of scaling outbound is usually where the handoff between marketing leads and cold outbound gets messy.”

The second sentence earns the next sentence. The first one doesn’t.

Step 3: Let AI draft the body and CTA.

Once the opener is done — the one sentence that proves you did your homework — the rest of the email can follow a template. Your value prop, your differentiator, your ask. These don’t need to be personalized. They need to be clear and brief. AI is good at this. Give it your opener and ask it to complete the email in under 80 words.

Step 4: Test subject lines with AI.

Subject lines are worth testing and AI can help you run variations quickly. Give AI your email and ask for 5 subject line variations: one direct (names what you do), one curiosity-based (references the specific hook), one social proof (names a relevant customer), one ultra-short (3 words), one question. Test across your sequence. Most email tools give you open rate data at the individual subject line level.

The subject line, though, is usually not the bottleneck reps think it is. If your open rates are low, the problem is often list quality — wrong prospects — not copy. If reply rates are low, that’s a copy problem. These are different diagnoses with different fixes.
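That triage can be written down as a simple rule of thumb. A sketch — the threshold values here are illustrative placeholders, not benchmarks from this article; calibrate them against your own historical numbers:

```python
def diagnose(open_rate, reply_rate, open_floor=0.30, reply_floor=0.02):
    """Rough triage for a cold email sequence.

    open_floor and reply_floor are illustrative defaults; tune them
    to your own baseline before trusting the answer.
    """
    if open_rate < open_floor:
        # Low opens: prospects aren't even opening — suspect the list
        # (wrong people) or deliverability before touching the copy.
        return "list quality: wrong prospects or deliverability, not copy"
    if reply_rate < reply_floor:
        # Opens are fine but nobody replies — the email itself is the issue.
        return "copy: opens are fine, the email isn't earning replies"
    return "working: scale the list before touching the copy"
```

The value of writing it out is that it forces you to pick which metric you are actually trying to move before rewriting anything.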

Making this repeatable

If you’re building out a repeatable outbound motion, the research workflow above pairs well with a structured AI sales playbook — the research prompts become part of the playbook, not a one-off step. When you get to objection handling, AI tools for sales objection handling can help you prepare responses to the objections your research workflow surfaces. And if competitive positioning is showing up in cold email responses, AI battlecard tools can give your team live competitive context without having to research it manually for each call.

Try this today

Pick three prospects you’ve been meaning to reach out to. For each one, spend five minutes doing this before you write a single word of the email:

  1. Open their LinkedIn profile. Read their last three posts or comments. Note anything that signals a current priority, a recent change, or a viewpoint they’ve expressed publicly.

  2. Run this prompt: “Based on [what you found], what’s one specific challenge this person is probably dealing with right now that someone selling [your product] might help with? Give me one sentence.”

  3. Write your opening sentence using that output. Don’t copy it directly. Use it as the research. Write the sentence yourself.

Then finish the email. Compare the three you just wrote with the last three you sent using AI to write the whole thing. The difference is visible.

The insight is not that AI makes cold email better. It’s that AI, used correctly, speeds up the research that does. The email is still yours. It just comes from a foundation of actual knowledge about the person you’re addressing.

That’s what Ogilvy figured out in 1958. The idea comes from the research. Always.

FAQ

Does AI actually help with cold email open rates?

AI typically hurts open rates when used to write emails — the output is generic and prospects recognize it immediately. It can improve rates when used for research first, because research produces specific hooks that earn attention. The lever is where in the workflow you apply it. Writing is the last step, not the first.

What’s the best AI prompt for cold email?

The most useful prompt isn’t for the email itself. It’s for research: “Based on [prospect’s LinkedIn headline, recent post, company news], what is one specific challenge this person is likely facing right now that [your product] addresses?” Use that answer as the context for your opener. Then write the opener yourself.

How do I make AI-written cold emails sound less robotic?

Stop asking AI to write the email. Instead, use AI to gather research, then write the first two sentences yourself in your natural voice. AI can draft the rest — value prop, CTA — because those can follow a template. But the opener, the part that has to feel personal, has to come from a human using real information. That’s the sentence that determines whether the rest gets read.

How long should a cold email be?

Under 100 words for the initial email. Prospects receive dozens of cold emails per week. A long email signals that you value your own words more than their time. The structure that works: one specific observation (from your research), one sentence connecting it to what you do, one low-friction ask. Nothing else.

Can AI help me personalize cold emails at scale?

Yes, but not by writing them at scale. The scalable part is research: AI can process LinkedIn profiles, company news, recent funding announcements, and job postings to surface personalization hooks for dozens of prospects in the time it used to take to research one. The writing still benefits from human judgment — but you can write 20 personalized openers in an hour when AI has done the research upfront.
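The batching described here is straightforward to script. A sketch that loops the research prompt over a prospect list — `ask_llm` is a hypothetical stand-in for whichever AI tool or API you actually use, and the dict keys are illustrative:

```python
def research_batch(prospects, product, ask_llm):
    """Run the research prompt for each prospect.

    ask_llm is any callable that takes a prompt string and returns the
    model's reply — a hypothetical stand-in for your AI tool of choice.
    Each prospect is a dict with illustrative keys: name, title,
    company, notes (the research you gathered for that person).
    """
    hooks = {}
    for p in prospects:
        prompt = (
            f"I'm going to send a cold email to {p['name']}, {p['title']} "
            f"at {p['company']}. Based on the following information — "
            f"{p['notes']} — what is one specific challenge or priority "
            f"this person is likely focused on right now that someone "
            f"selling {product} might address?"
        )
        hooks[p["name"]] = ask_llm(prompt)
    return hooks
```

The loop scales the research, not the writing: the output is a hook per prospect, and the opener each hook feeds is still written by hand.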