This is embarrassing, but I'm going to tell you anyway.
Last month, I was doing some research for a client presentation. I asked ChatGPT: "What are the best tools for SEO auditing?"
ChatGPT gave me a nice answer, and then it cited three sources. One of them was a tool I'd never heard of. One was a direct competitor. And the third? Also a competitor.
My tool wasn't mentioned at all.
Here's the kicker: I checked. All three of those sites ranked lower than mine for that exact query. One of them wasn't even on the first page of Google results.
So I did what any reasonable person would do: I spent the next three hours trying to figure out why.
The Investigation
I pulled up each of those competitor sites and started reading. And I noticed something immediately: their content was structured differently.
My site had the typical SEO-optimized structure: keyword-rich headings, meta descriptions, all that stuff. Their sites? They just answered questions. Directly. Clearly. Like they were talking to a person.
One of them had a section that literally started with "Here's how SEO auditing actually works:" and then just... explained it. No fluff. No keyword stuffing. Just a clear explanation.
I realized: ChatGPT wasn't looking at rankings. It was reading the content and deciding which one actually answered the question best.
What I Found
I did a deeper dive, and here's what those sites had that mine didn't:
- Clear structure: Their headings actually described what was in each section
- Direct answers: They answered questions upfront, not buried in paragraphs
- Schema markup: They had proper structured data that helped AI understand the content
- Comprehensive coverage: They didn't just touch on topics – they covered them thoroughly
- Recent updates: Their content was fresh, with dates and "last updated" notices
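For the schema markup point, a minimal sketch of what that structured data can look like, using schema.org's `FAQPage` type with a `dateModified` property. The question text, answer copy, and date here are invented for illustration, not taken from any of those sites:

```html
<!-- Hypothetical example: JSON-LD structured data embedded in a page.
     FAQPage tells crawlers "this page answers a question directly";
     dateModified signals freshness. All values below are made up. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How does an SEO audit work?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "An SEO audit crawls a site, checks its technical health and on-page content, and prioritizes the fixes by impact."
    }
  }],
  "dateModified": "2026-01-11"
}
</script>
```

The block goes in the page's `<head>` or `<body>`; validators like Google's Rich Results Test will tell you whether the markup parses.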
My content? It was optimized for humans to scan and for search engines to index. But it wasn't optimized for AI to understand and cite.
The Fix
So I rewrote my main pages. Not to rank better, but to be clearer. More direct. More useful.
I added schema markup. I restructured my content with clear headings. I answered questions directly instead of dancing around them. I updated my "last updated" dates. I made sure every section actually explained something instead of just mentioning keywords.
It took me a week. And honestly? It was some of the best writing I'd done in years. Because I wasn't trying to game anything – I was just trying to be helpful.
The Result
Two weeks later, I asked ChatGPT the same question. This time, my site was the second citation.
Not first, but second. And honestly? I'll take it. Because I know that when ChatGPT cites my site, it's because my content actually answers the question, not because I tricked an algorithm.
More importantly, when people do click through (and they do), they're finding content that actually helps them. My bounce rate dropped. My time on page went up. People are reading more of my content.
The Lesson
Here's what I learned: ranking high in Google doesn't mean AI will cite you. And getting cited by AI doesn't require ranking high in Google.
What it requires is good content. Clear content. Content that actually answers questions.
It's that simple. And that hard.
Because writing good content is harder than optimizing for keywords. It requires actually understanding your topic. It requires being helpful instead of just being visible.
But here's the thing: it's also more sustainable. Because when you write for clarity and usefulness, you're writing for both humans and AI. And that's the future of search.
Try It Yourself
Go ahead. Ask ChatGPT or Perplexity a question about your industry. See who it cites. Then go read those sites. I bet you'll notice the same things I did.
And if your site isn't getting cited? Don't panic. Just make your content better. Clearer. More useful.
That's it. That's the whole strategy.
Now if you'll excuse me, I have some content to rewrite.
Marcus Rodriguez
January 11, 2026