Barry Adams on SEO, AI and the future of news publishing

This week, we have an interview with one of the most respected voices in technical SEO and digital strategy, Barry Adams.

As the founder of Polemic Digital, he works with leading publishers and global brands to improve search visibility, audience growth and long-term performance. 

With a background that spans both development and marketing, Barry is known for cutting through SEO myths and focusing on what genuinely works. 

A regular conference speaker and trusted industry commentator, he brings practical insight, sharp analysis and a refreshingly honest perspective on how search really works.

In this issue, he brings all that and more to a conversation about AI. 

We’re always looking for ways to improve, so if there’s something you want to see here, you can reply to this email or get in touch on LinkedIn. 

Thanks for reading, 

Beth Ashton

Chief Growth Officer, Bright Sites

Let’s start with the big question… is SEO dead?
Barry Adams: “No. Next question.

To give you a slightly longer answer: no. Search is still by far the largest driver of traffic to the web. The total number of clicks Google sends isn’t actually decreasing, but it is flattening.

We’ve become used to Google usage constantly growing and sending more and more clicks to websites. Even if you were doing a mediocre job, you’d often still see growth simply because more people were searching more often. That growth has now flattened.

Part of that is because Google has become very good at keeping people on the search results page with AI overviews, featured snippets, direct answers, and other elements that don’t require a click-through anymore.

User behaviour is also fragmenting. People don’t just rely on Google; they go to ChatGPT, TikTok, and other platforms to get information. We’ve reached and passed peak search.

That means you now have to work harder to get your slice of the pie. Search traffic may be stable overall, but for individual websites it’s much more competitive. SEO isn’t dead, it’s just harder.”

How does traditional SEO relate to optimisation for ChatGPT and other LLMs?
BA: “They’re about 99.9% identical. All the classic SEO principles still apply.

Large language models crawl the web for training data and when you use them, they often perform Google searches in the background. So if you rank well in Google, you’re much more likely to be surfaced in ChatGPT or other LLMs.

LLMs may currently be more susceptible to spam and volume-based signals – if you’re mentioned frequently across many websites, you’re more likely to be cited. But a lot of what AI optimisation agencies are promoting right now is just spam: churning out huge volumes of low-quality content on weak websites.

What’s ironic is that LLMs will have to reinvent the same quality and authority signals that Google has been refining for 25 years. Google has a massive advantage here because it already has those systems in place.

Anyone claiming LLM optimisation is a completely different discipline is simply wrong.”

What’s your view on Google’s direction with AI Mode and AI summaries?
BA: “I think that Google’s primary motivation right now is not to lose the search market to ChatGPT. They’ll do whatever they need to preserve their dominance.

That’s why AI Mode is becoming a default interface, so people think, ‘I don’t need to go anywhere else, I can just use Google.’ AI Mode is less monetisable and Google will lose money on it in the short term, but they have deep pockets and a lot of patience.

They’re willing to absorb losses for years if it means outcompeting OpenAI. Sending traffic to publishers is a secondary concern. Their main focus is protecting their core business.”

Do you think Google actually cares about publishers?
BA: “There’s a massive scale issue here. There are roughly 1.2 billion websites. Google doesn’t need that many.

They don’t need a billion websites, they need maybe 50 to 100 million. A handful of authoritative sites per topic is enough to create the illusion of choice.

If Google’s AI features cause large numbers of websites to disappear, that’s not really a problem for Google. In fact, it reduces crawling and indexing costs. From Google’s perspective, that’s a net benefit.”


Is Google Discover essentially breadcrumbs thrown to publishers?
BA: “Probably a bit of both. Discover is genuinely useful for users as it’s personalised and many people rely on it for daily news.

Google doesn’t make much money from Discover but users like it. At the same time, publishers have become heavily dependent on it for traffic. 

Google doesn’t need Discover; they could kill it tomorrow and wouldn’t lose anything. They keep it running and relevant partly because it helps publishers survive. But it’s a wild west out there, and it encourages really bad habits, clickbait headlines especially, which makes it a dangerous platform for publishers to depend on. 

But Discover is incredibly volatile. When 70–80% of your Google traffic comes from Discover, you’re building on quicksand. One algorithm update can wipe that traffic out overnight.

Discover should be treated as bonus traffic, not the foundation of your strategy. I’ve seen publishers lose almost all their Discover traffic while their search traffic remained stable.”

So should publishers just block the Google bots?

BA: “Because search still drives so much traffic, publishers in general are not inclined to block Googlebots, period. Google has given us tools to block specific aspects of AI Mode and AI Overviews. 

Of course, Googlebots will need access to your website if you want to get Discover traffic, and for news publishers especially, Top Stories traffic is just as important. 

I don’t think a publisher can seriously consider blocking Googlebots from crawling their website.”
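By way of illustration, the selective blocking Barry refers to lives in robots.txt. A minimal sketch, using Google’s documented crawler tokens (the specific rules are our example, not from the interview):

```text
# Opt out of Google's AI training token while keeping normal search crawling.
# Google-Extended controls whether content is used for Gemini model training;
# it does not remove pages from Search, Top Stories or Discover.
User-agent: Google-Extended
Disallow: /

# Googlebot itself stays fully allowed, preserving Search, Top Stories
# and Discover eligibility.
User-agent: Googlebot
Allow: /
```

Note that Google-Extended does not control AI Overviews, which are part of Search itself; those are managed with snippet controls like nosnippet instead.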

Do you know of any publishers using no-snippet (which prevents descriptive text, or snippets, from appearing in search results for a page) in any meaningful way?

BA: “It’s starting to come up. I see some paywalled publishers putting nosnippet HTML attributes where their paywall starts, which is an interesting use case: bolstering your paywall without directly affecting your search traffic. 

I’m not sure many publishers understand how to use no-snippet yet, because there are multiple ways to implement it. But as publishers get more savvy about how they should, and shouldn’t, optimise for LLMs, Google’s especially, I think they’ll adopt it more widely.”
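For readers unfamiliar with the mechanism: snippet controls can be applied page-wide via the robots meta tag, or to a specific section via the data-nosnippet attribute, which matches the paywall pattern Barry describes. A minimal sketch (the markup is our example, not from the interview):

```html
<!-- Page-level: block all snippets for this page. -->
<meta name="robots" content="nosnippet">

<!-- Section-level: data-nosnippet on the element where the paywall starts,
     so the free teaser can still appear in snippets but the gated body
     cannot be quoted in search results. -->
<p>Free intro paragraph that may appear as a snippet.</p>
<div data-nosnippet>
  Premium article body hidden behind the paywall.
</div>
```

Using data-nosnippet only on the gated section is what lets the page keep a normal snippet, and normal search visibility, while the paywalled text stays out of results.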

Should publishers use AI for optimisation?

BA: “Yes, and this is where AI really shines. AI is great at handling tedious, repetitive tasks: generating alt text, optimising captions, suggesting internal links, creating social headlines, and surfacing archive content.

That saves journalists time and mental energy.

But journalism itself should never be automated. AI doesn’t understand anything, it doesn’t have knowledge or meaning. It’s just a very advanced word-prediction system. Use AI around journalism, not instead of it.”

“If it’s good for users, it’s usually good for SEO.

Summaries, bullet points, better structure – these improve engagement. And we know Google uses engagement signals as part of its ranking systems. If AI helps readers consume content more easily, that can absolutely help visibility in both Search and Discover.”


From an SEO perspective, what should newsrooms focus on now?
BA: “For many established publishers, SEO is no longer a growth channel, it’s a retention channel. You have to work harder just to maintain what you already have.

Search is also increasingly a multimedia landscape. If you’re only producing text, you’re missing opportunities. Turn articles into podcasts, videos, shorts, social clips – audiences want to consume content in multiple formats.

That also means journalists may need to step into more public-facing roles and put themselves in the spotlight a little, which isn’t comfortable for everyone. But having recognisable personalities matters if you want to drive the podcast, the video, or even the newsletter. 

Lastly, you have to build a loyal audience. The only way to become immune to whatever Silicon Valley dreams up next is to have readers who actively seek you out.”

Are you optimistic about the future of journalism in an AI-driven world?
BA: “I honestly think that in five years publishing will be healthier, but there will be fewer news websites. There will be casualties, and I’m okay with that: there have been too many low-quality, fly-by-night sites. We won’t go back to 2015 or 2016; the ecosystem will be different, and AI will play a big role in how people interact with the web. 

People will still want to read the news. After this AI hype cycle settles, we’ll have fewer but stronger news publishers who are better equipped to survive the next disruption. I’m optimistic. I think the industry as a whole will be fine. Some individual websites, though, are going to have a very hard few years.”

What is Storycue?

Most newsrooms don’t struggle to come up with ideas – they struggle to decide what matters right now. You’ve got trends in one place, performance data in another and a lot of judgement calls in between. Storycue brings those together and helps teams decide what to publish, what to update and when to act – using their own data.

Sign up to get interviews like this directly to your inbox as soon as they’re published: