The Biggest Mistakes News Publishers Make in SEO – Interview with Barry Adams

About Barry Adams

Barry Adams is one of the most trusted voices in SEO for publishers, with over two decades of hands-on experience helping global news organisations grow their visibility in search. As the founder of Polemic Digital, Barry specialises in all aspects of SEO for news publishers, focusing on Google News, Discover, and Top Stories optimisation. He’s worked with major brands like The Guardian, The Sun, and FOX, and is known for cutting through SEO myths with sharp technical insights, practical editorial strategies, and a passion for making news content more accessible and visible across platforms. With Google constantly shifting how it treats news results, Barry remains at the forefront of helping publishers adapt, survive, and win.

Interview — The Biggest Mistakes News Publishers Make in SEO

Despite the enormous volume of content news publishers produce, many still struggle to capture the full potential of organic search. In this interview, Barry Adams breaks down the most common SEO mistakes made by newsrooms and how to fix them.

Q: Barry, you’ve worked with top-tier publishers around the world. In your experience, what are the most common SEO mistakes news publishers still make in 2026?

A: Too many publishers chase after clicks. It’s a fundamental issue, an inevitable result of the click-driven ecosystem that Google has helped create, and one that is extremely dangerous for publishers in the long term.

When it comes to optimising for search, many publishers believe the main goal is to chase after clicks and algorithms. They see SEO’s remit as the creation of content aimed purely at achieving top rankings in search and/or visibility in Discover, regardless of any actual journalistic value the content may have. Often, such an approach actually delivers traffic in the short to medium term. The click-chasing, algorithm-focused content will rank well and drive huge amounts of traffic.

But there is a serious risk associated with this approach: Google tends to frown on websites that engage in click-chasing, and such sites often end up on the wrong side of Google’s regular core algorithm updates.

Publishers who focus on the short-term gains they can make by crafting content specifically for SEO (rather than having a journalism-first focus) are leaving themselves extremely vulnerable to long-term pain when a Google update inevitably devastates their rankings and visibility.

Q: Many publishers assume that publishing volume alone drives traffic. Why is that a dangerous mindset?

A: The focus on volume follows from the aforementioned obsession with clicks and traffic. Publishing more articles gives a website more opportunities to rank in Google and/or be shown in users’ Discover feeds. As a result, many publishers choose volume, often to the detriment of quality, to ensure they hit their traffic targets.

In my experience, volume can help a publisher gain traffic, but only when it’s paired with a continuous focus on quality. Publishing too many low-effort, low-quality articles leads to a problematic buildup of negative ranking signals, which will eventually manifest as a downgrade of the website’s visibility in a future Google algorithm update.

It’s difficult to scale content production without losing quality, but it is crucial for publishers to maintain high journalistic standards, regardless of how many articles they publish daily.

Q: What technical SEO issues do you see most often on news sites and which ones have the biggest impact on visibility?

A: There are a few common issues that publishers face, which I come across on almost every site I audit. A very common one is incorrect image sizes and aspect ratios for the featured images defined in an article’s structured data, which are important for maximum visibility in Top Stories carousels and the Google Discover feed.
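
To illustrate with a minimal sketch (an editor’s illustration with placeholder URLs, not markup from any specific audit): Google’s documentation currently asks for high-resolution featured images, ideally at least 1200 pixels wide, supplied in 16x9, 4x3, and 1x1 aspect ratios, with large image previews enabled via the robots meta tag – always check the current guidance before relying on exact figures. In an article’s markup, that combination might look like this:

  <meta name="robots" content="max-image-preview:large">

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "image": [
      "https://example.com/img/story-16x9.jpg",
      "https://example.com/img/story-4x3.jpg",
      "https://example.com/img/story-1x1.jpg"
    ]
  }
  </script>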

Pagination is a challenge that few publishers have tackled properly: finding the right balance that allows deep crawling of older content without creating massive crawl overhead for Googlebot.

Core Web Vitals are a constant struggle, especially for advertising-funded publishers, and there are usually no easy solutions for that.

Tag pages and topic hubs, and how to best implement them, are a recurring challenge that not many publishers are able to adequately solve and maintain.

Large-scale publishers also need to take extra care to facilitate rapid crawling and indexing of their news articles. Fast-responding hosting infrastructure, serving the most appropriate HTTP status codes, minimising crawl waste, and including semantic HTML in article templates are all areas where few publishers are fully optimised.

Q: How important is proper article structure (e.g., headlines, timestamps, author bylines) for Google News and Top Stories?

A: It’s absolutely critical. Google extracts the H1 headline from an article and will almost always use that as the visible headline when an article is shown in Top Stories and Google News. This contradicts classic SEO approaches, where the <title> tag is seen as most important.

Accurate timestamps are also critical, so that Google will show the correct timestamp with the article. Having multiple timestamps on an article page sometimes leads to Google showing the incorrect one.

And named bylines that link to dedicated author pages are a key component of EEAT signals. Publishers should be as transparent and open as possible, allowing Google – and readers – to assess the accuracy of their reporting.
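
To make this concrete, here is a minimal sketch of an article template (an editor’s illustration; the headline, dates, names, and URLs are placeholders), with the structured data simply mirroring the visible headline, timestamps, and byline:

  <article>
    <h1>Headline Google will typically reuse in Top Stories</h1>
    <p class="byline">
      By <a href="https://example.com/authors/jane-doe" rel="author">Jane Doe</a>
    </p>
    <time datetime="2026-01-15T08:00:00+00:00">15 January 2026, 08:00 GMT</time>
    <!-- article body -->
  </article>

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Headline Google will typically reuse in Top Stories",
    "datePublished": "2026-01-15T08:00:00+00:00",
    "dateModified": "2026-01-15T09:30:00+00:00",
    "author": {
      "@type": "Person",
      "name": "Jane Doe",
      "url": "https://example.com/authors/jane-doe"
    }
  }
  </script>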

Engagement signals also play an outsized role in Google’s news ecosystem compared to ‘classic’ search results. Publisher sites need fast-loading, engaging articles, so that users spend longer on the site and consume multiple articles per visit. These signals all filter into Google’s core ranking algorithms via various metrics.

Q: Some newsrooms still don’t fully optimise their site for fast crawling and indexing. What are the non-negotiables every publisher should implement?

A: For me, the key components for fast crawling and indexing are the following:

  • Fast server response time: A publishing site should aim to respond to any Googlebot crawl request within 500 milliseconds or less. If a site is slower than that, Googlebot won’t be able to crawl the site as efficiently as it needs to.
  • Minimal crawl waste: Most URLs on a site need to serve a 200 OK HTTP status code and need to be indexable, self-canonicalised pages. When a site suffers from large-scale crawl waste, Google ends up spending too much crawl effort on unindexable URLs and not enough on actual indexable and rankable content.
  • Article structured data: Proper implementation of article structured data helps Google to quickly index the article and show it in the fast-moving news ecosystem. Google supports Article, NewsArticle, and BlogPosting structured data for standard news articles.
  • Semantic HTML: This is optional, but in my view it helps when the HTML code for an article page includes appropriate semantic tags like <header>, <footer>, <nav>, and <article>. I have a strong suspicion it aids Google with indexing, by allowing easy separation of boilerplate code from article code (a minimal skeleton is sketched below).

Beyond these issues, every publisher will have their own unique challenges that impact crawling and indexing.
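
As a rough illustration of the crawl waste and semantic HTML points above (an editor’s sketch with placeholder URLs, not a template Barry prescribes), a page skeleton that self-canonicalises and uses semantic tags to separate boilerplate from article content could look like this:

  <!doctype html>
  <html lang="en">
  <head>
    <title>Example headline | Example Publisher</title>
    <link rel="canonical" href="https://example.com/news/example-headline">
  </head>
  <body>
    <header>
      <nav><!-- site-wide navigation: boilerplate --></nav>
    </header>
    <article>
      <h1>Example headline</h1>
      <!-- the news content that should be crawled, indexed, and ranked -->
    </article>
    <footer><!-- site-wide footer: boilerplate --></footer>
  </body>
  </html>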

Q: In an age of AI-generated content and misinformation, what signals help Google determine trust and authority for news publishers?

A: It’s difficult for machine systems to accurately detect AI-generated content, so Google relies on the well-established EEAT signals to validate a website’s credentials and trustworthiness.

This is why having real journalists with named bylines is so important. The site also needs a good About section, with detailed information about the publisher’s history, background, editorial policies, physical location, and all other attributes that trusted publishers possess. Never assume Google will be able to determine these types of signals from external sources – always explicitly outline them on your own website.
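
As a supplementary illustration (the organisation name, URLs, and choice of properties here are the editor’s, not Barry’s), these publisher details can also be echoed in schema.org structured data on the About page – supplementing, never replacing, the human-readable information, which as Barry notes below is what matters most:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "NewsMediaOrganization",
    "name": "Example Publisher",
    "url": "https://example.com/",
    "logo": "https://example.com/logo.png",
    "foundingDate": "1998",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "1 Example Street",
      "addressLocality": "London",
      "addressCountry": "GB"
    },
    "publishingPrinciples": "https://example.com/editorial-policy",
    "correctionsPolicy": "https://example.com/corrections",
    "sameAs": ["https://en.wikipedia.org/wiki/Example_Publisher"]
  }
  </script>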

It’s also worth noting that Google struggles to identify EEAT correctly through its algorithms. This is why Google relies on human quality raters to help identify EEAT signals. So when you implement EEAT elements on your website, always make sure you present them for a human audience first and foremost.

Q: Google Discover continues to be unpredictable. What can publishers do to improve their chances of being featured consistently?

A: That’s the million-dollar question, isn’t it? Consistent success in Discover depends on being able to identify the key trends in Discover topics, align these with your editorial output and topic authority, and create engaging content that fulfils Google’s algorithmic requirements and meets user expectations. This requires a lot of data gathering and analysis from multiple sources, a lot of trial and error, and a willingness to experiment and think outside the box.

Optimising for Discover is also inherently risky. The huge volumes of traffic Discover can provide, combined with the preference for what we’ll call ‘clickbaity’ content, means Discover often leads publishers down the path of ‘churnalism’: low-value articles that are just retellings of the same story with slightly different spins and headlines, all geared towards eliciting clicks from the Discover feed. When publishers start focusing on Discover to the detriment of actual journalism, you see a rapid descent into clickbait and borderline spam. And it works – for a while. Publishers will see strong payoffs, in terms of Discover traffic, from following that well-trodden clickbait road.

However, inevitably it will catch up to them. Bad content accumulates negative quality signals, and those will eventually be expressed in one of Google’s regular core algorithm updates. The damage that such an update can cause to a website’s Discover (and search) traffic can be catastrophic, and is always very hard to recover from.

Despite Discover’s huge traffic potential, I always urge publishers to not focus their editorial strategy on this channel. Discover should be seen as a supplementary channel. A publisher’s key focus should be on delivering quality journalism first and foremost. That is the only surefire way to safeguard a site’s presence in Google’s ecosystem.

 

Q: From your point of view, how should publishers balance speed (breaking news) and depth (evergreen content) in their SEO strategy?

A: I’m going to answer this with the interpretation of ‘evergreen’ as ‘news-related background stories, analysis, and features’. There’s a whole other can of worms to be opened around evergreen for commercially interesting queries (recommendation lists, for example), but that probably deserves a separate interview entirely.

A few years ago I would have said that breaking news and evergreen are equally important and publishers should strive to produce both. However, the rise of AI Overviews in Google’s results has definitely impacted the value of evergreen content, especially for publishers that rely on a steady stream of new visitors to their website for monetisation.

Nowadays, breaking news is one of the very few areas where news publishers are immune from the effects of AI’s takeover of search. AI cannot report the news – it can only summarise the news after it has been reported. So publishers should focus on creating the news and fulfilling the audience’s immediate need for the latest stories and reporting.

Evergreen content can still have value for your engaged readers who want to read background articles and analysis, but on its own such non-news content will drive much less traffic to the site from search engines due to the prevalence of AI Overviews on informational queries. Now, the main purpose of evergreen content is to provide more depth of information for readers who are already on your site.

Q: If you could fix just one SEO problem for most newsrooms, what would it be and why?

A: In my experience, there are two types of incorrect SEO implementation in newsrooms: either the newsroom doesn’t take SEO seriously enough (often coasting on the site’s reputation to drive rankings), or the newsroom is too focused on SEO and creates click-first content. Neither approach is healthy.

If I could solve one problem, it would be the accurate positioning of SEO in newsrooms. Search should not dictate editorial agendas, but should be applied as a meaningful layer on top of the journalistic output. This ensures every article gets the best chance to acquire clicks from Google’s various news-focused surfaces, while still focusing on delivering quality journalism to your readers.

Quality journalism also promotes the right signals for a site’s longevity, both in search and every other channel. It builds authority, trust, respect, and loyalty. When a site starts compromising on those principles, the first steps are taken on the road to ruin.

Conclusion

Barry has watched publishers rise and fall for two decades. The pattern never changes: those who chase clicks eventually get crushed by the same algorithm they tried to game.

The ones still thriving? They treat SEO as a layer, not a strategy. The journalism comes first. The optimisation follows.

And here’s the uncomfortable truth for 2026: AI can summarise, rewrite, and regurgitate. But it cannot knock on doors, verify sources, or break a story. That’s still human territory.

 

Marta Szmidt

Marta Szmidt is an SEO Strategist with a focus on the iGaming industry. With a background rooted in strategy development, she continuously adapts to the evolving digital marketing landscape. Her analytical approach relies on data and industry trends to make informed decisions. Explore her insights and analyses to decode the complexities of today's SEO challenges and opportunities.
