Google has made an update that is going to reverberate around both the SEO and AI industries, and it didn’t even bother to announce it.
To me, this seems like a direct move to rein in what is happening with LLM (large language model) products such as ChatGPT, which have been eating into Google’s much-coveted share of search.
What has happened in this latest Google update?
Traditionally, Google has allowed an extra parameter on its search URL, “&num=100”, to show 100 results at once. There are many other parameters you can use, for example to set a specific date range, open results in a new window or set the location of a search. However, none of them carries the same importance to the wider web as num=100.
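For reference, the parameters in question look something like this (illustrative values on my part; Google has never formally documented most of them):

```
&num=100       # show 100 results on one page (the parameter in question)
&start=20      # offset into the results, i.e. page 3 at 10 per page
&tbs=cdr:1,cd_min:1/1/2024,cd_max:12/31/2024   # custom date range
&newwindow=1   # open result links in a new window
&gl=us         # country the search should be localised to
```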
That is because it is how bots have traditionally crawled the search index with ease, getting the top 100 results without having to go page by page. By bots, I mean all the SEO tools such as Ahrefs, SEMrush and Moz, but also the LLMs such as ChatGPT and Claude.
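As a minimal sketch of what that looked like in practice (my own illustration, not any tool’s actual code), a single request could cover positions 1–100:

```python
# Minimal sketch of a single-request top-100 fetch. Real rank trackers
# add proxies, consent handling and HTML parsing on top of this.
import urllib.parse

def serp_url(query: str, **params) -> str:
    """Build a Google search URL with extra query-string parameters."""
    qs = urllib.parse.urlencode({"q": query, **params})
    return "https://www.google.com/search?" + qs

# One request used to return positions 1-100 on a single page:
print(serp_url("best running shoes", num=100))
# -> https://www.google.com/search?q=best+running+shoes&num=100
```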
The knock-on effects are a dramatic drop in impressions across almost all websites and an apparent improvement in average position as reported in Google Search Console.
An impression is counted whenever a URL is rendered on a Google results page, even if it is never scrolled to; if you are on page 1, you get an impression for every search. Now that the top 100 results are no longer served to all of the bots in one go, significantly fewer impressions are recorded for every URL, and the impressions that remain are far more likely to come from the top ten positions, because the only way to be seen below position 10 (occasionally Google shows more than 10 results on page 1 anyway) is for someone to click through to page 2.
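A toy example makes the effect on the numbers clearer. Suppose a site ranks at a mix of shallow and deep positions across six queries (hypothetical figures, and a big simplification of how Search Console aggregates):

```python
# Hypothetical rankings for one site across six queries.
positions = [3, 8, 15, 42, 67, 88]

# Before: a num=100 fetch rendered all 100 results, so every ranking
# logged an impression, deep positions included.
print(len(positions), sum(positions) / len(positions))  # 6 impressions, avg ~37.2

# After: only page 1 renders unless someone paginates, so only the
# top-10 rankings log impressions.
page_one = [p for p in positions if p <= 10]
print(len(page_one), sum(page_one) / len(page_one))     # 2 impressions, avg 5.5
```

Fewer impressions, a better-looking average position, and no change whatsoever in where the site actually ranks.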
What do these changes to crawling and data collection mean?
The first thing I think it means is that we (SEOs) have actually been reading the metrics from Search Console wrong for years. Rank tracking tools have existed for as long as SEO has, and their crawling has always artificially inflated impression data and made average position look worse than it really was. But at least we as an industry had a standard and an expectation we could work to.
Layered on top of this is the explosion of different LLM bots, as well as the introduction of AI Overviews and the impressions they have been adding too.
In short, it’s been a rollercoaster few months when it comes to looking at Google Search Console data.
But does it actually matter? If it is only bots generating these impressions, losing them doesn’t affect your bottom line or the conversions you get. What we’ve seen is that engagement on site actually increases, because if AI Overviews are answering some of the top-of-funnel queries, those who do reach the website arrive already closer to the purchase point.
We know that sessions have dropped as well, because those top-of-funnel queries no longer bring people to the site when they are answered directly by AI Overviews, but this hasn’t impacted total conversions.
So, looking at it from a zoomed-out point of view, you can maybe shrug your shoulders and focus on the bottom line. However, after ten years working in SEO I know we struggle as an industry to zoom out that far; the tendency is to zoom in, because that lets us see the small details that actually make a difference. It is starting to feel like we are losing some of that capability, namely the accuracy of reference points such as expected CTR, or spotting when a page is climbing the rankings but not yet in the top 10.
Because it’s not just Google Search Console that’s being disrupted; the rank trackers themselves are struggling to keep up. Ahrefs, one of the biggest rank tracking and general SEO platforms, is currently showing only the top 10 results, and judging by a recent tweet from its CMO, it may only extend that to the top 20. Why? Because of the cost of paging through every single possible search result to service the millions of keywords it currently tracks globally for customers. Moz has said it will accommodate the change, but the short version is that it will roughly 10x the resource required to handle it.
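That “10x” follows directly from the pagination maths. A back-of-envelope sketch (my own hypothetical volumes, not Moz’s or Ahrefs’ actual figures):

```python
# Back-of-envelope crawl cost, using hypothetical volumes.
keywords_tracked = 5_000_000             # hypothetical keyword portfolio

# Before: one num=100 request covered positions 1-100 per keyword.
requests_before = keywords_tracked * 1

# After: ten pages of ten results each (start=0, 10, ..., 90).
requests_after = keywords_tracked * 10

print(requests_after / requests_before)  # 10.0 -> tenfold request volume
```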
Maybe we don’t need the 100 results, and it’s only because they are being taken away that we think we’re missing something important; if we had never been given them in the first place, we might not care. I do think they serve a purpose, though. From an SEO practitioner’s point of view, the top 100 tells you that if a page isn’t appearing in them at all, you are either completely missing the mark content-wise or you have a technical issue. You can then take action, creating the relevant content or fixing the technical problems, and watch yourself climb: maybe not into the top 10 straight away, but at least upwards, until you reach the top 20 or so positions where you really need to keep a closer eye on what your competitors are doing differently, whether that’s Digital PR presence or search features and content types your brand hasn’t focused on.
At this point I’ve not even touched on the impact on the LLMs. If you are ChatGPT and you were regularly scraping Google’s results to pull the top 100 or so pages, you have now lost that capability and the access to data you once had, unless you can find a better way to crawl the web as a whole yourself. Because Google has been in the game for so long, with such enormous capital behind it, the LLMs are still a long way behind when it comes to crawling and indexing the whole web.
Where they differ from the rank trackers is that they store the information they find and can process it directly from their own databases, potentially meaning that in the long term they can bypass Google completely. But it’s another big speed bump the LLMs need to clear to really penetrate the market and take some of Google’s share.
And you can bet that Gemini and AI Mode are not affected by the same constraints as the rest of the web, which is another move that helps Google solidify its place as the dominant force online, at the same time as Gemini becomes the top downloaded app on the Apple App Store and Google Play.
What’s next?
I think we will see both short-term and long-term impacts from this. The short term will be a couple of months of disrupted reporting and of explaining what the disruption in the stats actually means, while we try to recover some of the helpful insight we previously had.
Medium to long term, I believe the rank trackers and LLMs will find another way round; that’s what they’re here to do. Whether that cost is passed on to customers or absorbed by the tools themselves is yet to be seen, but it will most likely be a combination of both. You could even call it a ‘Google tariff’.
And then, finally, in the long term we will have new standards for data, new expectations and new ways to measure. Some of the LLMs may struggle to keep up. I don’t believe that will be ChatGPT or the other market leaders, but if you had launched a new one in the last month or so, the barrier to entry just got a lot higher.
I think the final kicker of it all is that Google shipped this update without saying anything, without putting it on its Search status dashboard, and has watched it reverberate around the industry while offering only a brief comment that the parameter was never something it formally supported. Because Google knows that what it’s doing will negatively affect not just the LLMs whose rise it is presumably trying to contain, but everyone who has spent years trying to make search results better.