
Google just removed &num=100. This is what it means for Organic teams.

What is &num=100, and what changed in September 2025
This month Google disabled the &num=100 URL parameter, which allowed users (and scrapers) to see 100 search results on one page instead of the standard 10. This seemingly minor tweak has caused chaos in the SEO community over the past week and a half: the parameter was a core mechanism for rank tracking platforms, and the impact goes beyond those tools, directly affecting the reports we produce regularly for the websites we look after.
Essentially, it is no longer possible to retrieve 100 results in one go; to see those 100 results, you now need to submit 10 separate page requests.
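To make that concrete, here’s a rough sketch of the difference (illustrative only, not something you should use to scrape Google, and the exact URLs your tools build will differ). It uses the standard q, num and start query parameters: one deep request before the change versus ten paginated requests for the same depth now.

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def single_request_url(query: str) -> str:
    # Before the change: one request could ask for ~100 results at once.
    # (&num=100 was never formally supported and is now ignored.)
    return f"{BASE}?{urlencode({'q': query, 'num': 100})}"

def paginated_urls(query: str, depth: int = 100, per_page: int = 10) -> list[str]:
    # After the change: the same depth needs one request per page of 10,
    # stepping through the start offset (start=0, 10, 20, ... 90).
    return [
        f"{BASE}?{urlencode({'q': query, 'start': offset})}"
        for offset in range(0, depth, per_page)
    ]

print(single_request_url("running shoes"))   # 1 request, no longer honoured
print(len(paginated_urls("running shoes")))  # 10 requests for the same depth
```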
This blog post breaks down what has changed, the expected impact on the industry as a whole, and how brands and marketers should respond.
Why has Google made this change?
The most recent update from Google was:
“The use of this URL parameter is not something that we formally support.” – Google Spokesperson
Which isn’t exactly helpful for everyone dealing with the impact of this. There are various theories as to why Google has made this change.
Stopping mass scraping
This seems like the most obvious answer: Google wants to limit automated scraping of the SERPs, and &num=100 was a handy way for rank trackers to scrape rankings efficiently. Removing the parameter could stop these platforms scraping, or at least reduce the volume of scraping.
Fighting back against AI data harvesting
In a similar vein, some have suggested that the platforms Google really wants to stop scraping its results aren’t rank trackers at all, but AI platforms such as ChatGPT and Perplexity, which have used scraped results both to train their models and to provide answers. Rank trackers are an unfortunate casualty of this decision. If this is the case, it’s a clear statement from Google that it feels threatened by the rise of LLMs and wants to protect its market share.
Protecting their infrastructure
Could Google simply be pushing back against the load these scrapers place on its infrastructure? Deterring them would help Google improve the stability and speed of search and ensure real users have a better experience.
Pointing users towards Google’s own tools
This feels like a very late decision, considering third-party rank trackers have existed for years, but could Google be trying to push organic marketers towards Google Search Console’s own position data? Or is it planning to release its own paid-for rank tracker? (Honestly, this wouldn’t surprise me.)
We’re unlikely to get a concrete answer from Google, but I’d imagine it’s a mix of all of the above. What we can take from this is that Google will keep evolving and will act strategically to protect its ecosystem.
The immediate impact on third-party tools
The biggest impact, and the one most people are talking about, is that on rank trackers. These tools are arguably the biggest users of the &num=100 parameter, so removing it has fundamentally altered the mechanics of how they gather data.
The immediate fallout has been widespread, with various tools, including Semrush and Ahrefs, releasing statements and error messages about their data. Both platforms have focused on the fact that they still have accurate data for the first page of Google, and highlight that this is where most of the value sits, but ultimately they have lost a huge percentage of their database.
If you use any rank tracker tool, this is what you need to be mindful of going forward:
Potential increase in costs
Without the shortcut to 100 results per request, rank trackers face a 10x increase in requests for the same data. If platforms decide they still need to fetch all 100 results, they will incur much higher costs, which they’d likely pass on to users through higher subscription fees.
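As a rough back-of-the-envelope illustration of why that matters (the keyword volume and per-request cost below are made-up numbers, not any vendor’s real figures), the request bill scales linearly with the number of pages fetched:

```python
# Hypothetical figures purely for illustration; real vendor volumes and costs are unknown.
keywords_tracked = 5_000_000              # keywords a rank tracker might check daily
requests_before = keywords_tracked * 1    # one &num=100 request per keyword
requests_after = keywords_tracked * 10    # ten paginated requests per keyword

cost_per_request = 0.0005                 # assumed cost per SERP fetch (made up)
print(f"Daily requests: {requests_before:,} -> {requests_after:,}")
print(f"Daily cost:     £{requests_before * cost_per_request:,.0f} -> £{requests_after * cost_per_request:,.0f}")
```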
A reduction in data available for reporting
If the platforms decide not to run 10x the crawls, they may instead focus on just the top 20 or even top 10 results. In that case costs should remain unchanged, but whether this works for you depends on how much data you need to report on for your brand or your client.
A combination of both approaches
Some tools have already outlined a hybrid approach. Accuranker will track the top 30 results daily, with the full top 100 now only tracked twice a month, while SEOMonitor will report on the top 20 results daily and the rest of the top 100 weekly. This allows them to still provide (somewhat) up-to-date data while limiting the cost increases.
Ultimately, if you use rank trackers, expect turbulence over the next few months as these platforms figure everything out. Some tools may have temporary delays while their teams re-engineer how they collect data, and I’d also expect price rises if you want to see beyond the top 20 results on a regular basis.
Impact on Organic reporting
Beyond third-party tools, the parameter change has had a big impact on Google’s own data. Specifically, Google Search Console performance reports.
You will likely have seen a sudden drop in impressions and a spike in average position since mid-September. This isn’t a reflection of actual performance; it’s a direct result of the removal of the &num=100 parameter.
Major drops in impressions
Tyler Gargula reported on LinkedIn that 87.7% of the websites studied saw a significant drop in impressions after this change was actioned by Google. This tells us that the people and tools using the &num=100 parameter were inflating a lot of our impression figures.
Google Search Console counts an impression whenever a result appears on a loaded results page, regardless of whether the user scrolls down to it. So when scrapers loaded all 100 results in one sweep, an impression was counted for every one of those results. You’ll also notice that where impressions have dropped, it’s largely on desktop, which is where most of this scraping occurred.
Increase in average rank
In almost the exact opposite of what we’re seeing for impressions, average position has jumped (improved) across many accounts. Again, this is because the full 100 results are no longer being scraped, so impressions aren’t being generated for deep positions; GSC is essentially “ignoring” the results on pages that are no longer being loaded, which “improves” the average rank.
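A simplified worked example shows the mechanic. The positions below are invented, and it treats average position as a straight mean of the position recorded for each impression of a single query, which is roughly how the GSC figure behaves:

```python
# Hypothetical impression log for one query: each entry is the position recorded
# when the result generated an impression. The deep positions (85, 92, 88) mostly
# came from scrapers loading all 100 results via &num=100.
impressions_before = [3, 3, 4, 2, 85, 92, 88]   # real users + scraper page loads
impressions_after = [3, 3, 4, 2]                # scraper-driven deep impressions gone

avg = lambda xs: sum(xs) / len(xs)
print(f"Average position before: {avg(impressions_before):.1f}")  # ~39.6
print(f"Average position after:  {avg(impressions_after):.1f}")   # 3.0
```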
This does bring up the age-old question of how useful average position in GSC really is. I’ve always been of the mindset that at a website level this metric is trash, but looked at query by query it can be really useful.
Is this a good thing for reporting?
It’s very early days, but this could mean we get more accurate data within Search Console. What’s likely been happening is that GSC impressions were inflated by results that were never going to convert into a click, pulling our CTR down.
Looking long term, this is great news for reporting: we get better insight into how our websites actually rank in the SERPs and can make more accurate decisions. In the shorter term, though, it leaves us as Organic marketers with plenty to think about.
How should Organic marketers adapt?
The first round of reporting after this change is going to be tricky for brands and agencies, as visibility metrics such as impressions and rankings are likely to look much weaker.
The most important thing all Organic marketers should focus on this month is communication.
Tell your clients exactly what has happened and how they may be affected; this helps you avoid uncomfortable conversations when they get their next report. It’s also important not to use this change as an excuse for poor performance.
Practically, there are a few things you should be doing going forward.
Review your KPIs and reset expectations
If you’ve been reporting on metrics such as “total ranking keywords” or impressions as part of your KPIs, now is the time to have a conversation with your client about how these are affected.
Communicate to stakeholders that a dip in these numbers doesn’t necessarily mean a loss of performance, and pivot your KPIs towards more meaningful ones such as sessions, conversions and revenue. If you absolutely need to focus on keywords, prioritise top-10 rankings, as these are likely to be the most accurate.
Ultimately, you need to make sure your KPIs are aligned with what matters most to the business, and moving away from visibility metrics could actually be a good thing.
Audit your rank tracking set-up
Check what your current platform has decided to do. Are they now cutting off daily reporting of the top 100 results? Are they still giving you the same data, and if so, does this increase your monthly fee? Once you understand this, you can figure out your next steps.
If your tool has reduced its tracking frequency, do a deep dive into what this means for your reporting, so you’re prepared before you pull your next client report.
Consider modelling your GSC data
There are a lot of people who use impression data in GSC as a very important metric, so moving away from this may not be possible, at least in the short term. If this is you, you might want to consider modelling your GSC data to “guesstimate” your performance over the next few months.
It’s not an ideal solution, but modelling your data on the average percentage drop you’ve seen in impressions is the closest you’ll get to an accurate like-for-like comparison.
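As a minimal sketch of that approach (the figures, column names and 38% drop below are entirely hypothetical, and this is a crude adjustment rather than a statistical model), you could scale post-change weeks back up by the drop you’ve measured for your own site so trends stay comparable:

```python
import pandas as pd

# Hypothetical weekly GSC export; real data would come from the Search Console
# UI export or API. The 38% drop is an example figure, not a benchmark.
gsc = pd.DataFrame({
    "week": ["2025-09-01", "2025-09-08", "2025-09-15", "2025-09-22"],
    "impressions": [120_000, 118_000, 76_000, 74_000],
})

observed_drop = 0.38     # average % drop you measured for your own site post-change
cutover = "2025-09-15"   # approximate week the &num=100 change took effect

# Adjusted series: scale post-change weeks back up for a like-for-like trend line.
gsc["impressions_adjusted"] = gsc.apply(
    lambda row: row["impressions"] / (1 - observed_drop)
    if row["week"] >= cutover else row["impressions"],
    axis=1,
)
print(gsc)
```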
What this means for brands
If you work brand-side, you might be reading this after your agency or in-house SEO has come to you telling you performance is down but there’s a reason why. Let’s break down the facts you need to be mindful of when you receive your next report.
Visibility metrics will likely look lower, but this isn’t something to panic about
As outlined above, impression drops are going to be common in reports this month. However, the metrics that matter such as clicks, sessions, conversions and revenue remain unaffected. Ensure your team or agency has explained any unusual report changes in September so that internally everyone knows it’s an industry change, not a misstep in strategy.
Don’t focus on the number of keywords, focus on the quality of them
This change underscores that ranking for 1,000 keywords is meaningless if none of them are on page one or two. Brands should gauge SEO success by how visible they are where it counts: for example, the share of important category keywords they hold in the top results, or how their content performs against competitors in the top 10.
If your internal dashboards include a lot of “keywords in positions 11-100” style metrics, be prepared to shift those to “keywords in the top 10/top 20” or similar; a rough sketch of how that could be calculated is below. This can actually lead to more meaningful goal-setting and accountability for SEO efforts.
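Here’s a minimal sketch of that kind of metric. The keyword list and positions are invented, and your rank tracker’s export format will differ, but the counting logic is the same:

```python
# Hypothetical rank-tracker export: keyword -> current position (None = not ranking).
rankings = {
    "running shoes": 4,
    "trail running shoes": 9,
    "best running shoes 2025": 14,
    "running shoe repair": 37,
    "carbon plate shoes": None,
}

top10 = sum(1 for pos in rankings.values() if pos is not None and pos <= 10)
top20 = sum(1 for pos in rankings.values() if pos is not None and pos <= 20)
share_of_top10 = top10 / len(rankings)

print(f"Keywords in top 10: {top10}")
print(f"Keywords in top 20: {top20}")
print(f"Share of tracked keywords in top 10: {share_of_top10:.0%}")
```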
Use this as a moment to reset your Organic objectives
If you’ve previously focused on visibility metrics as core objectives for your Organic efforts, now’s the time to reset. It’s the prompt we need to ask ourselves strategic questions:
- Are we obsessed with ranking data at the expense of content quality or user experience?
- Are we chasing long-tail rankings just to bolster our keyword counts, despite minimal impact on organic traffic?
- How valuable are the keywords we’re ranking for?
Smart Organic marketers will use this as an opportunity to re-focus efforts on areas that drive growth.
The reality is, if a keyword is only being tracked for ego and not bringing any traffic, it might not be worth tracking at all. Freeing up that energy allows your marketing team or agency to concentrate on improvements that move the revenue needle.
Make sure you’re partnered with Organic experts who stay ahead of the game
This Google change highlights the importance of working with Organic teams who keep their finger on the pulse of the industry. This isn’t the first curveball from Google, and it definitely won’t be the last.
If you’re a brand, you want your team spending time executing strategy, not scrambling to patch tools or interpret confusing data shifts. By working with an agency that’s on top of these developments, you ensure that when Google “changes the rules,” your strategy adjusts seamlessly.
For example, at Embryo, we immediately sent an update to all of our clients, and we’ve been working hard ahead of the reporting period to ensure reports go out as planned with no nasty surprises.
A partner who is proactive in monitoring, adapting to and communicating about such changes can save your marketing team a lot of headaches and keep your SEO performance on track.
Google’s removal of the &num=100 parameter is a reminder that our industry is constantly evolving, and that Google can make changes this big at any time.
Moving forward, we expect rank-tracking solutions to continue evolving. Some will innovate to restore lost capabilities; others will double down on new metrics (e.g., “visibility indices” or Share of Voice in top 10) to demonstrate value in lieu of massive keyword lists.
Google itself may introduce official ways to get more data.
Regardless, by staying agile and working closely with your SEO team or agency, you can turn this situation into an advantage. Use it as a chance to re-strategise, ensure you’re prioritising high-impact work, and educate stakeholders on the quality of data over quantity.
If you’re struggling to get your head around this update and what it means for your brand, get in touch with Embryo; we can help!