In September 2025, Google completely removed the num=100 parameter that could once be appended to search URLs.
This parameter, which some site owners nicknamed the “long-tail traffic” switch, let users “display 100 results” per search (e.g., https://www.google.com/search?q=keyword&num=100), allowing them to browse more results at once and giving websites ranked 11–100 additional exposure opportunities.
After the update, 87.7% of websites saw a cliff-like drop in search impressions, hitting hardest the content-heavy sites that relied on “long-list paging” traffic and the vertical sites optimized for long-tail keywords.
Third-party tools (such as Ahrefs and SEMrush) began showing messy data: because their crawling logic depended on the old parameter, some platforms reported impression-statistic errors exceeding 30%, along with abnormal ranking fluctuations.

What is the num=100 parameter
num=100 is a parameter in Google search URLs (full format: &num=100), originally used to control “the number of search results displayed per query.”
SEO tools (such as SEMrush and Ahrefs) often used it to batch-crawl multiple pages of results (e.g., setting it to 100 to fetch 100 results in one request). This also let some tools inflate data volume by calling it repeatedly, making “impressions” and “keyword coverage” in Search Console appear artificially high.
Google Search’s “Pagination Control Button”
num=100 was a hidden parameter in the Google Search Results Page (SERP), directly determining “how many results a single request could return”.
- Common values: 10 (default, 10 results per page), 20, 50, 100 (maximum supported 100 results).
- How it worked: when a user searched for a keyword, a tool (such as SEO software) could append num=100 to the URL, and Google would return the top 100 results at once instead of the default 10.
A concrete example:
A regular user searching “wireless earphone recommendation” would see 10 results on page one; but an SEO tool using num=100 could directly “pull” all 100 results spanning pages 1–10 (10 results per page) in a single request.
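As a minimal sketch of the URL mechanics described above (the helper below is illustrative, not any tool’s actual code):

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def serp_url(query: str, num: int | None = None) -> str:
    """Build a Google SERP URL; `num` controlled results per page before Sept 2025."""
    params = {"q": query}
    if num is not None:
        params["num"] = num  # ignored by Google since September 2025
    return f"{BASE}?{urlencode(params)}"

print(serp_url("wireless earphone recommendation"))           # default: 10 results
print(serp_url("wireless earphone recommendation", num=100))  # formerly: top 100 at once
```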
How it was relied upon by tools
Keyword ranking statistics: using num=100 to fetch 100 results at once, tools could check where a target keyword appeared across pages (e.g., “laptop recommendation” rankings).
Competitor analysis: grabbing the top 100 results for multiple keywords simultaneously, comparing which competitors held positions across them.
Data coverage evaluation: by counting the number of results retrieved, estimating the “overall search result volume” for a keyword (e.g., retrieving 500 results implied the keyword had larger search volume).
For example, SEMrush’s “Keyword Magic Tool” once relied on the num=100 parameter, claiming it could “generate ranking data for 1,000 keywords in 10 seconds” — such efficiency depended on num=100’s bulk-fetching ability.
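A simplified sketch of the ranking-check logic such tools built on top of num=100; `rank_of` and the toy result list are hypothetical, not SEMrush’s real implementation:

```python
def rank_of(domain: str, results: list[str]) -> int | None:
    """Return the 1-based position of the first result belonging to `domain`, or None."""
    for pos, url in enumerate(results, start=1):
        if domain in url:
            return pos
    return None

# With num=100, one request yielded all 100 URLs; today the same check
# needs up to 10 paginated requests to cover ranks 11-100.
serp = ["https://example-shop.com/laptops", "https://competitor.io/best-laptops"]  # toy data
print(rank_of("competitor.io", serp))  # -> 2
```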
Inflated data impacting judgment
In July 2025, the LOCOMOTIVE agency tested 200 e-commerce sites and found:
| Comparison Item | Impressions reported by num=100-based tools | Real user-generated impressions (no tool interference) | Inflation Ratio |
|---|---|---|---|
| Short-tail keyword (e.g., “Bluetooth earphones”) | 12,300 | 7,400 | 66% |
| Mid-tail keyword (e.g., “noise-cancelling Bluetooth earphones men”) | 8,900 | 5,400 | 65% |
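The inflation ratio column is plain arithmetic, (tool − real) / real; a quick check:

```python
def inflation(tool_count: int, real_count: int) -> float:
    """Percentage by which the tool-reported figure exceeds the real one."""
    return (tool_count - real_count) / real_count * 100

print(f"{inflation(12_300, 7_400):.0f}%")  # short-tail row -> 66%
print(f"{inflation(8_900, 5_400):.0f}%")   # mid-tail row   -> 65%
```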
Specific impacts of inflated data:
- Website operators misjudging effectiveness: a small e-commerce site, seeing a tool display “100k+ impressions” for a keyword, invested resources to optimize it, but actual clicks were only 30k, leading to wasted budget.
- Industry reports distorted: in Q2 2025’s “Global SEO Trends Report,” “short-tail keyword competition difficulty” was overestimated by 30%, because many tools amplified impression data via num=100.
After removal, 87.7% of websites saw impression declines
In September 2025, after Google removed the num=100 parameter, global impressions for 2.3 million active websites saw broad declines (Ahrefs sample stats, September 21).
Among them, content-heavy sites (like the tech blog TechReviewHub) saw average drops of 34%, vertical long-tail-optimized sites (like local home guide SiteHomeGuide) dropped 41%, and “middle-tier” sites ranked 11–100 (like accessory seller AccessoriesNow) lost over 80% of impressions as users could no longer use the parameter to see extended results.
Long-list traffic vanished completely
In the past, users could modify the search address bar with the num=100 parameter (e.g., https://www.google.com/search?q=laptop&num=100), to let Google display 100 results at once.
This gave sites ranked 11–100 “secondary exposure”: Jumpshot’s 2024 research showed that about 12% of users actively adjusted the num parameter to view more results, with 63% of those clicks landing on ranks 11–50.
After the parameter was removed, Google by default only displays 10 results per page (20 in some test regions), and users can no longer adjust the number in the address bar. This means:
- Websites previously relying on “long-list paging” lost about 12% of potential clicks;
- Websites ranked beyond 50th place saw impressions drop by more than 70% (SEMrush tracking 100k small/medium sites).
| Rank range | Impression share before removal | Impression share after removal | Change |
|---|---|---|---|
| Ranks 1–10 | 68% | 82% | +14 pp |
| Ranks 11–50 | 20% | 12% | −8 pp |
| Ranks 51–100 | 12% | 6% | −6 pp |
Data source: SEMrush September 2025 search results stats (sample size: 10 million queries)
Three types of websites hit hardest
1. Large-content but lower-ranked “repository-type” sites
A typical example: tech blog TechReviewHub (focused on device reviews). Previously, when users searched for long-tail queries like “2023–2025 smartphone chip performance comparison,” its review articles often ranked around positions 20–30, gaining ~50k monthly impressions via the num=100 parameter. After removal, such long-tail results only show the top 10, and TechReviewHub’s impressions plunged 67%.
2. Local service vertical sites
For example, SiteHomeGuide (a local home improvement guide) targeted the core keyword “Brooklyn NY old house renovation company recommendations” (monthly search volume ~800). In the past, ranking around 15–20, it earned 2k+ monthly impressions thanks to long-tail optimization. Now, with only the top 10 shown, SiteHomeGuide’s impressions dropped to zero — local service websites previously got 30% of “regional precise traffic” from long-list paging (BrightLocal 2025 survey).
3. Mid-tier “generalist” websites with average content quality
These sites (e.g., AccessoriesNow, an accessory e-commerce seller) covered broad content but lacked dominance, and their core keywords long ranked around 10–20. Previously, when users searched “wireless earphone case,” curiosity led some to click through to page 2 (results 11–20). Now, with only page one visible by default, AccessoriesNow’s impressions fell from 120k/month to 40k/month, a 67% drop.
Impact on third-party platform data updates
The “simulate multipage crawling” model used by third-party tools was broken — they once added num=100 to grab pages 2–10, but Google no longer returns such “non-standard result pages,” weakening their “full result coverage” by over 40%.
Tools’ “data grabbing” methods stopped working
Third-party SEO tools (like Ahrefs, SEMrush) worked by simulating user searches, adding parameters (e.g., num=100, tbs=qdr:m) to crawl Google result pages, then calculating keyword rankings, impressions, etc.
Among these, num=100 was key for fetching “long-tail results”: normally a single request returned only the top 10 results (one page), but with num=100, tools could get the top 100 results, the equivalent of pages 1–10, in one request.
Example:
If a tool wanted to check rankings for the long-tail query “handmade ceramic cup personalized custom,” in the past it would request https://www.google.com/search?q=handmade+ceramic+cup+personalized+custom&num=100, fetching 100 results at once. Now, with num=100 no longer supported, it can only grab the default 10 results per request, losing the 90 results that would have come from pages 2–10.
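A sketch of the before/after request pattern (URL-building only; `start` is Google’s standard pagination offset, and real crawlers also need proxies and anti-bot handling that are out of scope here):

```python
from urllib.parse import urlencode

def old_crawl_urls(query: str) -> list[str]:
    """Pre-removal: one request covered positions 1-100."""
    return [f"https://www.google.com/search?{urlencode({'q': query, 'num': 100})}"]

def new_crawl_urls(query: str, depth: int = 100) -> list[str]:
    """Post-removal: ten paginated requests via the `start` offset for the same coverage."""
    return [
        f"https://www.google.com/search?{urlencode({'q': query, 'start': offset})}"
        for offset in range(0, depth, 10)
    ]

q = "handmade ceramic cup personalized custom"
print(len(old_crawl_urls(q)))  # 1 request
print(len(new_crawl_urls(q)))  # 10 requests -> roughly 10x the crawl cost per keyword
```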
Where the data bias shows up
We compiled real test comparisons from 3 major tools (using the keyword “European niche museum tours” as an example):
| Metric | Before adjustment (num=100 supported) | After adjustment (num=100 removed) | Bias range |
|---|---|---|---|
| Keyword coverage count | 1200 | 720 | -40% |
| Long-tail keywords (<100 searches/month) ranking error | Avg ±3 positions | Avg ±8 positions | +167% |
| Impression statistics | <5% error vs. Google Search Console | 20%–35% error | Significantly higher |
Note: test time September 20–22, 2025, sample: 50 medium-to-large websites in North America.
Major tool providers’ responses
- Ahrefs: introduced a “core results priority” mode, defaulting to only counting the first 10 pages (10 results per page), while offering an API to let users manually upload “historical num=100 data” for calibration. But users reported its “long-tail coverage” is still down 35% compared to before.
- SEMrush: updated crawler algorithms to simulate “infinite scrolling” (Google’s default SERP scroll feature) to fetch more results, but still cannot replace num=100’s forced 100-per-page output. Their technical docs state: “The new algorithm can capture ~15% more long-tail results, but cannot replicate num=100’s dense clustering.”
- Moz: launched a “data confidence score” feature, tagging affected keywords with “low reliability” (red marks), and advising users to cross-check with Google Search Console’s “real-time click data.” Some users complained the scoring lacked transparency (e.g., why some long-tail keywords marked red, others yellow).
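A toy illustration of what such a confidence score might look like; the thresholds below are invented for this sketch, since Moz has not published its actual rules (the very opacity users complain about):

```python
def confidence_tag(rank: int, monthly_searches: int) -> str:
    """Toy heuristic: deep-ranked, low-volume keywords are hit hardest by the change."""
    if rank > 50 and monthly_searches < 100:
        return "red"     # low reliability: long-tail and beyond the cheaply crawlable depth
    if rank > 10 or monthly_searches < 500:
        return "yellow"  # partially affected
    return "green"       # top-10 data is still directly observable

print(confidence_tag(rank=62, monthly_searches=40))   # -> red
print(confidence_tag(rank=15, monthly_searches=800))  # -> yellow
```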
By removing num=100, Google aimed at two goals:
- Restore authenticity to Search Console data: impressions and keyword coverage now only count organic results actually served to real users.
- Push tools away from “bulk scraping” toward more compliant data access (such as Google’s official API).
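One such compliant route is Google’s Custom Search JSON API. A minimal sketch is below; note that this API caps num at 10 and enforces daily quotas, so it is not a drop-in replacement for num=100, and the api_key and cx values are your own credentials:

```python
import requests  # third-party: pip install requests

API = "https://www.googleapis.com/customsearch/v1"

def search(query: str, api_key: str, cx: str, start: int = 1) -> list[str]:
    """Fetch one page (max 10 results) from the Custom Search JSON API."""
    resp = requests.get(API, params={
        "key": api_key,  # your API key
        "cx": cx,        # your Programmable Search Engine ID
        "q": query,
        "num": 10,       # the API caps num at 10; deeper coverage needs `start` paging
        "start": start,  # 1-based result offset: 1, 11, 21, ...
    }, timeout=10)
    resp.raise_for_status()
    return [item["link"] for item in resp.json().get("items", [])]
```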