What are URL Parameters?
URL parameters – also known as query strings or URL query parameters – are elements appended to a URL (especially on e-commerce websites) that are used for traffic tracking, content structuring, and sorting.
Query parameters make it easier for visitors to navigate a website and quickly sort products on an e-commerce store. They also allow users to apply filters on a page and view different sets of items.
Besides that, URL parameters are used to track website traffic, which is important for measuring and aligning marketing campaigns.
The basics of URL parameters
Query or URL parameters come after a question mark “?” and consist of key-value pairs joined by “=”. Multiple parameters can be added to a URL, separated by “&”.
A typical URL with parameters looks like this:
https://www.yourwebsite.com/page?key1=value1&key2=value2
https://www.yourwebsite.com/women-shoes?color=red&sort=priceasc
In the second example above:
“?” marks where the query string begins
“Key” is the name of the variable (“color” and “sort” in the example)
“Value” is the value assigned to the variable (“red” and “priceasc” in the example)
“&” (ampersand) separates one parameter from the next
“=” separates a key from its value
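To make the structure concrete, here is a minimal Python sketch (using only the standard library's urllib.parse module) that splits the second example URL above into its keys and values:

from urllib.parse import urlparse, parse_qs

url = "https://www.yourwebsite.com/women-shoes?color=red&sort=priceasc"

parsed = urlparse(url)            # splits the URL into scheme, host, path, query, ...
params = parse_qs(parsed.query)   # turns "color=red&sort=priceasc" into a dict

print(parsed.query)   # color=red&sort=priceasc
print(params)         # {'color': ['red'], 'sort': ['priceasc']}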
How do URL parameters work?
According to Google, there are two types of URL parameters.
Active parameters (content-modifying parameters)
These parameters modify or reorder the content of the page. Their common uses are:
Filtering and sorting: On an e-commerce website, these are used to filter or sort the products and present more specific content to the users.
?color=blue
?sort=highest_rated
Pagination: Used to divide the content into a series of related pages.
?p=2
Translating: Used to change the language of the content on a page.
?lang=de
?lang=en
Searching: Used to find a particular piece of information through a website's internal search.
Depending on the platform, the parameter can be:
?q=seodebate
?s=seodebate
?search=seodebate
?query=seodebate
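To see how several active parameters combine into a single query string, here is a small Python sketch; the category path and the exact parameter names are illustrative assumptions rather than a standard:

from urllib.parse import urlencode

base = "https://www.yourwebsite.com/women-shoes"   # hypothetical category page
active_params = {"color": "blue", "sort": "highest_rated", "p": 2, "lang": "en"}

url = base + "?" + urlencode(active_params)
print(url)
# https://www.yourwebsite.com/women-shoes?color=blue&sort=highest_rated&p=2&lang=en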
Passive parameters (tracking parameters)
Passive parameters do not change or reorder the content of a page; instead, they are used to track website traffic.
Their common examples include:
Affiliate IDs: These are used to identify the source the traffic came from.
?id=seodebate
Session IDs: Session IDs are another way to track a particular user; however, this is not a common practice nowadays.
?sessionid=12345
Advertising tags: These are used to track the traffic from advertising campaigns.
?utm_source=newsletter
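Tracking parameters are normally appended to an existing URL rather than replacing its content-related parameters. A minimal Python sketch of that idea, with made-up UTM values, might look like this:

from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def add_tracking(url, tracking):
    # Append tracking parameters while keeping any existing parameters.
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))   # existing parameters, if any
    query.update(tracking)                 # add the tracking parameters
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_tracking(
    "https://www.yourwebsite.com/women-shoes?color=red",
    {"utm_source": "newsletter", "utm_medium": "email"},
))
# https://www.yourwebsite.com/women-shoes?color=red&utm_source=newsletter&utm_medium=email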
URL parameters and SEO
Usually, it is good practice to avoid unnecessary URL parameters because they can negatively impact your website in several ways.
They may slow down the crawling of your website. Moreover, websites with several passive parameters – where the content of the page remains the same – end up with multiple URLs that offer no unique or valuable content.
Issues caused by URL parameters
The most common issues that can occur due to URL parameters include:
Content duplication
Search engines treat every URL as a separate page. When URL parameters generate several URLs without any real change in the content, the result is duplicate content.
The content that appears on a page after reordering is almost identical to the original, so search engines may treat it as a copy of the original page.
Impact on crawl budget
Google emphasizes keeping URL structure simple and descriptive. When multiple URL versions all point to the same content, the crawl budget is negatively impacted, as crawlers treat those URLs as low-quality pages.
Instead of wasting their budget repeatedly crawling the same content, crawlers should be able to move on to the other pages of your site. As Google explains:
Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site. As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.
Ranking difficulty
When several URLs on your website all point to the same page, crawlers can get confused – and the page you want to rank for a particular keyword may not be the one that gets crawled and indexed.
If other websites link to your site through parameterized URLs, the link juice is split among the different versions and the ranking of your primary page is diluted.
SEO best practices for URL parameters
Link only to the static page or main page
If there are parameterized URLs on your website, make sure to internally link only to the main or static page. In the example below, there are several versions of a URL.
Link only to the one that is the main page or URL to avoid sending mixed signals to search engines.
Main or Static URL
www.yourwebsite.com/ladies-bags
Search query URL
www.yourwebsite.com?q=ladies-bags
Product type filter
www.yourwebsite.com/ladies-bags?type=totes
Color-filtered URL
www.yourwebsite.com/ladies-bags?color=blue
Here the static page is www.yourwebsite.com/ladies-bags, so you should link only to that page rather than to the other parameterized versions.
Use canonicalization method
When there are several versions of a page, you can point search engines to the main page by using canonical tags.
Once you have decided which one is the main or static page, add a canonical tag to every other URL version pointing to that main (canonical) page.
In the above example, all parameterized URLs should include a canonical tag pointing to the static page (www.yourwebsite.com/ladies-bags). This tells search engines which version of the page is the preferred one.
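For reference, the canonical tag placed in the <head> of each parameterized version might look like this (the https:// scheme is assumed here, since the example URLs above omit it):

<link rel="canonical" href="https://www.yourwebsite.com/ladies-bags" />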
Use robots.txt to block crawlers
You can use robots.txt to block crawlers from accessing parameterized URLs on your website. You can instruct bots by adding a rule like this:
Disallow: /*?tag=*
This rule blocks compliant crawlers from crawling any URL that contains the “tag” parameter (to block every URL that contains a question mark, you could use Disallow: /*?* instead). When using such rules, make sure no important URLs match the pattern, otherwise they will be blocked as well.
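As an illustration, a robots.txt file might combine several such rules; the parameter names below are only examples and should be adjusted to the parameters your site actually uses:

User-agent: *
# Block any URL containing the "tag" parameter
Disallow: /*?tag=*
# Block any URL containing a (hypothetical) session ID parameter
Disallow: /*?*sessionid=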
In 2009, Google introduced the URL Parameters tool to handle parameterized URLs by letting site owners tell Google to ignore specific parameters or combinations of parameters.
However, in 2022, Google deprecated the tool, saying it had become much more efficient at guessing which parameters are useful and which are not:
When the URL Parameters tool launched in 2009 in Search Console’s predecessor, Webmaster Tools, the internet was a much wilder place than it is today. SessionID parameters were very common, CMSes had trouble organizing parameters, and browsers often broke links. With the URL Parameters tool, site owners had granular control over how Google crawled their site by specifying how certain parameters affect the content on their site.
Over the years, Google became much better at guessing which parameters are useful on a site and which are —plainly put— useless. In fact, only about 1% of the parameter configurations currently specified in the URL Parameters tool are useful for crawling. Due to the low value of the tool both for Google and Search Console users, we’re deprecating the URL Parameters tool in 1 month.
Takeaway
It is important to handle URL parameters properly to avoid duplicate content issues and to stop link juice from being passed to unnecessary URLs.
As a good SEO practice, choose descriptive and meaningful names for URL parameters, and use canonical tags to consolidate the duplicate content they may cause.
Moreover, you can tell search engines which parameterized URLs should not be indexed by using a noindex robots meta tag (or the X-Robots-Tag HTTP header).
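For example, a parameterized page can be kept out of the index with a robots meta tag like this in its <head>:

<meta name="robots" content="noindex">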