
Dealing With E-commerce Product Filters

I often hear SEOs talking about how difficult e-commerce SEO is. I think this is usually because of the sheer size of e-commerce websites: run a crawl on one and it returns thousands and thousands of pages.

But you shouldn’t freak out because of the number of pages in a crawl.

E-commerce SEO is dead simple and should be no more complicated than your average content-based website. That is, as long as you sort out one of the major things that can make it an SEO nightmare:

Product filters.

Product filters on e-commerce sites provide great functionality (I use them all the time!) for your customers by easily allowing them to filter through your product range and view the types of products they are looking for.

If you can sort out product filters so they retain full functionality and cause no SEO issues, e-commerce SEO is simple.

How bad could product filters really be?

I’ve audited some of the biggest e-commerce websites in Australia and the most common issue I’ve come across is caused by product filters generating duplicate pages – sometimes in the millions. Yes, millions!!

In the most extreme case I’ve seen, there were more than 25 million crawlable pages and more than 500k pages indexed in Google. And this site only had about 7,000 products…

The site was also running four Magento installs (I have no idea why), had no canonical tags, no robots.txt file, and all pages set to meta robots “index, follow”. This site was like a black hole for search engines: once they entered, there was no escaping; they were only pulled in deeper and deeper.

How can you fix it?

While having a site that is virtually impossible to crawl will cause major issues for SEOs, these issues are also very easy to fix.

E-commerce filters typically create dynamic query-string URLs, which are fine, but most filters also tend to have an ‘SEO-Friendly’ URL structure option. They call it ‘SEO-Friendly’ because it adds your filter selections as additional folders to the URL. This cleaner, more readable URL means it’s better for SEO, right? Nope.
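To make the difference concrete, here are two hypothetical filter URLs for the same selection (a red shoe in size 9) – the query-string form and the folder-style ‘SEO-Friendly’ form:

```
/shoes?color=red&size=9      <- dynamic query-string filter URL
/shoes/color/red/size/9      <- 'SEO-Friendly' folder-style filter URL
```

The query-string version can be blocked with a single wildcard rule in robots.txt; the folder version looks just like a normal category page to a crawler, which is exactly why it is so hard to control.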

This ‘SEO-Friendly’ option is not SEO-friendly at all. In fact, it is usually the beginning of your SEO nightmare, because folder-style filter URLs make it much harder to control which pages are being crawled.

Have correct canonical tags and a good robots.txt file

Canonical Tags

Magento deals with product filters very well if you simply enable the canonical tag feature. This article shows you how to enable canonical tags in Magento.

With canonical tags enabled, product filter selections will be canonicalized back to the page you started filtering on, no matter how many filter options are selected. Perfect!

BUT if you have selected the ‘SEO-Friendly’ option for product filters, the filter URLs will canonicalize back to themselves instead of the original page. You do not want this!
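For example (hypothetical URLs), here is the difference between the canonical tag you want on a filtered page and the self-referencing one the folder-style URLs tend to produce:

```
<!-- On /shoes?color=red - what you want: canonical points back to the base category -->
<link rel="canonical" href="https://example.com/shoes" />

<!-- On /shoes/color/red with the 'SEO-Friendly' option - self-referencing, do not want -->
<link rel="canonical" href="https://example.com/shoes/color/red" />
```

With the first version, every filter combination consolidates back to one category page; with the second, every combination declares itself a unique, indexable page.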

Even once you’ve fixed the duplicate and indexable-page issues caused by the product filters, you could still be allowing search engines to crawl millions of canonicalized URLs.

This wastes your crawl budget and provides a confusing experience for search engines when they see that a high percentage of URLs on the site are canonicalized to other URLs.

You can easily save your crawl budget by adding a few simple lines to your robots.txt file.

Robots.txt

It can be as easy as adding this single line to your robots.txt file:

Disallow: /*?

This will block all of the dynamic product filter URLs from being crawled, but – depending on how pagination is handled – it can also block paginated pages from being crawled on your site.
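To see exactly which URLs that single rule catches, here’s a minimal sketch (in Python, with hypothetical URLs) of how a wildcard Disallow pattern matches paths. This mimics the wildcard handling Google describes for robots.txt, not a full parser:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Return True if a robots.txt path pattern matches a URL path.

    Mimics Google's wildcard handling: '*' matches any character
    sequence, '$' anchors the end, and matching starts at the
    beginning of the path. Illustrative sketch only.
    """
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

# 'Disallow: /*?' catches any URL containing a query string:
print(robots_pattern_matches("/*?", "/shoes?color=red&size=9"))  # True  (filter URL blocked)
print(robots_pattern_matches("/*?", "/shoes?p=2"))               # True  (pagination blocked too!)
print(robots_pattern_matches("/*?", "/shoes"))                   # False (plain category URL still crawlable)
```

Note the second result: the rule blocks paginated `?p=` URLs along with the filter URLs.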

You don’t want that, because it stops all of your product pages from being crawled, and authority from the rest of the site won’t flow through to them.

You can usually fix the issue of blocking paginated pages by adding the following to robots.txt:

Allow: /*?p=
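Putting the two rules together, a minimal robots.txt fragment might look like this (assuming your pagination uses a `?p=` parameter, as Magento does by default):

```
User-agent: *
Allow: /*?p=
Disallow: /*?
```

For Google, the most specific (longest) matching rule wins, so `/shoes?p=2` stays crawlable while `/shoes?color=red` is blocked. Rule order doesn’t matter to Google, but some crawlers process rules top-down, so listing the Allow first is the safer habit. As always, test this against your own URL patterns in Search Console’s robots.txt tester before deploying.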

How effective can these changes be?

On a client’s site that was in excellent shape apart from the duplicate issues caused by product filters, we saw a 100% increase in organic traffic within 2 months and have continued to see consistent monthly growth since the changes were implemented.

Takeaways from this post

One of the fundamental aspects of SEO is controlling the way search engines crawl and index a site. If you know how to do this well, e-commerce SEO is simple and nothing to be scared of!

Alternatively, if you don’t get this right you can have major SEO problems and your organic traffic will take a big hit.

Got any other e-commerce SEO issues? I’d love to hear about them and write a post about the most common issues people are having.