
Layered Navigation block disallow


viktor123


Hi, 

 

I recently noticed that Google Webmaster Tools shows duplicate content for all pages with layered navigation results (sizes, colors, etc.). Is there any way to block these results in robots.txt or noindex them? The result URLs use the selected_filters attribute:

www.domain.com/10-somecategory?n=10&selected_filters=-xs&id_category=10&p=2


What I did on a client's site to combat that is write a little script that uses regex to generate a canonical URL.

 

OK, that sounds like a solution. I tried adding these rules to robots.txt:

Disallow: /*controller=selected_filters

Disallow: /*controller=filters

but that probably won't solve it.
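
For reference, a rule that targets the query parameter itself (rather than a controller name) would probably need to look more like the line below. This is only a sketch based on the URL format above, and Disallow only blocks crawling; it does not noindex pages Google already knows about:

Disallow: /*selected_filters=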

Any chance I could take a look at your script?

 

Thanks


Sure, it is really rough looking and part of a much larger script, but this is what I have for it so far.

{* Build a canonical URL by stripping the layered-navigation parameters from the request URI *}
{* Remove a trailing products-per-page parameter, e.g. &n=10 *}
{assign var="can" value=$smarty.server.REQUEST_URI|regex_replace:"/&n=\d+$/":""}
{* Remove a trailing &id_category=NN *}
{assign var="can1" value=$can|regex_replace:"/&id_category=\d+$/":""}
{* Remove the ?selected_filters= marker *}
{assign var="can2" value=$can1|regex_replace:"/\?selected_filters=/":""}
{* Remove a trailing ?id_category=NN *}
{assign var="can3" value=$can2|regex_replace:"/\?id_category=\d+$/":""}
{* Remove any remaining n=NN& pairs *}
{assign var="can5" value=$can3|regex_replace:"/n=\d+&/":""}
<link rel="canonical" href="http://{$smarty.server.HTTP_HOST}{$can5}"/>
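
If you also want to go the noindex route from your first post, a minimal sketch in the same template would be to check for the selected_filters parameter (this assumes the filtered pages always carry it in the query string):

{if isset($smarty.get.selected_filters)}
<meta name="robots" content="noindex, follow"/>
{/if}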
