
Add page numbers in 1.6?


Mister Denial


Hello,

 

Tomer wrote a really great guide on how to prevent duplicate content errors by adding page numbers to the page title meta tag: http://www.presto-changeo.com/en/content/6-prestashopseotips

 

But in 1.6 the code is different.

 

How can merchants create a unique page title / description in 1.6?

 

Thanks for your help, and kind regards,

 

Daniel


It really is not a PrestaShop issue, it is an issue with how search engines treat content. Here is a link that might explain it better than I can. What I suggest to my clients is to rel=canonical to the main category page and noindex the paginated pages.

 

http://searchenginewatch.com/article/2240729/Why-Thin-Content-is-Hurting-Your-Ecommerce-Site-How-to-Fix-It
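For PS 1.6, a rough sketch of that approach could go in the theme's header.tpl. This is untested and assumes the stock $page_name and $link variables, the ?p= pagination parameter, and that id_category is available in the GET scope; verify all of that on your install:

    {* Sketch only: noindex paginated category pages and point their
       canonical at page 1. Assumes stock PS 1.6 template variables. *}
    {if $page_name == 'category' && isset($smarty.get.p) && $smarty.get.p > 1}
        {assign var=cat_id value=$smarty.get.id_category|intval}
        <meta name="robots" content="noindex, follow" />
        <link rel="canonical" href="{$link->getCategoryLink($cat_id)|escape:'html':'UTF-8'}" />
    {/if}

getCategoryLink() should accept a bare category id in 1.6, but double-check that against your version before relying on it.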

 


Ahh ok, give me a little bit and I will get it written up as a tutorial and posted tonight.

 

That might be an interesting post for your blog, no? Two approaches to the duplicate title / description issue in Webmaster Tools: option 1) adding page numbers, or option 2) rel=canonical to the main category page and noindex on the rest.

 

That way people could choose according to their preferences. Personally, I initially wanted to add the page numbers, but after the link you posted about thin content, I think I would go with the rel=canonical / noindex solution (if I knew how to implement it correctly).
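For reference, option 1 in 1.6 would probably just mean editing the <title> line in the theme's header.tpl, roughly like this (a sketch; $meta_title and the ?p= parameter are stock 1.6, but test it on your theme):

    {* Sketch of option 1: append the page number on paginated pages
       so every page gets a unique title. Untested. *}
    <title>{$meta_title|escape:'html':'UTF-8'}{if isset($smarty.get.p) && $smarty.get.p > 1} - {l s='Page'} {$smarty.get.p|intval}{/if}</title>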


Hey, I don't want you to think I am ignoring you; I actually don't know how Google handles the paginated pages with the # anchor alone. I posted a question to the people on the Moz forum for advice, you can check it out here: http://moz.com/community/q/wrapping-my-head-around-an-e-commerce-anchor-filter-issue-need-help  No one has answered yet, but I am looking forward to hearing what people have to say.


Sorry, maybe the new rel="next" and rel="prev" markup is what might tackle the problem?

https://support.google.com/webmasters/answer/1663744?hl=en

  • Use rel="next" and rel="prev" links to indicate the relationship between component URLs. This markup provides a strong hint to Google that you would like us to treat these pages as a logical sequence, thus consolidating their linking properties and usually sending searchers to the first page.
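Concretely, the head of page 2 in a three-page category would carry something like this (hypothetical URLs):

    <!-- Illustration only: rel prev/next as seen from page 2 of 3 -->
    <link rel="prev" href="http://example.com/en/3-clothes" />
    <link rel="next" href="http://example.com/en/3-clothes?p=3" />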

Regards, Trip


I am not sure that it would work with an anchor, though. And even if you do use them, you are still exposed to duplicate content, because the proportion of identical on-page content from page to page is high, i.e. the category description does not change, etc. They are really made for multi-page articles and for sites where the category description changes from page to page.


Yes, maybe. I am quite positive, because as mentioned in the article:

  • Do nothing. Paginated content is very common, and Google does a good job returning the most relevant results to users, regardless of whether content is divided into multiple pages.

So I would be surprised if Google did not "understand" the URL pattern and you got a penalty or whatever. It will not hurt to use the extra markup, though. It's a little late here now and I haven't dived into the topic too much. I think duplicate content is much overrated. When you get a couple of warnings in WMT for a couple of pages with the same meta descriptions, so what? There is nothing against optimizing, and I could reduce thousands of these warnings on my product pages just by adding the product reference code to the metas (see the sketch below). Does this give extra value for the customer? No. Did the warnings disappear? Yes. Do I rank better? Not sure ;)

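(For what it's worth, that meta description tweak could look roughly like this in a 1.6 header.tpl; that $product is in scope in the header on product pages is an assumption you should verify:)

    {* Sketch: make product meta descriptions unique by appending the
       reference code. Assumes $page_name and $product are available here. *}
    <meta name="description" content="{$meta_description|escape:'html':'UTF-8'}{if $page_name == 'product' && isset($product) && $product->reference} ({l s='Ref.'} {$product->reference|escape:'html':'UTF-8'}){/if}" />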
All the best, Trip


Well, at the moment I can't draw a connection between the Panda update and duplicate content. What I wanted to point out is that adding unhelpful information to metas is a way to reduce the GWT warnings, but it really does not enhance the quality of a site, and I am quite positive Googlebot understands common website patterns, be it blogs, ecommerce websites or whatever.

Google's focus is, imo, to provide the best search results for users, hence not to penalize websites because they forgot page numbers in the titles and metas of paginated pages.

I believe Google values "quality signals" like rich snippets, extra markup, etc., but here are two quotes from the horse's mouth:

Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don't follow the advice listed above, we do a good job of choosing a version of the content to show in our search results.

and last but not least:


You can help your fellow webmasters by not perpetuating the myth of duplicate content penalties!

http://googlewebmastercentral.blogspot.de/2008/09/demystifying-duplicate-content-penalty.html

 

Best regards, Trip


@Dh42: no worries Lesley, I didn't think you were ignoring me. ;-) I understand this is a rather complex issue, which is why I think it might be of much interest to other merchants and webmasters, as there is obviously a lot of uncertainty about how to handle the fear of duplicate content.

 

After reading all the articles you and Trip linked, my preferred solution as a merchant would be to make the first page of a category the one that matters most, signal this to Google by pointing the canonical tag of the paginated pages at it, and make those subsequent pages noindex. That would be my solution of choice, because:

 

1) it gives me more control over which page Google actually displays

2) it gives me the ability to turn the first category page into a real landing page

 

I agree that Google might be able to understand that subsequent pages are just a continuation of the first page, and might not slap a penalty on your site because of that, but the article also says it can affect performance. And with Google, better not take any chances.

 

I think this would be a really interesting blog article, debating the pros and cons of using page numbers vs canonical / noindex of sub-pages, and maybe offering a solution on how to implement both.

 

While blocking p= pages in robots.txt is easy for most users (see the sketch below), implementing the canonical tag is not. I, for instance, am not sure how to do it. I know in PS 1.4 there was some sort of canonical URL option, but I don't think that ever really worked.
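For the robots.txt part, something like this should do (a sketch assuming the stock ?p= pagination parameter; note that Disallow stops crawling rather than indexing per se, which is one more reason to prefer the canonical/noindex approach):

    # Sketch: keep paginated category URLs out of the crawl.
    User-agent: *
    Disallow: /*?p=
    Disallow: /*&p=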

