Shhhh's Achievements
  1. Yes, you are right! In addition, combining a "free shipping" product with a non-free-shipping product would also reduce the total shipping cost. This solution, besides not being suitable for all situations, should only be applied to special products, not to an entire website. Managing these costs can be quite an issue too.
  2. Hi jonsen1, That is not a recommended way to do a redirect! You should really change it if you have it set up that way. Use one of the examples above instead!
  3. Hi guys, I am thinking about a different solution to the duplicate meta description/title issue on paginated pages (besides adding the page number or sort order to them), a solution that would be appreciated by Google (and others) too. The objective would be to add the recommended rel=next and rel=prev link tags in the header of the pages. Issues I had in mind: - pagination is "calculated" later, yet the link tags must go in the header (how do we get the URLs for next and prev?) - the first and last page of a set must not contain prev and next, respectively Could any of the experts in here provide some hints on where to start with this "quest"? It would be helpful for many people. Thanks
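To make the idea concrete, this is roughly what the rendered <head> would need to contain on an intermediate page of a paginated category. The URLs and the ?p= parameter here are only illustrative, not a working PrestaShop implementation; the first page would omit rel=prev and the last page would omit rel=next:

```html
<!-- Sketch of the desired output for page 2 of a 3+ page category listing.
     URLs are placeholders; the real ones would have to be built from the
     same data the pagination block uses, before the header is rendered. -->
<link rel="prev" href="https://www.example.com/category?p=1" />
<link rel="next" href="https://www.example.com/category?p=3" />
```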
  4. Hi, Sorry, no idea why you have so many errors. The solution I posted was for module version 1.7. Maybe you should update the module to that version? What you could do is check the errors and see what the duplicates are. Then you could figure out the reason and a possible solution.
  5. And a question: did you manage to get this working with non-free shipping? If yes, you could use the negative "additional_shipping_cost" value trick I mentioned here.
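For context, the trick referenced above amounts to giving a product a negative value in its additional shipping cost field so it offsets part of the carrier fee. A sketch of what that could look like directly in the database (the table prefix, product ID, and amount are hypothetical; back up your database and test the resulting cart totals before relying on this):

```sql
-- Hypothetical example: reduce the shipping total by 2.50 whenever
-- product 42 is in the cart, via a negative additional shipping cost.
UPDATE ps_product
SET additional_shipping_cost = -2.50
WHERE id_product = 42;
```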
  6. Hey carapa, Do you really need the 5 page checkout? Try it out with the one-page checkout and see if it works.
  7. Additionally, you might want to add $smarty->assign(array('nobots' => true)); at the top of the displayPageForm function in your sendtoafriend2.php file. It made me feel a little more secure that those pages won't get indexed again.
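For anyone wondering what assigning nobots actually does: the theme's header template can check that flag and emit a noindex meta tag. This is the pattern used by the default PrestaShop themes of that era; check your own theme's header.tpl, as the exact markup may differ:

```smarty
{* In the theme's header.tpl: if a controller assigned $nobots,
   tell crawlers not to index or follow this page. *}
{if isset($nobots)}
    <meta name="robots" content="noindex,nofollow" />
{/if}
```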
  8. Hey netavenue, I think my post here should be what you're looking for.
  9. It seems the best help you can get is only from yourself... For those interested in a solution (other than editing your sitemap manually), here is what you need to do: Edit /modules/gsitemap/gsitemap.php, go to line 196 (or search for $pages) and comment out the line pointing to /index.php and any other page you want out of your sitemap (a good idea is to comment out the pages that are restricted by your robots file):

     $pages = array(
         'authentication' => true,
         //'best-sales' => false,
         'contact-form' => true,
         //'discount' => false,
         //'index' => false,
         'manufacturer' => false,
         'new-products' => false,
         'prices-drop' => false,
         //'supplier' => false,
         //'store' => false
     );

     NOTE: be careful not to comment out the ");". Move it onto an empty line of its own. Generate your sitemap and submit it. Should work.
  10. The strange thing is that the second entry has different data compared to the first one. I know it can be removed manually; that is what I wanted to avoid...
  11. Hi guys, Due to the recent changes in the Google Webmaster Tools interface, you are no longer allowed to submit a sitemap that contains a duplicate URL. Unfortunately, there is a duplicate in the generated sitemap, namely the root of the site (www.site.com/ appears twice). Does anyone know how to remove the second entry? It would be highly appreciated! Thanks.
  12. Hi Mike, Any news on that feature? I'm on 1.4.5 and it still deletes all the missing info. Thanks
  13. I looked in there but I couldn't find any reference (I checked the internal links). Any hints on where they should be? PS: the report is from a different crawler (SEOMoz)