
Robots.txt, Multisite, and Duplicate Content



Hello Everyone.

 

I have a multisite setup with some extra stores that sell the same things. The domain names are different, but Google has flagged them as duplicates.

 

Now I'm trying to get only the main site indexed and to block Google from the other multistore URLs.

 

But how can I do this?

 

Can I put "Disallow: /example.com" in robots.txt

and still have my main store indexed?

 

PrestaShop only has one robots.txt for all stores.
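
One idea I am looking at is serving a different robots.txt per domain from .htaccess instead (just a sketch on my side, assuming Apache with mod_rewrite and a hypothetical robots_noindex.txt file; example.com stands in for one of the extra stores):

RewriteEngine On
# Serve the restrictive robots file only to the extra domain(s)
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^robots\.txt$ robots_noindex.txt [L]

robots_noindex.txt would then just contain "User-agent: *" and "Disallow: /", while the normal robots.txt stays in place for the main store, though I am not sure blocking crawling alone is enough to get already indexed pages removed.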

 

I have put this in theme/multistorethemeiwantexcluded/header.tpl:

 

<meta name="robots" content="index,nofollow">

 

Thanks


Hello El Patron.

 

Thank you for the reply.

 

I have several shops in my multistore.

 

My main website - www.myelectronicstore.com (this is the site I want to keep in the Google index)

 

Another website - www.cheapelectronics.com (remove from Google because of duplicate content)

 

Another website - www.electronic24/7.com (remove from Google because of duplicate content)

 

The domains ARE NOT the same, BUT the product descriptions are ALL the same.

 

I made a big mistake by thinking this was a good idea.

 

I want to keep all the "other website" domains, but remove them from the Google index.

 

So I did this:

 

<meta name="robots" content="noindex,nofollow">

 

in every theme/multistorethemeiwantexcluded/header.tpl,

 

but NOT in the theme of the main site I want to keep indexed.

 

 

Will Google sooner or later remove the links of the excluded sites from its index?

 

 

And will this work without changing the robots.txt:

 

<meta name="robots" content="noindex,nofollow">
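
If editing the templates turns out to be unreliable, I was also wondering whether I could send the same signal as an HTTP header from .htaccess instead of the meta tag (only a sketch, assuming Apache with mod_setenvif and mod_headers enabled; the domain is one of my example shops):

# Flag requests for the shops that should disappear from Google
SetEnvIfNoCase Host "^(www\.)?cheapelectronics\.com$" NOINDEX_SHOP
# Send the noindex signal as a response header for those hosts only
Header set X-Robots-Tag "noindex, nofollow" env=NOINDEX_SHOP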

 

Thanks


Since I don't agree that this will help you, I'll share some of my localization experience, especially when it comes to serving the same content on multiple domains.

 

I don't see any point in 'removing' them from the index. The reason is that nobody will find the 'other' shops then, and it's all about SEO, by which I mean the organic (non-paid) portion.

 

There is also currently no way to stop your sitemap from including the additional domains, i.e. there is no include/exclude option per shop in the sitemap.

 

IMHO, you are better served getting new URLs and replacing your gTLDs. Then, in your .htaccess, put a 301 redirect from each old duplicate gTLD to its new ccTLD. That will fix your duplicate content problem with the major search engines.
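
Something along these lines in the .htaccess would do it (just a sketch with made-up domain names, assuming Apache with mod_rewrite; the main store's domain is left untouched):

RewriteEngine On
# 301 the old duplicate gTLD to the new ccTLD, keeping the requested path
RewriteCond %{HTTP_HOST} ^(www\.)?cheapelectronics\.com$ [NC]
RewriteRule ^(.*)$ https://www.cheapelectronics.co.uk/$1 [R=301,L]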

