
[Solved] Google ignores Disallow: /*p= in robots.txt?


Mister Denial


Hey everyone,

 

I just noticed that Google now ignores the Disallow: /*p= rule in my robots.txt file and lists a whole bunch of duplicate meta tag errors in Webmaster Tools.
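
For anyone curious, the relevant part of my robots.txt looks roughly like this (a simplified sketch, not my full file; the p parameter is what PrestaShop uses for pagination, e.g. ?p=2 on category pages, as far as I know):

# Block paginated listing URLs so Google does not report duplicate meta tags
# for every page of the same category.
# Note: the * wildcard is a Google extension to robots.txt, not part of the
# original standard, and this pattern matches "p=" anywhere in the URL, so it
# can also catch other parameters ending in "p".
User-agent: *
Disallow: /*p=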

 

I wonder how that can happen, and whether anyone else is having the same issue. And what could I do to prevent it?

 

I recently upgraded to 1.4.9 and wonder if that is the cause? I am also using Tomerg's Presto-Changeo module "Duplicate URL Redirect".

 

Help would be much appreciated!

 

Regards,

 

Dan

 

EDIT: Must have been a Google glitch; the robots.txt file works again as it should.
