
No: blocked by robots.txt


Luigi Donato


Greetings,
I would like to make the tag links indexable, so I commented out the relevant rules with a hash (#) and then added the same rules with Allow instead of Disallow.

The lines in question are as follows:

Allow: /*?tag=
Allow: /*en/search?tag=
#Disallow: /*?tag=
#Disallow: /*en/search

I also noticed there was an HTML meta tag:

<meta name="robots" content="{$page.meta.robots}">

So I created a rule to prevent it from appearing on the tag pages (a sketch of the rule is below), and I succeeded. But according to Google, and to other testing sites as well, the pages are still blocked by the robots.txt file.
Can anyone enlighten me?
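
For illustration, the rule I mean lives in the theme's head template, along these lines (a simplified sketch, not my exact code; $page.meta.robots comes from the default theme, and the tag-parameter condition is my own):

{* Skip the robots meta when a tag parameter is present on the search page,
   so nothing forces 'noindex' on the tag listings *}
{if !($page.page_name == 'search' && !empty($smarty.get.tag))}
    <meta name="robots" content="{$page.meta.robots}">
{/if}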


Everything looks right.

I think it is better to change the getTemplateVarPage() function in SearchController.php in the controllers folder:

public function getTemplateVarPage()
{
    $page = parent::getTemplateVarPage();

    // This is the line that marks the search page 'noindex';
    // change the value to 'index' to let the tag pages be indexed.
    $page['meta']['robots'] = 'noindex';

    return $page;
}
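
If you would rather not edit the core file, the same change can go in an override instead (a minimal sketch, assuming the standard PrestaShop override mechanism; after adding it, delete the cached class index, e.g. var/cache/prod/class_index.php on 1.7, so the override is picked up):

<?php
// override/controllers/front/SearchController.php (assumed standard override path)
class SearchController extends SearchControllerCore
{
    public function getTemplateVarPage()
    {
        $page = parent::getTemplateVarPage();

        // Replace the core's 'noindex' so the search/tag pages can be indexed.
        $page['meta']['robots'] = 'index';

        return $page;
    }
}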

 


On 7/17/2022 at 11:08 AM, webprog said:

I tested it on my hosting and got the same result as you.

But after I asked Google to refresh the robots.txt, Google added the search page to the index, and now it is OK.

So you need to request that Google refresh robots.txt: https://www.google.com/webmasters/tools/robots-testing-tool?siteUrl=

Google doesn't refresh robots.txt every time you add a link.

I noticed it too yesterday and solved it as you said. Thanks anyway!
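
For anyone else hitting this, a quick way to confirm that the live file already serves the new Allow rules before asking Google for a refresh (a minimal sketch; example.com is a placeholder for your own shop domain):

<?php
// Print the robots.txt that crawlers actually see (placeholder domain).
echo file_get_contents('https://example.com/robots.txt');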

