-
robots.txt
Hi all,
After adding Disallow: /plugins/ to robots.txt, Google Search Console (Google Index > Blocked Resources) reports 22 pages with blocked resources. When I checked the pages listed under Blocked Resources, it appeared that all 22 blocked pages are actually listings on my website! I was shocked.
Is anyone else experiencing the same issue?
Thank you
-
Hello Brian,
Would you please show us the text from your robots.txt file.
-
Hi Viktor,
This is what I have on my robots.txt
User-Agent: *
Allow: /
Crawl-delay: 20
Disallow: /plugins/
Any solution please?
-
Hello Brian,
Very strange, because the code looks OK.
Also, would you please send me the URLs of the blocked pages.
-
Hi Viktor,
I have sent you a PM. Please let me know
Thank you
-
Hello Everybody,
Use the following code in your robots.txt file:
Code:
Only registered members can view the code.
-
Hello Viktor,
Could you please explain what that code does? I am very confused now, because Disallow: /plugins/ is blocking resources for other pages. In other words, according to Google Search Console, pages that use plugins have not been indexed properly because they rely on blocked resources.
Thank you.
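[Editor's note] The symptom described here usually means Googlebot cannot fetch CSS and JavaScript files under /plugins/ that the listing pages need for rendering. A minimal sketch of one common fix, assuming the blocked resources are .css and .js files under /plugins/ (this is an illustration, not necessarily the member-only code posted above), relies on Google's wildcard support, where the more specific Allow rule wins over the broader Disallow:

Code:
User-Agent: *
Allow: /plugins/*.css
Allow: /plugins/*.js
Disallow: /plugins/
Crawl-delay: 20

You can verify the effect on individual resource URLs with the robots.txt Tester in Google Search Console before relying on it.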
-
Hello, after adding Disallow: /plugins/ from the recent thread > http://forum.flynax.com/showthread.p...-Google-Search < and now updating to this, I am confused as to which settings are correct.
Thanks, Brian, for following this up. I hope we can get the correct settings for everyone; for now I am going to reverse all the updates until this is sorted out properly.
-
Please compare all the .htaccess files in your folders with the correct ones from the 4.6.2 package.
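[Editor's note] One way to do that comparison in bulk is sketched below. The /tmp paths and sample files are assumptions so the sketch runs as-is; point SITE and CLEAN at your real installation and an unpacked 4.6.2 archive instead.

```shell
# Hedged sketch: compare every .htaccess in a site tree against the
# matching file from a clean 4.6.2 package.
SITE=/tmp/site
CLEAN=/tmp/flynax-4.6.2

# Sample layout for illustration only; replace with your real paths.
mkdir -p "$SITE/plugins" "$CLEAN/plugins"
printf 'Deny from all\n' > "$SITE/plugins/.htaccess"
printf 'Deny from all\n' > "$CLEAN/plugins/.htaccess"

find "$SITE" -name .htaccess | while read -r f; do
  ref="$CLEAN${f#$SITE}"
  if diff -q "$f" "$ref" >/dev/null 2>&1; then
    echo "OK:      $f"
  else
    echo "DIFFERS: $f"   # review, then overwrite with the 4.6.2 copy
  fi
done
```

Files flagged DIFFERS are the ones worth reviewing by hand before replacing.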
-
Hello Pete,
For now, I have deleted Disallow: /plugins/ from robots.txt and will wait for a solution.
Thank you