
Thread: Giggle Search Console Error and Warning

  1. #1
    Senior Member
    Join Date
    Feb 2019
    Location
    United States
    Posts
    101

    Giggle Search Console Error and Warning

    Our robots.txt is set to "Allow", and I'm not sure why I still get error messages on over 200 already optimized pages. I can't figure it out. I created a ticket about 3 days ago. I am afraid Giggle might drop the website altogether if this isn't fixed soon, and I hope Flynax support can fix it ASAP.
    Has anyone had such warnings from Giggle? Also, what else can I do besides the robots.txt file, which is already set to "Allow"? Since more than 1,800 other pages in the site are okay but the error affects about 204 pages, the scenario is confusing and beyond my expertise, and Giggle wants me to fix them at the earliest.
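    For reference, one quick way to sanity-check what a robots.txt actually permits is Python's standard-library robot parser. This is a minimal sketch; the robots.txt content, the example.com domain, and the listing path are placeholders, not the poster's real site:

    ```python
    # Minimal sketch: check whether a given robots.txt allows Googlebot
    # to fetch a URL. The file content and URL below are placeholders.
    from urllib.robotparser import RobotFileParser

    robots_txt = """\
    User-agent: *
    Allow: /
    """

    parser = RobotFileParser()
    # Parse the file's lines directly instead of fetching over the network.
    parser.parse(robots_txt.splitlines())

    # True means this robots.txt does not block the page for Googlebot.
    print(parser.can_fetch("Googlebot", "https://example.com/listings/sample-ad.html"))
    ```

    If this prints True but Search Console still reports pages as "blocked by robots.txt", the report usually reflects an older crawl or a different robots.txt being served than the one you edited.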

  2. #2
    Senior Member
    Join Date
    Feb 2019
    Location
    United States
    Posts
    101
    Flynax, any help, please? I created a ticket as well on the 7th.

  3. #3
    Flynax developer Rudi
    Join Date
    Dec 2014
    Location
    Planet Earth
    Posts
    3,138
    Hello Javed,

    You have a ticket with a massive list of errors that should be examined.

    You'll get a reply soon.

    As far as I can tell, the main reason is that you removed listings which had already been indexed by Google.

  4. #4
    Senior Member
    Join Date
    Feb 2019
    Location
    United States
    Posts
    101
    Thank you, Rudi, and I will wait. But there were test ads from the time when I had the site in under-construction mode, and I had robots.txt set to "Disallow". Then, when the site was ready, I deleted the test ads and changed it from "Disallow" to "Allow", and the site was taken out of under-construction mode. I did have a lot of test ads during the setup phase. So is it possible to delete the indexed test ads from Google Console and have a fresh start?

    Right now Google Console is still showing that robots.txt is blocking some 204 pages, even though I have it set to "Allow". I am afraid there is another robots.txt file somewhere that is blocking the pages. Not that I know what I am talking about; it's just a guess. So please check on the server side to see if you can find something that might help.
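    For comparison, the two modes described above differ by a single directive in robots.txt. These are two alternative example files, not the site's actual configuration:

    ```
    # Under-construction mode: block all crawlers from the entire site
    User-agent: *
    Disallow: /

    # Live mode (a separate file, replacing the one above): allow everything
    User-agent: *
    Allow: /
    ```

    Note that a site serves only one robots.txt, at the domain root; if Search Console still reports the old rule, it may simply be showing results from a crawl made before the file was changed.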

    Thank you.
    Last edited by J Koresh; September 16, 2019 at 06:55 PM.
