Our robots.txt is set to "Allow", and I'm not sure why I still get error messages on over 200 already-optimized pages. I can't figure it out. I created a ticket about 3 days ago. I'm afraid Giggle might drop the website altogether if this isn't fixed soon, and I hope Flynax support can fix it ASAP.
Has anyone had warnings like this from Giggle? Also, what else can I do besides the robots.txt file, which is already set to "Allow"? The scenario is confusing and beyond my expertise: more than 1,800 other pages on the site are fine, but the error affects about 204 pages, and Giggle wants me to fix them as soon as possible.
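For context, this is what I mean by the file being set to "Allow" — a minimal robots.txt that permits all crawlers to access everything (the sitemap URL below is just a placeholder, not my actual domain):

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

If anyone sees something that could still block those 204 pages despite a rule like this (meta noindex tags, server errors, etc.), please let me know.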