rosegarden
June 23, 2012, 12:41 AM
I recommend that everyone install a robots.txt to block duplicate content; I am still refining mine.
=======================================
User-agent: *
Crawl-Delay: 25
Disallow: /*.js$
Disallow: /backup/
Disallow: /includes/
Disallow: /cron/
Disallow: /libs/
Disallow: /plugins/
Disallow: /tmp/
Disallow: /print.html
Disallow: /my-messages.html
Disallow: /my-profile.html
Disallow: /my-favorites.html
Disallow: /my-listings.html
Disallow: /my-packages.html
Disallow: /payment-history.html
Disallow: /payment.html
Disallow: /listings-by-field.html
Sitemap: http://www.xxx.com/sitemap.xml
=======================================
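Before deploying rules like the ones above, it is worth sanity-checking them. A minimal sketch using Python's standard-library `urllib.robotparser` (the paths below are taken from the sample; note this parser does prefix matching only, so Google-style wildcard rules such as `/*.js$` are not understood by it and are left out here):

```python
from urllib import robotparser

# A subset of the rules from the sample robots.txt above.
rules = """\
User-agent: *
Crawl-delay: 25
Disallow: /backup/
Disallow: /includes/
Disallow: /print.html
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed prefix: crawlers matching User-agent * may not fetch this.
print(rp.can_fetch("*", "/backup/db.sql"))   # False
# No rule matches, so fetching is allowed by default.
print(rp.can_fetch("*", "/index.html"))      # True
# Crawl-delay is exposed as an integer (Bing/Yandex honor it; Google ignores it).
print(rp.crawl_delay("*"))                   # 25
```

This only verifies the simple prefix rules; for wildcard patterns, Google Search Console's robots.txt tester is the more reliable check.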