Friday, August 03, 2012

Please sir? I want some more.

The Searchlive search network has been engaged in a tri-directional bandwidth project, and we have come to the conclusion that if you honour Google's robots.txt you will use up to 60% more bandwidth. Fact. Now let them dispute it.

For reference, an excerpt from www.google.com/robots.txt:

User-agent: *
Disallow: /search
Disallow: /sdch
Disallow: /groups
Disallow: /images
Disallow: /catalogs
Allow: /catalogs/about
Allow: /catalogs/p?
Disallow: ..
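If you want to test how a crawler would interpret these rules yourself, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The rule list is taken from the excerpt above (only the complete lines); the example URLs are illustrative assumptions, not from the original post.

```python
from urllib.robotparser import RobotFileParser

# Rules copied from the quoted www.google.com/robots.txt excerpt.
rules = [
    "User-agent: *",
    "Disallow: /search",
    "Disallow: /sdch",
    "Disallow: /groups",
    "Disallow: /images",
]

rp = RobotFileParser()
rp.parse(rules)

# /search is disallowed for all agents; an unlisted path is allowed.
print(rp.can_fetch("*", "https://www.google.com/search?q=test"))  # False
print(rp.can_fetch("*", "https://www.google.com/maps"))           # True
```

Note that Python's parser evaluates rules in file order, whereas Google's own crawler documents a longest-match precedence for `Allow`/`Disallow` pairs, so results can differ for overlapping rules like the `/catalogs` entries.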