How to avoid search engines telling customers things you…

This morning I typed “Colorado” into the search engine, hoping to browse their lovely website and catalogue. I was absolutely shocked when I saw the second listing in Google:

“webstore is closing” – this was followed by a URL pointing to an email folder, which led to a broken link.

[Image: Google search results for “colorado”]

The page itself currently has no content, and you can’t browse to that folder. Google has slipped through the cracks of the website and found content that it considers important. If I were Colorado, I would be concerned about what other links are lurking in the rest of the website!

3 ways to search engine optimise your website so that spiders don’t find your skeletons

  1. Add the pages and folders you don’t want visible in search engines to your robots.txt file, e.g.:
    Disallow: /admin-access/
    Disallow: /cms-login/
  2. Make sure that pages like the CMS login, administrator access, etc. that should only be accessible to the webmaster are not linked to from anywhere on your site.
  3. Use your Google Webmaster Tools account to submit a URL removal request.
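To sanity-check step 1, you can test your robots.txt rules before relying on them. Here is a minimal sketch using Python’s standard-library robots.txt parser; the folder names and domain are placeholders, not Colorado’s actual paths:

```python
from urllib import robotparser

# Example robots.txt content blocking two hypothetical admin folders
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin-access/
Disallow: /cms-login/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Well-behaved spiders honouring these rules will skip the disallowed
# folders but still crawl the rest of the site.
print(rp.can_fetch("*", "https://example.com/admin-access/"))  # False
print(rp.can_fetch("*", "https://example.com/catalogue/"))     # True
```

Bear in mind that robots.txt is only a request: polite crawlers obey it, but it won’t hide a page that is already indexed, which is why the URL removal request in step 3 matters too.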


  • Aurelius Tjin

    Thanks for the tips Fred. Spiders can really get nosy. 🙂
