Take Me to Your Leader…

Robots of the early years were marvels of technology. Built like tanks with a minimal amount of electronics, they were often relegated to a single task. Built to repeat that task millions of times, there wasn't much to break. Programmed wrong, they could do a lot of damage, too.

Enter the web. The robots text file was one of the very earliest pieces of a web site. It's not glamorous or exciting to tinker with, but built correctly it will deliver years of service and literally millions of repeated, correct actions.

This small entry point into your web site is the first stop for all (well, most) spiders. You either get it right or you get it wrong. Very binary: yes or no. There is, however, a third option: don't have a robots text file at all. Let's look at all three options and uncover what's best for you.

Let's start with no robots text file. Like a dirt road intersection, there are no directions, signs, or signals, and that's perfectly OK. If the Google spider came along, it would be free to go in any direction, at any speed, and look at anything. How would you tell how many times the intersection was crossed? Without a robots text file, it's simple: look at how many 404 errors you are getting! (Sorry, 404 error reporting is not always available in Google Analytics (GA).) Every spider's directives say it must check the file before moving ahead.

From an SEO perspective, is delivering a 404 error as the very first response a good thing?

Even if you want to leave the entire site wide open to spidering with absolutely zero restrictions, placing a green light in every direction is better than nothing and delivers a good first impression. Isn't a 1 KB file worth eliminating thousands of 404 errors per year?
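That green-light-in-every-direction file can be as short as two lines. As a sketch, an empty Disallow value means nothing is off limits to any spider:

```
# Wide open: every spider, every page
User-agent: *
Disallow:
```

Save it as robots.txt in the root of the site, and the spiders' first request gets a 200 instead of a 404.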

Next, the robots text file with red lights, stop signs, and barricades. Why would anyone not want his or her site spidered? For privacy, site sculpting, or strategic reasons such as bandwidth: restricting all images from being spidered into the index, for example. You can even tell the spiders to go away and not go one inch further. A rare case, but we've used that technique before to good success.
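As a sketch, assuming the images live in a directory named /images/ (your path may differ), keeping spiders out of them, or turning them away entirely, looks like this:

```
# Keep all spiders out of the image directory
User-agent: *
Disallow: /images/

# Or, the rare case: turn every spider away from the whole site
# User-agent: *
# Disallow: /
```

The second form is commented out here because a single file should carry one set of rules per user agent.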

A well-crafted robots text file will yield years of successful operation without maintenance. Updates to this now-antique piece of code allow you to reference an XML sitemap and direct the spiders to exactly the pages you wish, rather than making them wade through the site and hope to find your pages.
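The sitemap reference is a single extra line; the URL below is a placeholder for wherever your own XML sitemap lives:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```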

In the end, the analysis of having or not having a robots text file is clear. Having a good robots text file, whether it directs spiders to part of the site, all of it, or none of it, plus the reduction in 404 error codes, is worth the one-time effort. With a well-crafted file you can exclude some areas, open others, and increase the opportunity to rank higher, quicker.
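If you want to sanity-check your directives before publishing, Python's standard-library `urllib.robotparser` can parse the rules and answer "would this path be fetched?" The rules and paths below are hypothetical examples, not anything from a real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule set: images blocked, everything else open
rules = """\
User-agent: *
Disallow: /images/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A regular page is open to all spiders
print(parser.can_fetch("*", "/index.html"))       # True

# Anything under /images/ is blocked
print(parser.can_fetch("*", "/images/logo.png"))  # False
```

This is the same parsing logic the spiders apply, so it is a quick way to catch a barricade placed in the wrong direction.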

* Note of caution: beware of listing very private links in the robots text file as NOT to be spidered. That creates an open invitation to "go have a look." Unless every one of your web site links (100% of them) to and from that area and its page(s) carries the noindex, nofollow, and nocache directives, it WILL get spidered. Sometimes the best defense is to remain silent.
