Robots.txt – A Real Helper When Optimizing Your Site

Filed under Internet Marketing

Today I’ll show you how to use the robots.txt file.

But first, let’s look at why we need a robots.txt file at all.

The robots.txt file tells search engine robots which files and folders you do not want indexed. It must be located in the root directory of the site.
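For instance, if your domain were the hypothetical example.com, crawlers would look for the file at this address and nowhere else:

https://www.example.com/robots.txt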

Very small HTML sites of 10–30 pages do not need a robots.txt file, since every page of such a site should be indexed anyway.

Large dynamic websites, on the other hand, contain service pages that are not meant for publication, so keeping the site easy to update requires a well-organized internal structure and active use of the robots.txt file.

The robots.txt file usually looks like this:

User-agent: *
Disallow: /delo.php
Disallow: /d123/
Disallow: /travel/dat/

The User-agent line names a particular robot, or * for all robots. Each Disallow line gives the path to a folder or file that must not be indexed (the path is absolute, measured from the site root). To allow the robot access to certain parts of the site, or to the entire site, use the Allow directive. There must be no blank lines between the User-agent line and the Disallow and Allow lines that follow it.
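As a minimal sketch (the folder and file names here are made up for illustration), a group that blocks a whole directory but still allows one file inside it might look like this:

User-agent: *
Disallow: /private/
Allow: /private/report.html

Note that how conflicts between Allow and Disallow rules are resolved can differ from one robot to another, so check the help pages of the search engines you care about.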

If you describe your site map in the sitemaps.xml format and want the robot to find it, give the path to the file as a parameter of the Sitemap directive (if there are several files, list them all).
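A sketch with a hypothetical domain and file names, listing two sitemap files:

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-news.xml

The Sitemap directive can appear anywhere in robots.txt and is not tied to a particular User-agent group.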

If your site has mirrors, a special mirror robot identifies them and groups them together; only the primary mirror is then included in the search. To influence this choice, add the Host directive to robots.txt and give the name of the preferred mirror as its parameter. According to the search engine’s help, the Host directive does not guarantee that the specified mirror will be chosen as the primary one, but the algorithm gives it high priority when making that decision.
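As a sketch, assuming the hypothetical mirrors example.com and www.example.com, the preferred mirror could be declared like this:

User-agent: *
Disallow: /d123/
Host: www.example.com

Keep in mind that Host is only understood by certain search engines (notably Yandex); robots that do not support it simply ignore the line.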

You can also control how often the robot visits pages of your site with the Crawl-delay directive. It sets the minimum time (in seconds) the crawler must wait between finishing the download of one page and starting the next. For compatibility with the robots, the Crawl-delay directive should be added to the group that starts with the User-agent line, immediately after the Disallow (Allow) directives.

The search engine also supports fractional Crawl-delay values, for example 0.5. This does not guarantee that the crawler will visit your site every half second, but it gives the robot more freedom and can speed up the crawl of the site.
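Putting this together, a sketch of a group with a fractional delay (the path and the value are made up for illustration):

User-agent: *
Disallow: /travel/dat/
Crawl-delay: 0.5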

As you can see, it is all quite simple. Use robots.txt on your sites; it helps search engines read your pages correctly.

Interested in making money online without investment? You are invited to visit this make money online site. It is a good place on the Internet to learn how to earn cash online and set everything up for it.

In addition, I would like to give another piece of advice. Today’s Internet technologies give you a unique chance to find exactly what you need at the best price on the market. Strangely, most people don’t take advantage of this chance. In practice it means that you should use all of today’s tools to get the information you need.

One more thing: the subject has become very popular recently, so search Google or other search engines, visit social networks and review topics related to yours, and join the discussions on niche forums. All this will help you form a clear picture of this kind of work.

Finally, we recommend signing up for this blog’s RSS feed, since we will do our best to keep it updated with new posts about making money online without investment and other important topics.
