Tuesday, May 19, 2009

Create Robots.txt File To Allow All Search Engine Robots To Crawl Your Website




By using a robots.txt file, you can tell all search engine robots that they may crawl your website. You have to create the robots.txt file and upload it to your site's top-level (root) directory.

Step-by-step guide on how to create and upload a robots.txt file

1. Open Notepad (or any plain-text editor).

2. Copy and paste the following directives into the editor:

User-agent: *
Allow: /

Please Note - "Allow" is a widely supported extension; the form defined by the original robots.txt standard is an empty "Disallow:" line, which has the same effect.

3. Name the file robots.txt and save it.

4. Now upload robots.txt to your site's top-level directory.

Example - http://www.your-blog-name.com/robots.txt

Please Note - Replace " http://www.your-blog-name.com " with your own site's URL.
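Steps 1-3 above can also be done programmatically. Here is a minimal Python sketch that writes the same allow-all robots.txt; the output directory is just a stand-in for your site's root folder:

```python
import tempfile
from pathlib import Path

# Directives that allow every crawler to access the whole site
# ("Allow" is a widely supported extension; an empty "Disallow:" line
# is the standard equivalent).
ROBOTS_TXT = "User-agent: *\nAllow: /\n"

def write_robots_txt(site_root: str) -> Path:
    """Write robots.txt into the given top-level directory and return its path."""
    path = Path(site_root) / "robots.txt"
    path.write_text(ROBOTS_TXT, encoding="utf-8")
    return path

# Example usage with a temporary directory standing in for the site root:
created = write_robots_txt(tempfile.mkdtemp())
print(created.read_text())
```

You would still upload the resulting file to your web server's root directory, as in step 4.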

Your robots.txt file should look like this - http://earn-money-fast.web.officelive.com/robots.txt
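To check that these directives really do permit crawling, you can feed them to Python's standard urllib.robotparser module. This is a quick local sketch (no network access needed; the example URL is hypothetical):

```python
from urllib import robotparser

# Parse the allow-all directives directly from a list of lines.
parser = robotparser.RobotFileParser()
parser.parse(["User-agent: *", "Allow: /"])

# Any user agent may fetch any path on the site.
print(parser.can_fetch("Googlebot", "http://www.example.com/any/page.html"))
print(parser.can_fetch("*", "http://www.example.com/"))
```

Both calls print True, confirming that every robot is allowed everywhere.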

  • On Blogger.com you don't need to create a robots.txt file, because Blogger automatically generates one for every blog.


Copyright © 2009 Super Seo Tips. All Rights Reserved.