
Backlink Analysis Part I: Robots.txt Can Invalidate Your Backlinks

You work hard, reach out to scores of professional A-list bloggers, and pitch your post for an incoming link. If one of them gives you a backlink, you are happy. A high-PageRank, unpaid link, given of the giver's own free will from the most relevant page or category, is a million-dollar vote for your page. It can single-handedly take your page from SERP position 400 to position 3.

Yesterday, I had a discussion in the Digital Point Forums about backlinks and their validity. It seems most people know little about backlink validity analysis: they know about DoFollow and NoFollow, and nothing beyond that. Here, we will look at the importance of the robots.txt file in backlink analysis. Robots.txt is a simple link-invalidation secret many professional bloggers won't share with you.

What Is Robots.txt

When a search crawler visits a website to crawl and index it, the first thing it looks for is the robots.txt file. If it doesn't find one, it goes about indexing the site normally.

A robots.txt file is a small text file in the root directory of a domain that tells search engine crawlers which pages or sections of the website should not be crawled and indexed. Its general format is this:

User-agent: Googlebot
Disallow: /links.html

The above code simply disallows Googlebot from accessing the links page of the website (relative path /links.html). This means that if you are an advertiser and you exchanged links with this person, who has disallowed the links page, your backlink from that page holds no value whatsoever, although it will look like a perfectly good DoFollow backlink to the untrained eye. Search bots will normally index any pages not mentioned in the robots.txt file.
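You can verify a rule like this programmatically. Here is a minimal sketch using Python's standard urllib.robotparser module, with example.com standing in as a hypothetical domain:

```python
from urllib import robotparser

# The exact rules from the example above, parsed in memory (no network needed)
rules = """\
User-agent: Googlebot
Disallow: /links.html
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot may not fetch the links page, but other pages remain crawlable
print(rp.can_fetch("Googlebot", "http://example.com/links.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/about.html"))  # True
```

A False result means a backlink placed on that page is invisible to Googlebot, however the link itself is marked.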

User-agent: *
Disallow: /page/categories.htm

This code disallows the page for all search bots, not only Google's. The wildcard '*' specifies that every search bot must follow the rule, so none of them will index the mentioned page.
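The same parser confirms that the wildcard rule binds every crawler, not just Googlebot (again with a hypothetical example.com, and bot names chosen for illustration):

```python
from urllib import robotparser

# The wildcard rule from the example above
rules = """\
User-agent: *
Disallow: /page/categories.htm
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Every bot falls back to the '*' entry, so the page is blocked for all of them
for bot in ("Googlebot", "Bingbot", "Slurp"):
    print(bot, rp.can_fetch(bot, "http://example.com/page/categories.htm"))
```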

A careful, clever advertiser thinks from the search bot's point of view and first looks at the robots.txt file of any page he plans to purchase links from. He simply won't purchase a backlink from a disallowed page.

Checking Robots.txt File of Any Website

You can easily check the robots.txt file of any website. It sits in the root directory of the domain and is always named robots.txt, so you can open it straight from your browser: just append /robots.txt to the domain in the address bar. Try it on the Microsoft or Google home pages, for instance.

Simple? You will notice that these sites have disallowed a lot of internal pages from being indexed. This is why those internal pages do not show up in search results, and why backlinks placed on them carry no weight.
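Building that address can be automated. This is a small sketch, assuming only that robots.txt always lives at the root of the host; the page URL is hypothetical:

```python
from urllib.parse import urlparse, urlunparse

def robots_txt_url(page_url: str) -> str:
    """Return the robots.txt location for the site hosting page_url."""
    parts = urlparse(page_url)
    # Keep the scheme and host, drop everything else, point at /robots.txt
    return urlunparse((parts.scheme, parts.netloc, "/robots.txt", "", "", ""))

# A hypothetical page you might be offered a link on
print(robots_txt_url("https://www.example.com/blog/links.html"))
# https://www.example.com/robots.txt
```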

Through the robots.txt file, you can even specify the sitemap of a website. If you check this blog's robots.txt file, you will see that a sitemap has been specified.
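Modern versions of urllib.robotparser (Python 3.8+) also expose any Sitemap lines, so you can read them out of the same file; the sitemap URL below is illustrative:

```python
from urllib import robotparser

# A robots.txt that both blocks a directory and declares a sitemap
rules = """\
User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```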


Before you purchase links from any website for SEO (if you are purchasing at all), or before you go in for a link exchange, first look at the robots.txt file and see whether your links are going to carry any weight. If not, that link exchange simply won't work.

By the way, neither this article nor this blog recommends purchasing backlinks for SEO. If you are purchasing backlinks, it should be for traffic, and the links should be NoFollow. Purchasing backlinks for SEO can easily get your site banned by Google.

In the next article of the Link Analysis series, we will look at another important thing to check before exchanging links: the Robots meta tag. Subscribe and enjoy.

Related Entries:

What is DoFollow? What is NoFollow (Make Your Blog DoFollow)
Ten Effective Link Building Techniques
Eleven Nasty Ways to Build Links

Copyright © Lenin Nair 2008


  1. Hi Lenin,

    As a blogger, can we do anything with the robots.txt file?

    Is there any control for a blogger over this file?

    Let me know.

  2. Suresh, unless you are self-hosting, you have no control over robots.txt on Blogger. Thanks for commenting.

  3. thanks for sending me a note about this post! i really learned something new today!

  4. What a readable, informative and educating post - always great to learn something new - thanks for sharing your knowledge!

  5. This is certainly an informative article. I "dugg" it!!!

  6. This morning I was researching the robots.txt file. The Google webmaster tools site has lots of great info on this topic. Later I discovered this outstanding blog and it has helped me immensely.

    I'm concerned about an odd backlink to some of my youtube videos. It's some kind of fake website about insurance and it has 2 thumbnails of my videos that have nothing to do with insurance. When I clicked the thumbnails I was brought to 2 videos that were not mine. In fact they belonged to 2 competitors in my field.

    For some of my vids this is the only backlink, one backlink. I cannot seem to get my legitimate backlinks to connect to these vids. Do you think someone is messing with robots.txt files to block real backlinks from my vids? And why are these people using my thumbnails on a fake insurance website to funnel visitors to their own videos?

    I am looking at your blog to see how I can subscribe to this excellent content. Thank you, you have made my day!


  7. Chuck, no, I don't think anyone is doing anything with robots.txt. It's only one backlink, right? You can try building more links to your vids. Good to know you liked the resource and decided to subscribe. Since they are YouTube videos, no one outside YouTube has access to YouTube's robots.txt file, so don't worry. I don't get why you say you can't connect your legitimate links to these videos. Can you give specifics so that I can help?


  8. thanks for writing an essential SEO article about something which I, and probably other people, had completely forgotten about: the robots.txt file.

  9. Thank you for explaining that. I am using a program for my website and it comes with a pre-made robots txt page and I wasn't sure what it meant. Now I know that it isn't allowing certain pages to be indexed. Fascinating. Thanks again.


