
Posts

Showing posts with the label SEO

What Is URL Canonicalization (URL Normalization)? And What Are Its Effects on SEO?

Have you ever noticed that when you type www.twitter.com into the address bar, the URL changes to http://twitter.com once Twitter loads? In the same way, if you type bbc.co.uk into the address bar, the URL becomes http://www.bbc.co.uk once BBC loads. If you browse to microsoft.com, you land on an entirely different URL: "http://www.microsoft.com/en/us/default.aspx". Some websites strip off the 'www' prefix, others add it, and yet another set redirects you to entirely different-looking URLs. In this post, we will look at this aspect of search engine optimization, called URL canonicalization or URL normalization, see why it is important, and see how understanding it reduces duplicate content issues . Need for Canonicalization As you have seen above, the following URLs may all serve the same content: http://website.com http://www.website.com/index.html h...
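The idea behind collapsing those variants can be sketched in a few lines of Python. This is an illustrative sketch only, not any search engine's actual algorithm; the `canonicalize` function, its rules, and the example URLs are all assumptions chosen to mirror the www/index.html examples above.

```python
# Sketch of URL normalization: map common variants of a homepage URL
# onto one canonical form (hypothetical rules, for illustration only).
from urllib.parse import urlsplit

def canonicalize(url, prefer_www=True):
    """Normalize scheme/host case, the www prefix, and default index pages."""
    parts = urlsplit(url if "//" in url else "http://" + url)
    host = parts.netloc.lower()
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    path = parts.path
    # Treat default index documents as the directory root
    if path.endswith(("index.html", "index.htm", "default.aspx")):
        path = path[: path.rfind("/") + 1]
    if path == "":
        path = "/"
    return f"{parts.scheme.lower()}://{host}{path}"

variants = ["http://website.com",
            "http://www.website.com/index.html",
            "WEBSITE.COM/"]
print({canonicalize(v) for v in variants})  # all three collapse to one URL
```

Running this, the three variants all normalize to the same string, which is exactly what a 301 redirect or canonical URL achieves for search engines.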

My Recommendations for Google to Help Them Fight Web Spam in Their Index

[This Is an Important Post for You; See the Conclusion] As the days pass, web spam is on the rise . Even Google's inimitable algorithm for ranking the worthiest content seems to fail at times: we miss a lot of great information simply because it lacks backlinks or the funds for advertisements, while spammy links fill the first few search result pages. With the world's most popular search engine in their hands, Google is an information hub with the ultimate power to make or break any business today, which is why businesses spend more and more money on this search engine. As Uncle Ben says (still one of my most loved quotes), "With Great Power Comes Great Responsibility", and Google has an enormous responsibility on its shoulders. In turn, helping them improve their index and remove spam content from the web is a responsibility for every individual. Here are my recommendations to Google an...

Yahoo Search Engine Optimization Guidelines Presentation

Here is a presentation I prepared detailing the basics of Yahoo search engine optimization. The major guidelines on content quality, links, sitemap submission, etc., are covered in this tiny presentation. We can look deeper into optimizing a site for Yahoo in another post. Make sure you post your comments. Here is the presentation: You can use this code to embed the presentation in your blog/website: <div style="width:425px;text-align:left" id="__ss_741470"><object style="margin:0px" width="425" height="355"><param name="movie" value="http://static.slideshare.net/swf/ssplayer2.swf?doc=yahoo-search-engine-optimization-guidelines-1226410556572049-8&stripped_title=yahoo-search-engine-optimization-guidelines-presentation" /><param name="allowFullScreen" value="true"/><param name="allowScriptAccess" value="always"/><embed src="htt...

Search Engine On Page Optimization Guide: Major Factors in On-page Website Optimization That Can Decide Your Site's Ranking on Search Results

On-page website optimization is as important as off-page techniques such as building links to your site and doing social media promotion. On-page search engine optimization includes these factors: content optimization, title and Meta description optimization, keyword research and analysis, outbound links, the HTML code of the website, internal link structure, update frequency, etc . We will look at each of these factors in detail in this post. The Blog Content Optimization The first and foremost thing in on-page website optimization is, of course, content optimization. Google's webmaster guidelines tell all webmasters to prepare content for readers, not for search engines: Make pages primarily for users, not for search engines. Don't deceive your users or present different content to search engines than you display to users, which is commonly referred to as "cloaking." There is quite a bit of speculation floating around the blogosphere as to how to copywrite for SEO. Pe...
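Two of the factors listed above, the title and the Meta description, live in the page head. A minimal sketch (the title and description text here are invented for illustration, not taken from any real page):

```html
<!-- Illustrative sketch: a descriptive title plus a unique meta
     description, the two head elements covered in this guide -->
<head>
  <title>On-Page SEO Guide: Content, Titles and Internal Links</title>
  <meta name="description" content="A walkthrough of the major on-page factors: content optimization, titles, outbound links and internal link structure."/>
</head>
```

Each page on the site should get its own distinct pair; duplicated titles or descriptions across pages work against the factors discussed in this post.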

How to Remove Duplicate Content From Your Blogger Blog Posts to Avoid SERP Penalty

Duplicate content, to an extent, may not affect your blog’s search engine rankings. However, there are quite a few times when it can get out of control and start to hurt your rankings badly, even without your knowledge. Here are ways to curb it. What Is Duplicate Content Do you have a blog in which you post regularly? Do any two different URLs in that blog serve the same content? Then that is duplication. On self-hosted blogs, various features like print preview pages, monthly archive pages, category pages, etc., can cause duplicate content . In such cases, search engines normally rank one of the pages lower. However, in extreme cases, when your blog has a number of pages with the same content, the blog can be penalized. As Google puts it: In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved. As a res...
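One common remedy for the archive- and category-page duplication described above is to keep those pages out of the index while still letting bots follow their links. A sketch (this is a general technique, not necessarily the exact fix the full post recommends):

```html
<!-- Placed in the head of archive/category pages only: the page is
     kept out of the index, but its links are still crawled, so only
     the actual post pages compete in search results -->
<meta name="robots" content="noindex, follow"/>
</head>
```

Note the tag must appear only on the duplicate views; putting it on the post pages themselves would remove them from search results entirely.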

Enable Individual Meta Descriptions for Different Blogger Blog Posts

A blogger should understand well the importance of adding Meta description tags to their blog. On self-hosted blogs, this is pretty easy, as it is a matter of editing the head content. On Blogger blogs, however, you can normally edit the Meta tags only for the home page, not for individual posts. If you add a Meta description to the home page, it is duplicated across all your posts, giving you duplicate Meta tags (Google Webmaster Tools can flag this). Here is a trick with which you can have an individual Meta description tag for each of your posts in Blogger. Since adding a Meta description to every individual post is rather difficult (particularly when you have hundreds of blog posts), you can make the post title itself act as the Meta description. This gets you an individual description tag for each post, which is very good for the site’s SEO. Make sure that you have a very descriptive title for each post; that helps it rank high in search results. Follow these steps: S...
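The general shape of the trick, using Blogger's template conditionals, looks something like the snippet below. This is a hedged sketch of the approach, not necessarily the exact code from the full post; the data tags available can vary between Blogger template versions.

```html
<!-- In the Blogger template head: on individual post ("item") pages,
     emit the post title as that page's meta description; elsewhere,
     fall back to the blog-wide description -->
<b:if cond='data:blog.pageType == "item"'>
  <meta expr:content='data:blog.pageName' name='description'/>
<b:else/>
  <meta content='Your blog-wide description here' name='description'/>
</b:if>
```

Because the condition keys on the page type, every post gets a distinct description tag automatically, which is exactly what removes the duplicate-Meta warning.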

Google Translation Widget for Blogs & W3C Validation Rules

Translating your blog text can be a great way to attract more users from around the world. If your blog is internationally relevant but your traffic comes mainly from English-speaking countries, then you definitely should add a translation widget to your blog, and the best one available out there is definitely Google Translate. W3C Valid Google Translate Code The problem with the Google Translate code is that it doesn’t validate against the W3C Consortium’s coding standards , so it can introduce errors into your page if you add it directly. Having valid code definitely helps in terms of SEO. So here I have edited the code to make it valid. Copy and paste this code into your blog’s sidebar to have a W3C valid Google Translate widget: <script src="http://www.gmodules.com/ig/ifr?url=http://www.google.com/ig/modules/translatemypage.xml&amp;up_source_language=en&amp;w=160&amp;h=60&amp;title=&amp;border=&amp;output=js"></script> Bookmark this page and always c...

Search Engines Not Indexing Your Internal Pages? Here Is a Trick You Can Do About It

Some bloggers have a problem: only their blog’s home page is indexed in search engines, and not any internal pages (or not all internal pages) . You can submit a sitemap in Google, Yahoo, and Live Webmaster Tools to get the site’s internal pages indexed. You may wish to check which of your pages are actually indexed by using the ‘site’ operator : Search in Google and Live.com: site:yoursitename.com This tells you how many pages from your site are actually in the search engine’s index. If only a few of them are indexed, you have to either manually submit each page or submit a sitemap with all pages ( How to Submit a Sitemap for Blogger? ) Google Indexing Google actually indexes all your pages by following links . It is quite easy to be found by Google, since it normally indexes a page if there is a relevant link pointing to it. And as the PageRank increases, the indexing becomes faster, which means your newer pages will be found even without incoming l...
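For reference, a sitemap like the one submitted above is just an XML file listing the URLs you want crawled. A minimal sketch, following the sitemaps.org protocol (the URL shown is a placeholder, not a real page):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per internal page you want indexed -->
  <url>
    <loc>http://yoursitename.com/an-internal-post.html</loc>
  </url>
</urlset>
```

Once the file is uploaded to the site root, you point Google, Yahoo, or Live Webmaster Tools at its URL, and the internal pages no longer depend solely on incoming links to be discovered.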

Google Webmaster Tools Helps Improve Link Popularity By Reporting Broken Links to Your Site

This is the latest news from Google Webmaster Tools, and it is great news for all webmasters. You can now check which sites point to non-existent pages on your site, directly from Google Webmaster Tools. When somebody follows a non-existent link to your site, they get a 404 error page. Most of the time, the reason is that a site linking to one of your articles made a mistake in the URL. As your popularity increases, the number of incoming links increases, and so does the number of sites linking to you with broken URLs; webmasters make errors in URLs every now and then. To fix this, just log into Google Webmaster Tools and go to Diagnostics->Web Crawl. Here you will see errors in the link structure: it lists URLs that are ‘Not found’. Just click it and you will see this: See the marked regions. It means 4 external pages link to a non-existent URL on my site. Click it and you can see which sites link there. Just go ahead, contact the webmaster...
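Besides contacting the webmaster, you can also recover the link value yourself by redirecting the mistyped URL to the real article. A minimal sketch for an Apache .htaccess file (the two paths here are hypothetical examples, not from the post):

```apache
# Hypothetical example: permanently redirect a mistyped inbound URL
# (found under 'Not found' in Webmaster Tools) to the real article,
# so the external link's visitors and PageRank are not lost
Redirect 301 /seo-guid.html http://www.yoursite.com/seo-guide.html
```

A 301 (permanent) redirect is the right choice here, since it tells search engines to credit the target page rather than the broken URL.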

Internal Link Building for Traffic Growth

Bloggers know well the importance of building traffic to their pages. Normally, when a blogger has a number of links coming into his blog, his PageRank goes up , and this rise in PageRank causes the newer pages of the blog to be indexed faster and rank higher in search results, even without enough incoming links. Still, some pages on your blog get very high traffic while others don’t get enough; we will see why. The main reason a page within your blog doesn’t get enough traffic may be the lack of incoming links. Every blogger has pages in his blog that he rates as his masterpieces, and he will repeatedly link to these pages from newer posts. This sort of frenzied internal link building causes these pages to rise high in search results. At the same time, there are standalone pages within your blog that get linked from no other post. These pages lurk at the very bottom of the traffic stats with very few daily visits. Here is what you should do t...

Google Webmaster Central Said Exactly What I Expected About Outbound Links From Your Blog Posts

If you didn’t know, link week is going on at the Google Webmaster Central blog, where they discuss various aspects of link building , link selection , internal links, etc. Today, I found something corroborating my own findings: that external links to valuable related resources are good in terms of SEO and site rankings. Thoughtful outbound links can help your credibility. * Show that you've done your research and have expertise in the subject matter * Make visitors want to come back for more analysis on future topics * Build relationships with other domain experts (e.g. sending visitors can get you on the radar of other successful bloggers and begin a business relationship) So said this post . This is what I expected to see for a long time. Here are the points: 1. Search engines value the importance of your blog posts. 2. Search engines rate that importance according to the content’s usefulness. 3. Relevant links add to the content’s usefulness pretty well. So, ...

Google Sitelinks: How Does Google Calculate Sitelinks, and How Can You Modify Them?

Google has a beautiful feature not present in other major search engines: sitelinks. If you search for a popular website on Google, along with the site URL and description, Google also shows some internal links from the site, and optionally a search box that searches within the site. These links are called Google sitelinks. Google calculates automatically which pages to include as sitelinks, and at least three pages are shown as sitelinks in search results. Here we will see how you can get sitelinks and how you can control which ones are shown. How Does Google Find Relevant Sitelinks To find out how Google decides which links to rate as relevant sitelinks, I ran an experiment. First, I went to Google Webmaster Tools to see if I have any recognized sitelinks (to learn how, read below). From Webmaster Tools, you can find out whether any sitelinks are recognized for your site. The site should be fairly old and content-rich to get some sitelinks approve...

DMoz.org, the Open Directory Project Submission Guidelines

Webmasters and bloggers know well about DMoz.org, or Directory Mozilla, more popularly known as the Open Directory Project: the premier web directory, and completely free as well. Getting listed in DMoz means a lot for a website, as it is a list of the absolute best sites on the Internet, and getting into this user-moderated directory is extremely difficult. It is also a holy grail of search engine optimization: Google and other search engines value DMoz very highly. Here are some guidelines to get accepted quickly by DMoz. The importance of the Open Directory, directly from Google SEO specialist Mr Matt Cutts, broadcast on the Google Channel. 1. Don’t Treat the Open Directory as Other Directories This is the first and most important tip. Many submissions are rejected because the person submitting the site regards DMoz as just another directory and submits without proper care. Understand that DMoz is the largest and most important of all directories (paid or...

Backlink Analysis Part II: Robots Meta Tag

In the first part of this guide, we saw how to vet good quality backlinks by checking a website's robots.txt file . There is another important thing to check in web pages before you exchange or purchase links: the Robots Meta tag . The Robots Meta tag is a tag similar to the Meta description and Meta keyword tags. Search bots look for this tag once they have checked the robots.txt file. Using the Robots Meta tag, you can specify various things, including whether your page should be indexed and whether the links on it should be followed. An example of a Robots Meta tag is this: <meta name="robots" content="index, follow"/> This Meta tag, as its content indicates, tells the bots to index the page and follow all links on it. Meta tags are placed in the head section of the page, between the <head> and </head> tags. You can have one of the following contents in this tag; here is an explanation of some of them. Index: Tells the bot to index the current p...
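For backlink analysis, the variants of the tag that matter most can be sketched as follows (illustrative examples of the standard content values, not an exhaustive list):

```html
<!-- Default behaviour: index the page, follow its links -->
<meta name="robots" content="index, follow"/>
<!-- Page kept out of the index, but links still followed -->
<meta name="robots" content="noindex, follow"/>
<!-- Page indexed, but its links pass you nothing: a backlink
     placed on such a page is effectively invalidated -->
<meta name="robots" content="index, nofollow"/>
```

Before buying or exchanging a link, view the source of the linking page and check this tag: a `nofollow` in its content quietly devalues the link, just as robots.txt can.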

Backlink Analysis Part I: Robots.txt Can Invalidate Your Backlinks

You work hard, contact many professional A-list bloggers out there, and suggest your post for an incoming link. If they provide you with a backlink, you will be happy: a high-PageRank, unpaid link, given with the free will of the giver , from the most relevant page/category is a million-dollar vote for your page. It can by itself make your page skyrocket from SERP 400 to SERP 3. Yesterday, I had a discussion in the Digital Point Forums about backlinks and their validity. It seems most people are not knowledgeable about backlink validity analysis; people know only about DoFollow and NoFollow , and nothing beyond that. Here, we will see the importance of the ‘robots.txt’ file in backlink analysis. Robots.txt is a simple link invalidation secret several professional bloggers won’t share with you. What Is Robots.txt When a search crawler accesses a website for searching and indexing, the first thing it looks for is the robots.txt file. If it doesn’t find one, it go...
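To make the invalidation concrete, here is a minimal robots.txt sketch (the `/private/` directory is a hypothetical example):

```text
# Applies to all crawlers
User-agent: *
# Pages under /private/ are never crawled, so a backlink placed on
# one of those pages is never even seen by the search engine:
# the link exists for humans, but passes you nothing
Disallow: /private/
```

So before exchanging or purchasing a link, fetch the linking site's robots.txt (it always lives at the site root, e.g. site.com/robots.txt) and check that the page carrying your link is not inside a disallowed path.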

Why I Love Woopra: A Review of the Revolutionary Web Analytics Tool!

Woopra is a new entrant in the sprawling array of web analytics tools, led by Google Analytics . Several innovative analytics tools already exist, such as Crazy Egg with its click heat map overlay , and Sitemeter, Feedjit, and Statcounter with live analytics , but I couldn’t find one better equipped than Woopra. Here we will see Woopra's features through various screenshots; please click a screenshot to enlarge it. To learn more about web & digital data analytics, visit ZoomOwl . Woopra is currently in its Beta stage , and provides free site traffic analytics . Right now, the sites allowed should not have more than ten thousand visits daily ; the program simply doesn’t count any visit above that limit. Also, signing up for Woopra may involve a long wait, since they are clogged with new sign-ups. I recently got approved by them for Beta testing. Woopra is loaded with features not found in normal analytics tools. Unlike other analytics t...