Wednesday, November 9, 2011

Why should brands have profiles on Google Plus?

I just want to make you aware of some news from the search world and present the key highlights from a post written by Google's Senior VP of Engineering, Vic Gundotra.

Facebook and Google continue their race for domination of the digital world. Facebook's advantage is its number of active users and its access to an incredible amount of data. Google's advantage is its insight into people's search behaviour and its ability to rate the credibility and PageRank of every website. Google Plus was created to compete with Facebook, and Google has just announced two new features which are extremely important from a search perspective. First, brand profiles and activities created on the Google Plus network will be indexed faster, which gives a huge advantage over a Facebook Fan Page, where you can really only rank with one properly optimized URL. It's an important change, and we should not ignore it. Second, Google has introduced a new search feature called "Direct Connect": a user can type "+" followed by the name of the page he/she wants to "stay in touch" with, and from then on will have a direct connection to it.



Why key brands should have profiles on Google Plus:

1. To own at least one more position on the Google search results "shelf" and be there when the consumer is looking for us
2. To test all new features presented by Google
3. To be the leader of innovations in digital

You can access the full article by clicking here: http://googleblog.blogspot.com/2011/11/google-pages-connect-with-all-things.html?spref=fb



Sunday, October 30, 2011

Understanding Customer Behavior through PPC data

You should always look for trends when you look at your web metrics data. Focusing on aggregate numbers alone just doesn't make sense. For example, knowing that I get 1,000 unique visitors to my website in a month won't help me much. But knowing where they come from, what time of day and which days of the month (weekends or weekdays) they visit, what content they look at, and how much of it they consume is really meaningful.

Always focus on customer behavior rather than aggregate metrics. You can do some cool PPC analysis by applying that principle.
The setup is quite simple. Let's measure how long it takes for someone to convert from the first time they visit your website. The report is called Days to Purchase. Look at the figure below. You'll notice, top right, that it is segmented to show only Paid Traffic.

This data is for a travel website. It might strike you as odd that only 46 percent of the people make a purchase on the same day, given that airline tickets, hotels, and cruises all tend to get more expensive with each passing day. That's odd behavior for Visitors.
OK, let's find out whether we see the same behavior with traffic from other sources. The figure below shows the report for Direct Traffic.


Whoa! With Direct Traffic, 71 percent purchase on the same day. That is normal behavior for Visitors for a travel site. So, what is up with the PPC traffic? Why don't they behave like normal people? If the behavior from PPC traffic is so odd, then how can we treat them better or differently?
These are all great questions, and they're the reason why I recommend this type of analysis. Each traffic stream is unique!
In this case, the travel website took this data and reviewed the landing pages for its PPC campaigns. At the time of the previous report, it was geared toward converting Visitors quickly ("Buy now! Book now! Give us your money, now!").
The first action the marketers took was to soften the calls to action, because they realized a good chunk of the traffic does not want to buy right away. Then the marketers added a new feature, Save Your Itinerary. They realized Visitors would come back, so they might as well make it easy for them.
Finally, the marketers added another feature: "Email me if price goes up by x percent," where x was a number that the Visitors could input when saving an itinerary. This was a very clever move because the travel agency then had the contact information for the Visitors and could email them when the price went up by 10 percent or 20 percent or whatever number the customer input. This caused the customers to return and make a purchase sooner, and the customers were happier because they felt the site was watching out for them.
The net return of closely analyzing PPC customer behavior was that it pulled Conversions forward to fewer days to purchase and tripled Conversion Rates. That's not too shabby for a simple segmented report, right?

Wednesday, October 26, 2011

Internal Site Search Analysis and points to consider


Almost all web analytics clicks data is missing one key ingredient: customer intent.


The keywords that people type into search engines such as Google, Bing, and Yahoo! contain a modicum of intent. The real gold is the search engine that you surely have on your website. What? No, you don't? You are the last website on the planet not to have an internal site search engine?


If you directly understand the intent of Visitors on your site, you can better understand the causes of its success or failure.


Here's an example. You can look at the top 10 most viewed pages on your site, but that only tells you what people saw, not necessarily what they wanted. How would you know which pages your Visitors actually wanted to see? If Visitors can't find those pages, your web analytics tool won't record that intent.


One way to overcome the challenge of intent is to look at your internal site search data and see what customers typed into your site search engine. You should perform three clusters of actionable analysis with your internal site search data: site search usage, site search quality, and segmenting.

You need to create customised reports to understand the following:

· What customers typed into your site search engine
o A report of internal site search keyword usage, broken down by month
o Were there any seasonal effects or seasonal searches?

· Your Site Search Usage 
o How much is the search function used? Out of the total Visitors coming to the site, how many actually use this function?

· Internal Site Search Quality
o  Find out whether your site search engine delivers quality results
o Bounce Rate for site search
o Bounce Rate for each keyword searched
o Another way to think about search quality is to measure the number of search results pages viewed per search, i.e. Results Page Views/Search (see the worked example after this list)
o Time After Search (i.e. time spent on our website after doing the search)
o Search Depth (number of pages viewed after searching your site)
o % of Search Refinements (Search Refinements helps understand how Visitors refine their queries to get optimal results)
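To make the usage and quality metrics concrete, here's a worked example with made-up numbers: if your site gets 100,000 Visits in a month and 12,000 of them include at least one internal search, your site search usage is 12 percent. If those 12,000 searching Visits view 30,000 search results pages in total, Results Page Views/Search is 30,000 / 12,000 = 2.5, which may hint that Visitors aren't finding what they need on the first page of results.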

That wasn't so hard, was it? You can do three simple and effective types of analysis on one of the most valuable sources of data in your possession. So why wait? Open your web analytics tool and start creating your customized reports.


Sunday, October 23, 2011

Points to consider when optimising for Google Panda

Google loves awesome content. It also loves quality search results, and it loves big money. Bad search results mean Google earns less money, because the results will not be relevant to the user's query. So if your site is of lower quality or offers a poor user experience and yet ranks well, somebody visits your site through a Google search result, is unhappy, and that costs Google money: people become less likely to use Google for their queries, and that makes Google sad. To overcome this situation, Google made an update to its algorithm.

This new update is called Google Panda. Google Panda is an algorithm change which looks at a handful of signals:

a) Do you have thin or otherwise low quality content?

b) Do you have duplicate content either onsite or cross site?

c) Do you have too many advertisements or affiliate links relative to the amount of content on your site?

d) Do you have links from devalued domains?

e) Do you have slow loading pages which impact user experience?

Above are a few of the things Panda focuses on when deciding where to rank your site. At the end of the day, Panda can be summarized in one word, and that's "Usability". Google will continue to look for sites which offer the best user experience and will penalize sites which don't address the points above, so your site will definitely not rank well under this new algorithm update if you don't start addressing these issues. Re-evaluate your site and ask yourself the questions above. If the answer is yes to any one of them, you have some SEO work to do, my friend.

Sunday, October 9, 2011

Why is SEO important for companies like P&G?

A friend and colleague recently asked me why SEO is important. I sent him to a series of articles I recently wrote that starts with the basics of “What is SEO“.

He read it, and came back asking the same question: "Your article says WHAT Search Engine Optimization is, but I'm still not sure WHY it's important. Or, let me rephrase: why is SEO important for big companies like P&G? They spend millions of dollars marketing their products, so why should they even bother?"
Now I was the one scratching my head. I thought the answer was obvious. But, maybe you have the same question. So I tried approaching the topic from his perspective, and realized there is more he may need to understand about marketing in general.
When a consumer decides to purchase a product, there is a typical purchase cycle (see diagram).




Every marketing channel has a different role to play in this purchase cycle. At the highest level, there are two types of media: Creative and Directive.
Creative Media:
Creative types of media are intrusive, meaning the advertising is not user initiated.
Common types of creative media would be:
a) newspaper
b) radio
c) television
d) billboards
e) vehicle wraps
f) online advertising
g) etc.
This type of media has many purposes that for the most part revolve around awareness building, creating interest, brand or product positioning, or branding.
Directive Media:
Directive types of media, on the other hand, are most useful once the consumer has been made aware of the product and has already recognized they have a need (awareness made possible by the creative media channels).
So in reality, directive media is user initiated, i.e. phase three in the diagram (evaluation!). It is during this stage that SEO is very important. The consumer has seen the ad on television or in print (or any other medium), wants more information about it, and the next thing you know they are searching online.
The Synergies:
SEO has the ability to improve the effectiveness of every other type of marketing campaign. So if you're going to advertise your product, make sure your website is well optimized before the TV channels start airing your ads. Equally important is the concept that all other media have the ability to positively impact results from SEO.
The Issue:
The problem with creative media is that it works only as long as you are working. As soon as you stop investing in other media types, or stop paying for ads, the traffic to your website stops. That's why SEO is so golden: when you stop working, the elements you have set in place keep working to attract new traffic to your site. And new traffic means potential income.
Question for you:
Would you advise your client to use marketing techniques that they have to continually work at, or techniques that continually work for them?
Hopefully you chose the second. P&G is going to market its products anyway, right? Why not market in a way that will actually increase your website traffic, not just maintain the status quo?
Don't fret, SEO isn't the big mysterious monster people make it out to be. Our goal is to help marketers refine and expand their marketing efforts. It's important to realise SEO is "ongoing": it never stops. Many brands have yet to realize the consequences of going dark in search. When managing budget challenges, stopping search activities may seem like a low-risk decision; however, pausing search support can have major impacts on P&G's business. "Going dark" in search essentially throws away the dollars previously invested and the success gained.

Sunday, July 31, 2011

How to integrate Email with SEO and PPC?

Ok, so I was in Geneva this month visiting my client and we got into a very interesting conversation: how to integrate Email with SEO and PPC marketing? The conversation started as we were discussing integrated marketing and tighter control of similar messaging across marketing channels. It is common knowledge that search and email are the most commonly used interactive channels. But can they be truly integrated? Do we have any examples of companies doing this successfully? Most importantly, is this simple and low risk?

Firstly, ask yourself this question: why integrate Email and Search? To me, there are several reasons:

(a) You will have synchronized messaging across channels which creates a consistent cross-channel experience
(b) Most marketers already use both channels so why not use them in tandem
(c) Email and Search are likely managed by close colleagues. This allows for easy co-planning.

So how do we integrate Email with SEO and PPC?

Your PPC campaigns should drive email registrations. Drive paid-search clicks to landing pages that capture email addresses and contact information. Having specific landing pages linked to different keyword categories will give you rich information about how and why a customer is visiting your Web site. For example, Coca-Cola turns a branded search query into an action that drives registrations for its My Coke Rewards loyalty program. (See Figure below)
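The capture form itself can be very simple. Here's a minimal sketch (the form endpoint and field names are placeholders; the hidden field records which campaign sent the click, so the signup can be tied back to a keyword category):

<form action="/subscribe" method="post">
  <label for="email">Email address:</label>
  <input type="email" id="email" name="email" required>
  <!-- hypothetical hidden field tying the signup to the PPC campaign -->
  <input type="hidden" name="source" value="ppc-brand-keywords">
  <button type="submit">Sign me up</button>
</form>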


Use PPC data to inform email strategies. Search data contains valuable information about customer intent that you can use to help target email content. You can customise your follow-up emails based on the keyword and ad that drove the lead in.

Use Email content to improve your website SEO. Marketers can repurpose email newsletters into regularly updated, keyword-rich site content, which search engines love. Boost your natural results by posting past issues of your email newsletter to your site.

Test your offer value and copy on both Email and PPC. Seeing which message content, creative, or offers perform best in search can inform email executions, and vice versa. Use customer search data to understand which ads appeal to different personas. Then customize email communication streams based on that information.

Coordinate your campaign's message on both channels. Timing keyword buys with email messaging will promote email content and reinforce brand messages in multiple media. Crate and Barrel coordinates seasonal email promotion copy with its keyword investment. (See Figure below)


Integrated marketing has long been a goal of marketers, but it's a good idea that rarely happens. I strongly believe you should start small when coordinating efforts. Starting small requires only simple changes to current programs, with little incremental cost, and can set interactive marketers up for broader integration.

Thursday, June 30, 2011

What is HTML, CSS and Javascript?

Right, quick post today... on a topic I feel will help you get your web development basics clear. Before diving into SEO techniques it is important to know what HTML, CSS and JavaScript are. I have gone through countless checklists while trying to understand search engine optimization, and every checklist says you should know the bare minimum of HTML, how to code a basic website, and so on. I'm not going to focus on the development bit, but what I do want to explain is how HTML, CSS and JavaScript work together in web pages.

The three main technologies used to create web pages (HTML, CSS and JavaScript) each do different jobs.

  • HTML is used only for structuring content.
  • Cascading Style Sheets is used for applying all visual styles.
  • JavaScript is used for (almost) all interactive functionality, and should always be referenced in separate files, never written into the HTML.
How do these three technologies work together?

The web page you see in your browser may be a combination of structure, style and interactivity.

These jobs are undertaken by three different technologies (HTML, CSS and JavaScript), which your browser knows how to interpret.

HTML - marks the content up into different structural types, like paragraphs, blocks, lists, images, tables, forms, comments, etc.

CSS - tells the browser how each type of element should be displayed, which may vary for different media (like screen, print or a handheld device).

JavaScript - tells the browser how to change the web page in response to events that happen (like clicking on something, or changing the value in a form input).


The 2 most important things to learn about web page production are:

1. Use the correct HTML tags that mean what they say. (When people talk about "semantic" HTML, this is what they mean!)

2. Keep your HTML, CSS and JavaScript code separate. (This simply means: don't put JavaScript or CSS in your HTML. Put them in separate files, called in by your HTML pages.)
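To make rule 2 concrete, here is a minimal sketch of a page split into three files (the file names and content are placeholders):

index.html:

<!DOCTYPE html>
<html>
<head>
  <title>My first page</title>
  <!-- the CSS lives in its own file -->
  <link rel="stylesheet" href="styles.css">
</head>
<body>
  <!-- the HTML only structures the content -->
  <h1 id="greeting">Hello!</h1>
  <p>Click the heading to see JavaScript react to an event.</p>
  <!-- the JavaScript is referenced, never written into the HTML -->
  <script src="script.js"></script>
</body>
</html>

styles.css:

/* the CSS describes how elements should look */
h1 { color: navy; font-family: sans-serif; }

script.js:

// the JavaScript changes the page in response to an event
document.getElementById('greeting').addEventListener('click', function () {
  this.textContent = 'You clicked me!';
});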

If you're interested in learning how to code your website or blog, I'd suggest the Basic HTML tutorial from w3schools; look no further.

Monday, June 13, 2011

Best Practice SEO Web Design Standards 2011

SEO is all about precision. For the most part I look at SEO like an orchestra: sure, each instrument can be played on its own, but the better you get at optimizing your content, technical code and link building together, the greater the impact. All of the SEO elements work together to frame the focus of the page, improving its rankings.

This post is about the Best Practice SEO Web Design Standards. I would like to profusely thank Ania Franczak from Performance Media who has given me permission to publish her findings in this post. Below are the SEO standards to follow when designing or redesigning a website.

Highly Recommended Standards

Do not build the site as one Flash object; provide alternative HTML content using the 'progressive enhancement' methodology.

Use JavaScript as an enhancement to the basic, fully functional HTML page.

Enhance page only for capable browsers.

Do not use session IDs or dynamic parameters in URLs.

Do not use frames.

Do not create duplicate content.

Create and maintain a site map for users, plus a separate compliant Google XML sitemap, and submit it to Google Webmaster Tools (and other search engines' tools).
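For reference, a minimal XML sitemap is just a short file like this (the URL and values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <lastmod>2011-06-13</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>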

Link to the user site map from every page.

Keep URLs short; use keywords in URLs; keep URLs in local language.

Limit number of directory levels to 2 or less.

Page weight (in KB) should be as small as possible

Optimize page opening time. Use the Page Speed browser extension and Page Speed Online.

Use hosting and web analytics reporting solutions – Urchin, Google Analytics, Omniture, Webtrends, or Unica (you can use any of them, depending on your budget; note that Google Analytics is a free tool, while the rest are paid tools).

Plan an SEO-compliant migration for any redesign: use 301 (permanent) redirects, not 302 (temporary) redirects; use the 404 response code for pages that no longer exist and the 503 response code when the site is temporarily unavailable.
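If your site runs on Apache, for example, a permanent redirect can be a single line in your .htaccess file (the paths are placeholders):

Redirect 301 /old-page.html http://www.yoursite.com/new-page.html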

Use the country in the top-level domain name, e.g. rediff.co.in or yahoo.co.uk; for the US, use .com. Do not combine countries in one site.

If the website needs to be mainly in Flash, make sure it is SEO-compliant using Progressive Enhancement providing alternative HTML content

The best way to develop a site using rich content (Flash, JavaScript/Ajax, CSS) is to build it as a basic (safe) HTML site and enable rich content with Progressive Enhancement methodology

Test cookies to ensure they do not limit Search Engine crawler access.

Ensure robots.txt file allows Search Engine crawlers into website.

Use Google Webmaster Tools to monitor problems encountered by search engine robot

Recommended Standards

Create a preferred landing page (PLP) with relevant content for each of your GOLD keywords.

Do not create microsites, unless doing so is part of an SEO strategy.

Use keywords in title tags.

Build unique and descriptive titles up to 70 characters.

Hide database driven site details from URLs.

Do not use subdomains for new content on a site, use subdirectories instead.

Use keywords in body text, and use keywords in header tags (especially the H1 tag).

Use keywords in meta descriptions, use keywords in image alt tags

Limit number of directory levels to 3 or less

Emphasize keywords with bold (the strong tag) and italics (the em tag).
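Pulling the last few points together, the head and body of a page optimized for a keyword might include tags like these (all names and copy are placeholders):

<head>
  <title>Red Running Shoes | YourBrand</title>
  <meta name="description" content="Browse our range of red running shoes, with free delivery and easy returns.">
</head>
<body>
  <h1>Red Running Shoes</h1>
  <p>Our <strong>red running shoes</strong> are built for <em>speed and comfort</em>.</p>
  <img src="red-shoe.jpg" alt="Red running shoe, side view">
</body>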

Create a 12 month refresh plan to update the site

Ensure country selectors are search friendly (allow crawlers into the website) and safe (don't allow crawlers to index content from another locale).

Place important content closer to the site root and prevent spiders from seeing duplicate content (use 301 redirects, the meta robots tag/robots.txt file and the rel="canonical" tag).
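For example, a duplicate URL can point search engines at the preferred version with a single line in its head section (the URL is a placeholder):

<link rel="canonical" href="http://www.yoursite.com/preferred-page">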

Use a CMS that allows SEO modifications to dramatically reduce SEM costs.

Failure to comply with these standards will result in poor search engine rankings and an inability to remedy the issue without expensive, time-consuming, and unplanned re-work.


How to submit your URL to Search Engines?

In this post I will show you how to submit your site URL to the big three search engines. Usually, search engines' web crawlers can find most of the pages on the web. However, if you have just launched your site, or your site does not appear in your desired search engine, go ahead and use this process. It won't take much of your time and will be finished in a jiffy!

For Google - Type http://www.google.com/addurl/ in your browser. You will see the page below.

Fill in all the details and click on Add URL. Write a unique site description in the comments section. Don't make it too long; keep it under 70 characters (including spaces).

For Yahoo - Type http://search.yahoo.com/info/submit.html in your browser. You will see the page below.


Click on "Submit your site for free" then you will see the below screen - click on "Submit a website or webpage"

Then click on "Submit a Website or Webpage"


For Bing - Type http://www.bing.com/webmaster/SubmitSitePage.aspx in your browser. You will see the page below.


Fill in the funny characters (the CAPTCHA), add your home page URL and click on "Submit URL".

To summarize, search engines find the web pages they index by using software to follow links on the Web. Since the Web is huge, and always expanding and changing, it can be a while before this software finds your particular site. Therefore, it’s smart to speed up this process by manually submitting your site to search engines.

Wednesday, June 8, 2011

What is a Robots.txt file?

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. You may not want certain pages of your site crawled because they might not be useful to users if they turn up in a search engine's results. Often, I get quizzical looks during my training sessions when I talk about this subject: why would you not want search engines to index all the content on your site?

For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling; otherwise you risk incurring a duplicate content penalty. Also, if you happen to have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index those pages.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://www.supersavvyme.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.

Here are some quick things to remember:

Location: This file, which must be named "robots.txt", is placed in the root directory of your site.

URL Address: The url of your robots.txt should always be http://www.yoursite.com/robots.txt

Example:

User-agent: *
Allow: /
Sitemap: http://www.supersavvyme.com/sitemap.xml
In the above example you will see that the webmaster has allowed all search engines to crawl the site and pointed them at its XML sitemap. Let me explain in detail:

"User-agent" identifies the search engine crawlers the rules apply to
"Allow: /" lists the files and directories which the webmaster would like all search engines to crawl
* denotes all compliant search engine bots

You can also use the "Disallow" directive, which indicates to search engines the files and directories to be excluded from indexing. Below is an example:

User-agent: *
Disallow: /images/
Disallow: /search
The above example means that all compliant search engine bots (denoted by the wildcard * symbol) shouldn't access or crawl the content under /images/ or any URL whose path begins with /search.

If you do want to prevent search engines from crawling your pages, Google Webmaster Tools has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain.

For more information on robots.txt, I suggest this Webmaster Help Center guide on using robots.txt files

10 important tools for SEO and PPC

There are many tools and products on the web that can help you better understand your site and improve your search ranking. This post will assist you in configuring your website(s) for search engine optimization across the different search engines.

1) Google Adwords: Keyword Tool

What is this tool used for? - The Google Adwords Keyword Tool provides a list of related words or phrases for a specific website or keyword. This tool helps you to choose relevant and popular terms related to your selected key term.

Simply enter a key term and this tool will query information from several Google searches and generate related words or phrases.

URL: https://adwords.google.com/select/KeywordToolExternal

2) Google Adwords Traffic Estimator

What is this tool used for? - This tool helps in finding out the volume of searches being performed on Google for specific keywords, and it also estimates the bid amounts for those keywords.

URL: https://adwords.google.com/select/TrafficEstimatorSandbox

3) Google Insights for Search

What is this tool used for? - The tool offers a comprehensive set of statistics based on search volume and patterns. You can compare seasonal trends, geographic distributions, and category-specific searches, and you can group all these variables together to get extremely specific.

URL: http://www.google.com/insights/search/#

4) Google Trends

What is this tool used for? - It is a tool similar to Google Insights (though less advanced!). It allows the user to compare the volume of searches between two or more terms. An additional feature of Google Trends is its ability to show news related to the search term overlaid on the chart, showing how news events affect search popularity.

URL: http://www.google.com/trends

5) PageRank Lookup*

What is this tool used for? - It helps in reporting the PageRank of a URL and shows a historical view of previous PR scores.

URL: http://www.seomoz.org/toolbox/pagerank

6) Domain Age Check*

What is this tool used for? - This tool helps you find the age of your domain and when it was first spidered by the search engines.

URL: http://www.seomoz.org/toolbox/age


7) Check HTTP status code*


What is this tool used for? - If you have redirected your old domain to a new domain and would like to check whether the developer has implemented this correctly, use this tool. 301? 302? This tool will find out what HTTP status code a website is producing.


8) Keyword Density Tool

What is this tool used for? - The keyword density tool is useful for helping webmasters and search engine optimizers achieve their optimum keyword density for a set of key terms. Keyword density is the percentage of times a keyword or phrase appears on a web page compared to the total number of words on the page.
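As a quick worked example with made-up numbers: if a 500-word page uses the phrase "running shoes" 10 times, its keyword density for that phrase is 10 / 500 = 2 percent.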


9) GeoTargeting Detection*

What is this tool used for? - This tool will help you determine the hosting location of a website. It will tell you where your servers are physically located with your hosting company. Also, some search engines (such as Google) have the ability to filter search results based on their physical location (geotargeting). This could be used to determine why your site is showing in a certain country. This tool determines how well a site is targeted to region-specific search engines.


10) Backlinks Lookup

What is this tool used for? – This tool reports the number of backlinks to a URL and also shows a historical view of previous backlinks.



*Requires Login

Tuesday, June 7, 2011

How to use Google Webmaster tools?

In this post, I'm going to write about a tool which can help you improve the indexability of your website and ensure your site is ranked in Google. Very often, my colleagues have asked me: "What is this Google Webmaster Tools?" "What does it really do?" "How do I use it?"


Before I really get into the tool, I want you to understand why Google made it. Not all websites are coded in the same way: some use a lot of JavaScript, some are Flash based, some use frames, and so on. Google Webmaster Tools tells webmasters how to build better sites for Google's search engine.


It is virtually impossible for Google to crawl the entire web. So to make things easy, they also thought of a way for people to submit their sites, which Google then goes and crawls.


It's a win-win situation for both: they get to know about your site, and you get a chance to enter their database. Google Webmaster Tools is a free toolset provided by Google that helps you understand what's going on with your website. This way you make decisions based on data and make your site a search engine's indexing dream. That's not just good business for Google; it's good for your website too.

So now, coming back to our main topic: how do we use Google Webmaster Tools, and what should we focus on?


First thing to remember: this is a free tool, so like any other Google service you will need a Google Account. If you have a current Google Analytics account, you can use that too!


As soon as you sign in, you will see the 'Home' page, where you will be asked to add your site. Remember, you can add up to 500 sites to your account. Once you add your site, you will need to perform what Google calls "site verification" for each site submitted. This proves to Google that you are the owner of the site before they release detailed information to you. If you are not the webmaster, ask your agency to perform this step. It's fairly simple and can be done in four ways. These are:


· You add a DNS record to your domain's configuration or

· Link to your Google Analytics account or

· Upload an HTML file to your server ­or

· Add a meta tag to your site's home page
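The meta tag route, for instance, is a single line placed in the head section of your home page (the content value below is a placeholder; Google generates the real token for you when you pick this method):

<meta name="google-site-verification" content="your-unique-token-from-google" />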

Once your site has been verified, you’ll have access to a whole suite of useful tools. Click on your site and you will be taken to the ‘Dashboard’. There are 4 sections to the left:

1. Site Configuration (Information about your site)

2. Your Site on the Web (Google data re: your site)

3. Diagnostics (Any problems Google had while indexing your site)

4. Labs (your site performance and video sitemap submission)


These sections are true eye openers and they help you see your site through the eyes of a search engine. If you have to prepare a report, focus on the following:

Dashboard – This is more like a summary page. It gives you a rough overview of everything from what keywords you are ranking for to how much traffic you are getting. In addition to that you’ll see if the Google bot is experiencing any crawl errors when going through your website, the number of sites linking to yours, and how many pages Google has indexed

Site Configuration

Sitemap – This section can be found under ‘Site Configuration’. Submitting a sitemap will help Google determine what pages you have on your website so they can index them. If you don’t submit a sitemap they may not index all of the pages on your website, which means you won’t get as much traffic.

(If you would like to learn how to create a sitemap, then click on the link!)

Crawler Access - There will be some pages on your website that you just don't want Google to index. This could be private login areas, RSS feeds, or crucial data that you don't want people accessing. By creating a robots.txt file you can block not just Google, but all search engines from accessing web pages that you don't want them to get their hands on. There is also a tab which helps you generate a robots.txt file for your site. If you see that Google is indexing any of your private data, you can also ask for removal of those URLs.


Sitelinks – Sitelinks are links to a site's interior pages. Not all sites have sitelinks, but as you grow in popularity you'll naturally get them. Google generates these links automatically, but you can remove sitelinks you don't want.

Change of Address – This is a very useful tool. If you have recently swapped domains or you are looking to change the URL of your website, you had better let Google know, or else your traffic is going to decrease.

Settings – This section is used for setting up geotargeting for your website, meaning that your website is intended for users of a particular region. For example, if I have a .com website that is intended for an audience in the UK, I will go to the Settings section and choose the UK as my preferred location. This will ensure that when people in the UK search for my site, it shows up in the UK database. You can also use this section to choose how you want to display your domain name. The reason for picking one version is that people may link to different versions of your domain (e.g. abc.com or www.abc.com or www.abc.com/home.php), and by selecting one, Google will combine the links, which will help your rankings.

You can even adjust your crawl rate. If you feel that the Google bot needs to be crawling your website more often and faster, you can tell it to do so. Note – keeping the default crawl rate is preferable, as Google crawling your website too often can send too much bot traffic to your server and increase your hosting costs.

Your site on the web

Have you ever wondered how Google looks at your website? Through GWT you can see how Google views your website. After all, it’s a search engine and not a human… so naturally it won’t be able to look at a website in the same way you do.

Search Queries - I personally love this section! It tells you all the keywords which have brought traffic to your website. How many times were they searched? How many times were they clicked? If you have recently optimised your site, you can see how your keywords are performing. You can monitor the keyword performance % change too.

If you have not yet optimised your site but would like to do so, you can look at this data and make your title tag and meta description more attractive, as that is what people read before clicking through to your site.

Links to your site - This is another very useful section. Since links carry significant weightage with Google when it analyzes your site and decides what position to rank it at in its SERPs, this page will help you monitor who is linking to you. Remember, while the quantity of links is important, it is quality which matters the most. How is your site linked? What anchor texts are used? Who links to you is found under this tab.

Keywords - You may have a good idea of what keywords you want to rank for, but that may not be consistent with what Google is ranking you for. Under the keywords section you can see what keywords your website is the most related to.

You can also see which variations of each keyword are relevant to your website. For example, some people misspell keywords, and you can find out which misspellings your website is most relevant for.

Internal Links – This section tells you how your site is linked internally. If you don't link to your internal pages, they will not get as much PageRank and they won't place as well in the search listings. For example, if you want your About page to rank for your company name, make sure you link to it multiple times. In addition, this data will also help you determine which pages Google feels are the most important.

Subscriber stats - If you have a blog, this area of GWT will be useful for you. If you don’t, it won’t.

Under the subscriber stats section you can see which of your blog posts are the most subscribed to by Google’s feed reader. You can then take that data and write more posts that are similar to your popular ones. And of course, you can stop writing blog posts similar to the ones that no one subscribed to, as readers probably didn’t enjoy them as much.

Diagnostics

Through the diagnostics section, you can figure out what's wrong with your site and how you can fix it. Websites are made by humans, so don't expect them to be perfect. Your code may be a bit messed up, and even worse, your website may contain malware.

Malware - If you have malware on your server, you should see a message here. If you don't, Google Webmaster Tools won't show you much.

Crawl errors - The crawl errors section will show you if there are any problems relating to your site on the web or on a mobile phone. The most common error that you'll see is a 404 error, which means Google's bot can't find that page.

Crawl Stats - If you have thousands of pages on your site, then you should expect Google to crawl most of these pages on a daily or weekly basis. If they aren’t, then something is wrong.

Through the graphs and data tables that GWT provides, you should be able to get a good sense if they are crawling enough pages on your website. If they aren’t, consider adjusting the crawl rate under the settings tab.

Fetch as Googlebot – This feature lets you submit pages of your site and get real-time feedback on what Googlebot sees. This feature will help you a great deal when you re-implement your site with a new technology stack, find out that some of your pages have been hacked, or want to understand why they're not ranking for specific keywords.

HTML Suggestions - When Googlebot crawls your site, it may find some issues with your content. These issues won’t prevent your site from appearing in Google search results, but addressing them may help boost your traffic.

The most common problem is related to title tags and meta descriptions.

Labs

Google regularly tests new features out. The easiest way to find out about these new features is to go through the labs sections.

Instant Preview - Instant Previews are page snapshots that are displayed in search results. In general, Google generates these preview images at crawl time. If the Instant Preview looks different from what users see, it could indicate Google is having difficulty crawling your page.

Use this tool to compare an Instant Preview with your live page. This is a great tool as it instantly shows you the fetching errors.

Site Performance - This section shows you performance statistics for your site. You can use this information to improve the speed of your site and create a faster experience for your users. Numerous website owners have seen an increase in their traffic after improving their website load time.

Video Sitemaps - If you have video on your site, you want to make sure you include those raw video files in your sitemap. This way, Google can index them, as it may not be able to find them otherwise.

This will help ensure that your videos are getting the traffic they deserve from Google video search.
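As a rough sketch, a video entry adds a video namespace and a few extra tags to the standard sitemap format (all URLs and text below are placeholders; check Google's documentation for the full list of supported tags):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.yoursite.com/videos/product-tour.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.yoursite.com/thumbs/product-tour.jpg</video:thumbnail_loc>
      <video:title>Product tour</video:title>
      <video:description>A short tour of the product.</video:description>
      <video:content_loc>http://www.yoursite.com/videos/product-tour.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>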

Conclusion

If you aren’t making use of Google Webmaster Tools, you should start doing so now. The reason it’s worth using is because it will help guide you and tell you what to do if you want to improve your search engine traffic.