Thursday, June 30, 2011

What are HTML, CSS and JavaScript?

Right, quick post today... on a topic that should get your web development basics clear. Before diving into SEO techniques, it is important to know what HTML, CSS and JavaScript are. I have gone through countless checklists while trying to understand search engine optimization, and every checklist says you should know at least the bare minimum of HTML, how to code a basic website and so on. I'm not going to focus on the development bit; what I do want to explain is how HTML, CSS and JavaScript work together in web pages.

The three main technologies used to create web pages (HTML, CSS and JavaScript) each do different jobs.

  • HTML is used only for structuring content.
  • CSS (Cascading Style Sheets) is used for applying all visual styles.
  • JavaScript is used for (almost) all interactive functionality, and should always be referenced in separate files, never written into the HTML.

How do these three technologies work together?

The web page you see in your browser may be a combination of structure, style and interactivity.

These jobs are undertaken by three different technologies: HTML, CSS and JavaScript, all of which your browser knows how to interpret.

HTML - marks the content up into different structural types, like paragraphs, blocks, lists, images, tables, forms, comments etc.

CSS - tells the browser how each type of element should be displayed, which may vary for different media (like screen, print or a handheld device)

JavaScript - tells the browser how to change the web page in response to events that happen (like clicking on something, or changing the value in a form input)


The 2 most important things to learn about web page production are:

1. Use the correct HTML tags that mean what they say. (When people talk about "semantic" HTML, this is what they mean!)

2. Keep your HTML, CSS and JavaScript code separate. (This simply means: don't put JavaScript or CSS in your HTML. Put them in separate files, called in by your HTML pages.)
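To make this concrete, here is a minimal sketch of that separation (the file names styles.css and script.js are placeholders for illustration):

    <!DOCTYPE html>
    <html>
    <head>
      <title>My Page</title>
      <!-- All visual styling lives in a separate stylesheet -->
      <link rel="stylesheet" href="styles.css">
    </head>
    <body>
      <!-- Semantic HTML: tags that say what the content is -->
      <h1>Welcome</h1>
      <p>This paragraph is structured by HTML and styled by CSS.</p>
      <!-- All behaviour lives in a separate script file -->
      <script src="script.js"></script>
    </body>
    </html>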

If you're interested in learning how to code your website or blog, I'd suggest the Basic HTML tutorial from w3schools; look no further.

Monday, June 13, 2011

Best Practice SEO Web Design Standards 2011

SEO is all about precision. For the most part I look at SEO like an orchestra: sure, each instrument can be played on its own, but the better you get at optimizing your content, technical code and link building together, the greater the impact. All of the SEO elements work together to frame the focus of the page, improving its search rankings.

This post is about the Best Practice SEO Web Design Standards. I would like to profusely thank Ania Franczak from Performance Media who has given me permission to publish her findings in this post. Below are the SEO standards to follow when designing or redesigning a website.

Highly Recommended Standards

Do not build the site as one single Flash object; provide alternative HTML content using a 'progressive enhancement' methodology.

Use JavaScript as an enhancement to the basic, fully functional HTML page.

Enhance the page only for capable browsers.
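Since progressive enhancement comes up repeatedly in these standards, here is a hedged sketch of what it means in practice: a plain HTML link that works in every browser, enhanced with JavaScript only when the browser can run it (all names and copy here are invented for illustration):

    <a id="gallery-link" href="gallery.html">View gallery</a>

    <script>
      // Enhancement layer: runs only when JavaScript is available.
      // Browsers without JavaScript simply follow the plain link above,
      // so the basic HTML page stays fully functional.
      var link = document.getElementById('gallery-link');
      if (link) {
        link.onclick = function () {
          // Richer behaviour would go here (e.g. an inline slideshow);
          // as a placeholder we just show that the enhancement ran.
          link.innerHTML = 'View gallery (enhanced)';
          return false; // cancel default navigation only when enhanced
        };
      }
    </script>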

Do not use session ids and dynamic parameters in URLs.

Do not use frames.

Do not create duplicate content.

Create & maintain a site map for users and a separate compliant Google XML sitemap, and submit the latter to Google Webmaster Tools (and other search engines' tools).
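For reference, a minimal XML sitemap sketch (the URL and dates below are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2011-06-13</lastmod>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>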

Link to the user site map from every page.

Keep URLs short; use keywords in URLs; keep URLs in local language.

Limit number of directory levels to 2 or less.

Page weight (in KB) should be as small as possible.

Optimize page load time. Use the Page Speed browser extension and Page Speed Online.

Use hosting and web analytics reporting solutions – Urchin, Google Analytics, Omniture, Webtrends, or Unica (use any, depending upon your budget; note that Google Analytics is a free tool, while the rest are paid).

Plan an SEO-compliant migration for any redesign - use 301 (permanent) redirects, not 302 (temporary) redirects; use a 404 response code for pages that no longer exist and a 503 response code when the site is temporarily unavailable.
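On an Apache server, for instance, a permanent redirect can be declared in the site's .htaccess file; a minimal sketch with placeholder paths:

    # 301 (permanent) redirect from a retired page to its replacement
    Redirect 301 /old-page.html http://www.example.com/new-page.html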

Use the country in top level domain names, e.g. rediff.co.in, yahoo.co.uk; for the US, use .com. Do not combine countries in one site.

If the website needs to be mainly in Flash, make sure it is SEO-compliant by using progressive enhancement to provide alternative HTML content.

The best way to develop a site using rich content (Flash, JavaScript/Ajax, CSS) is to build it as a basic (safe) HTML site and enable the rich content with the progressive enhancement methodology.

Test cookies to ensure they do not limit Search Engine crawler access.

Ensure robots.txt file allows Search Engine crawlers into website.

Use Google Webmaster Tools to monitor problems encountered by search engine robots.

Recommended Standards

Create a preferred landing page (PLP) with relevant content for each of your GOLD keywords.

Do not create microsites, unless doing so is part of a deliberate SEO strategy.

Use keywords in title tags.

Build unique and descriptive titles up to 70 characters.

Hide database driven site details from URLs.

Do not use subdomains for new content on a site, use subdirectories instead.

Use keywords in body text; use keywords in header tags (especially the H1 tag).

Use keywords in meta descriptions; use keywords in image alt attributes.

Limit number of directory levels to 3 or less

Emphasize keywords with bold (strong tag) & italics (em tag).
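To illustrate several of the on-page recommendations above in one place, here is a hedged HTML sketch (the keyword "blue widgets" and all copy are invented):

    <head>
      <!-- Unique, descriptive title under 70 characters, keyword first -->
      <title>Blue Widgets - Buy Handmade Blue Widgets Online</title>
      <meta name="description"
            content="Shop our range of handmade blue widgets with free delivery.">
    </head>
    <body>
      <h1>Handmade Blue Widgets</h1>
      <p>Our <strong>blue widgets</strong> are built to last, and every
      <em>widget</em> ships within 24 hours.</p>
      <img src="blue-widget.jpg" alt="Handmade blue widget">
    </body>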

Create a 12-month refresh plan to update the site.

Ensure country selectors are search friendly (allow crawlers into the website) and safe (don't allow crawlers to index content from another locale).

Place important content closer to the site root and prevent spiders from seeing duplicate content (use 301 redirects, the meta robots tag/robots.txt file and the rel="canonical" tag).
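For example, a printer-friendly duplicate of a page can point search engines at the preferred version, or opt out of the index entirely; a sketch with placeholder URLs:

    <!-- On the duplicate page: declare the preferred URL -->
    <link rel="canonical" href="http://www.example.com/preferred-page.html">

    <!-- Or keep this page out of the index altogether -->
    <meta name="robots" content="noindex, follow">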

Use a CMS that allows SEO modifications to dramatically reduce SEM costs.

Failure to comply with these standards will result in poor search engine rankings & an inability to remedy the issue without expensive, time-consuming & unplanned re-work.


How to submit your URL to Search Engines?

In this post I will show you how to submit your site URL to the big three search engines. Usually, search engines' web crawlers can find most of the pages on the web. However, if you have just launched your site, or your site does not appear in your desired search engine, go on and use this process. It won't take much of your time and will be finished in a jiffy!

For Google - go to http://www.google.com/addurl/ in your browser.

Fill in all the details and click on Add URL. Write a unique site description in the comments section. Don't make it too long; keep it under 70 characters (including spaces).

For Yahoo - go to http://search.yahoo.com/info/submit.html in your browser. Click on "Submit Your Site for Free", then on the next screen click on "Submit a Website or Webpage" and add your URL.


For Bing - go to http://www.bing.com/webmaster/SubmitSitePage.aspx in your browser. Fill in the funny characters (the CAPTCHA), add your home page URL and click on "Submit URL".

To summarize, search engines find the web pages they index by using software to follow links on the Web. Since the Web is huge, and always expanding and changing, it can be a while before this software finds your particular site. Therefore, it’s smart to speed up this process by manually submitting your site to search engines.

Wednesday, June 8, 2011

What is a Robots.txt file?

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. You may not want certain pages of your site crawled because they might not be useful to users if they turn up in a search engine's results. Often, I get quizzical looks during my training sessions when I talk on this subject: why would you not want search engines to index all the content of your site?

For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling; otherwise you risk a duplicate content penalty. Also, if you happen to have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index those pages.

The location of robots.txt is very important. It must be in the main directory, because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://www.supersavvyme.com/robots.txt) and if they don't find it there, they simply assume that the site does not have a robots.txt file and therefore index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised when search engines index your whole site.

Here are some quick things to remember:

Location: This file, which must be named "robots.txt", is placed in the root directory of your site.

URL Address: The url of your robots.txt should always be http://www.yoursite.com/robots.txt

Example:

    User-agent: *
    Allow: /
    Sitemap: http://www.supersavvyme.com/sitemap.xml

In the above example you will see the webmaster has allowed all search engines to crawl the whole site and pointed them to its XML sitemap. I will explain in detail -

"User-agent" identifies the search engine crawlers the rules apply to
Allow: / lists the files and directories which the webmaster would like all search engines to crawl
* denotes all compliant search engine bots
Sitemap: tells crawlers where to find the site's XML sitemap

You can also use the Disallow syntax, which indicates to search engines the files and directories to be excluded from indexing. Below is an example:

    User-agent: *
    Disallow: /images/
    Disallow: /search

The above example means that all compliant search engine bots (denoted by the wildcard * symbol) shouldn't access or crawl the content under /images/ or any URL whose path begins with /search.

If you do want to prevent search engines from crawling some of your pages, Google Webmaster Tools has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages on a particular subdomain not crawled, you'll have to create a separate robots.txt file for that subdomain.

For more information on robots.txt, I suggest this Webmaster Help Center guide on using robots.txt files.

10 important tools for SEO and PPC

There are many tools and products on the web that can help you better understand your site and improve your search ranking. This post will assist you in configuring your website(s) for search engine optimization within the different search engines.

1) Google Adwords: Keyword Tool

What is this tool used for? - The Google Adwords Keyword Tool provides a list of related words or phrases for a specific website or keyword. This tool helps you to choose relevant and popular terms related to your selected key term.

Simply enter a key term and this tool will query information from several Google searches and generate related words or phrases.

URL: https://adwords.google.com/select/KeywordToolExternal

2) Google Adwords Traffic Estimator

What is this tool used for? - This tool helps in finding out the volume of searches being performed in Google for specific keywords, and also calculates the bid amounts for these keywords.

URL: https://adwords.google.com/select/TrafficEstimatorSandbox

3) Google Insights for Search

What is this tool used for? - The tool offers a comprehensive set of statistics based on search volume and patterns. You can compare seasonal trends, geographic distributions, and category-specific searches, and you can group all these variables together to get extremely specific.

URL: http://www.google.com/insights/search/#

4) Google Trends

What is this tool used for? - It is a tool similar to Google Insights (though less advanced!). It allows the user to compare the volume of searches between two or more terms. An additional feature of Google Trends is its ability to show news related to the search term overlaid on the chart, showing how news events affect search popularity.

URL: http://www.google.com/trends

5) PageRank Lookup*

What is this tool used for? - It reports the PageRank of a URL and shows a historical view of previous PR scores.

URL: http://www.seomoz.org/toolbox/pagerank

6) Domain Age Check*

What is this tool used for? - This tool helps you find the age of your domain and when it was first spidered by the search engines.

URL: http://www.seomoz.org/toolbox/age


7) Check HTTP status code*


What is this tool used for? - If you have redirected your old domain to a new domain and would like to check whether the developer has implemented this correctly, use this tool. 301? 302? It will find out what HTTP status code a website is producing.


8) Keyword Density Tool

What is this tool used for? - The keyword density tool is useful for helping webmasters and search engine optimizers achieve their optimum keyword density for a set of key terms. Keyword density is the percentage of times a keyword or phrase appears on a web page compared to the total number of words on the page.
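As a rough sketch of the calculation (in JavaScript, with invented numbers):

    // Keyword density = (keyword occurrences / total words) * 100
    function keywordDensity(occurrences, totalWords) {
      return (occurrences / totalWords) * 100;
    }

    // e.g. a 500-word page that uses its key term 10 times
    // has a keyword density of 2%
    console.log(keywordDensity(10, 500) + '%'); // prints "2%"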


9) GeoTargeting Detection*

What is this tool used for? - This tool will help you determine the hosting location of a website. It will tell you where your servers are physically located with your hosting company. Also, some search engines (such as Google) have the ability to filter search results based on their physical location (geotargeting). This could be used to determine why your site is showing in a certain country. This tool determines how well a site is targeted to region-specific search engines.


10) Backlinks Lookup

What is this tool used for? – This tool reports the number of backlinks to a URL and also shows a historical view of previous backlinks.



*Requires Login

Tuesday, June 7, 2011

How to use Google Webmaster tools?

In this post, I'm going to write about a tool which can help you improve the indexability of your website and ensure your site is ranked in Google. Very often, my colleagues have asked me: "What is this Google Webmaster Tools?" "What does it really do?" "How do I use it?"


Before I really get into the tool, I want you to understand why Google made it. Not all websites are coded in the same way. Some use a lot of JavaScript, some are Flash based, some use frames and so on. Google Webmaster Tools tells the webmaster how to build better sites for Google's search engine.


It is virtually impossible for Google to crawl the entire web. So, to make things easy, they also thought of a way wherein people can submit their sites and Google then goes and crawls them.


It's a win-win situation for both: they get to know about your site and you get a chance to enter their database. Google Webmaster Tools is a free toolset provided by Google that helps you first understand what's going on with your website. This way you make decisions based on data and make your site a search engine's indexing dream. That's not just good business for Google, it's good for your website too.

So now, coming back to our main topic: how do we use Google Webmaster Tools and what should we focus on?


First thing to remember is that this is a free tool, so like any other Google service you will need a Google Account. If you have a current Google Analytics account, you can use that too!


As soon as you sign in, you will see the 'Home' page, where you will be asked to add your site. Remember, you can add up to 500 sites to your account. Once you add your site, you will need to perform what Google calls "site verification" for each site submitted. This proves to Google that you are the owner of the site before they release detailed information to you. If you are not the webmaster, then ask your agency to perform this step. It's fairly simple and can be done in 4 ways. These are:


· You add a DNS record to your domain's configuration or

· Link to your Google Analytics account or

· Upload an HTML file to your server ­or

· Add a meta tag to your site's home page
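For the meta tag route, Google generates a snippet for you to paste into your home page's head section; it looks roughly like this (the content token below is a made-up placeholder - use the one Google gives you):

    <meta name="google-site-verification" content="your-unique-token-here">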

Once your site has been verified, you’ll have access to a whole suite of useful tools. Click on your site and you will be taken to the ‘Dashboard’. There are 4 sections to the left:

1. Site Configuration (Information about your site)

2. Your Site on the Web (Google data re: your site)

3. Diagnostics (Any problems Google had while indexing your site)

4. Labs (your site performance and video sitemap submission)


These sections are true eye openers and they help you see your site through the eyes of a search engine. If you have to prepare a report, focus on the following:

Dashboard – This is more like a summary page. It gives you a rough overview of everything, from what keywords you are ranking for to how much traffic you are getting. In addition to that, you'll see whether the Google bot is experiencing any crawl errors when going through your website, the number of sites linking to yours, and how many pages Google has indexed.

Site Configuration

Sitemap – This section can be found under 'Site Configuration'. Submitting a sitemap will help Google determine what pages you have on your website so they can index them. If you don't submit a sitemap, they may not index all of the pages on your website, which means you won't get as much traffic.

(If you would like to learn - how to create a sitemap then click on the link!)

Crawler Access - There will be some pages on your website that you just don't want Google to index. These could be private login areas, RSS feeds, or crucial data that you don't want people accessing. By creating a robots.txt file you can block not just Google but all search engines from accessing web pages that you don't want them to get their hands on. There is also a tab which helps you generate a robots.txt file for your site. If you see that Google is indexing any of your private data, you can also ask for removal of those URLs.


Sitelinks – Sitelinks are links to a site's interior pages. Not all sites have sitelinks, but as you grow in popularity you'll naturally get them. Google generates these links automatically, but you can remove sitelinks you don't want.

Change of Address – This is a very useful tool. If you have recently swapped domains or you are looking to change the URL of your website, you had better let Google know, or else your traffic is going to decrease.

Settings – This section is used for setting up geotargeting for your website. This means that your website is for users of a particular region. For example, I have a .com website but it is intended for an audience in the UK, so I go to the Settings section and choose the UK as my preferred location. This will ensure that when people in the UK search for my site, it shows up in Google's UK results. You can also use this section to choose how you want to display your domain name. The reason for picking one version is that people may link to different versions of your domain (e.g. abc.com or www.abc.com or www.abc.com/home.php), and by selecting one, Google will combine the links, which will help your rankings.
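To funnel visitors and crawlers to the preferred version yourself, a common companion step on Apache servers is a 301 rewrite; a hedged sketch reusing the abc.com placeholder from above:

    # Send all non-www requests to the www version with a permanent redirect
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^abc\.com$ [NC]
    RewriteRule ^(.*)$ http://www.abc.com/$1 [R=301,L]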

You can even adjust your crawl rate. If you feel that the Google bot needs to be crawling your website more often and faster, you can tell it to do so. Note – keeping the default crawl rate is preferable, as Google crawling your website too often can send too much bot traffic to your server and increase your hosting costs.

Your site on the web

Have you ever wondered how Google looks at your website? Through GWT you can see how Google views your website. After all, it’s a search engine and not a human… so naturally it won’t be able to look at a website in the same way you do.

Search Queries - I personally love this section! It tells you all the keywords which have brought traffic to your website. How many times were they searched? How many times were they clicked? If you have recently optimised your site, you can see how your keywords are performing. You can monitor the keyword performance % change too.

If you have not yet optimised your site but would like to do so, you can look at this data and make your title tag and meta description more attractive, as that is what people read before clicking through to your site.

Links to your site - This is another very useful section. Since links carry significant weight when Google analyzes your site and decides what rank to position it at in its SERPs, this page will help you monitor who is linking to you. Remember, while the quantity of links is important, it is quality which matters the most. How is your site linked? What anchor texts are used? Who links to you? All of this is found under this tab.

Keywords - You may have a good idea of what keywords you want to rank for, but that may not be consistent with what Google is ranking you for. Under the keywords section you can see what keywords your website is the most related to.

You can also see which variations of each keyword are relevant to your website. For example, some people misspell keywords, and you can find out which misspellings your website is most relevant for.

Internal Links – This section tells you how your site is linked internally. If you don't link to your internal pages, they will not get as much PageRank and they won't place as well in the search listings. For example, if you want your about page to rank for your company name, make sure you link to it multiple times. In addition to that, this data will also help you determine which pages Google feels are the most important.

Subscriber stats - If you have a blog, this area of GWT will be useful for you. If you don’t, it won’t.

Under the subscriber stats section you can see which of your blog posts are the most subscribed to by Google’s feed reader. You can then take that data and write more posts that are similar to your popular ones. And of course, you can stop writing blog posts similar to the ones that no one subscribed to, as readers probably didn’t enjoy them as much.

Diagnostics

Through the diagnostics section, you can figure out what's wrong with your site and how you can fix it. Websites are made by humans, so don't expect them to be perfect. Your code may be a bit messed up, and even worse, your website may contain malware.

Malware - If you have malware on your server, you should see a message here. If you don't, Google Webmaster Tools won't show you much.

Crawl errors - The crawl errors section will show you if there are any problems relating to your site on the web or on a mobile phone. The most common error that you'll see is a 404 error, which means Google's bot can't find that page.

Crawl Stats - If you have thousands of pages on your site, then you should expect Google to crawl most of these pages on a daily or weekly basis. If they aren’t, then something is wrong.

Through the graphs and data tables that GWT provides, you should be able to get a good sense if they are crawling enough pages on your website. If they aren’t, consider adjusting the crawl rate under the settings tab.

Fetch as Googlebot – This feature lets you submit pages of your site and get real-time feedback on what Googlebot sees. This feature will help you a great deal when you re-implement your site with a new technology stack, find out that some of your pages have been hacked, or want to understand why they're not ranking for specific keywords.

HTML Suggestions - When Googlebot crawls your site, it may find some issues with your content. These issues won’t prevent your site from appearing in Google search results, but addressing them may help boost your traffic.

The most common problem is related to title tags and meta descriptions.

Labs

Google regularly tests out new features. The easiest way to find out about these new features is to go through the Labs section.

Instant Preview - Instant Previews are page snapshots that are displayed in search results. In general, Google generates these preview images at crawl time. If the Instant Preview looks different from what users see, it could indicate Google is having difficulty crawling your page.

Use this tool to compare an Instant Preview with your live page. This is a great tool as it instantly shows you the fetching errors.

Site Performance - This section shows you performance statistics for your site. You can use this information to improve the speed of your site and create a faster experience for your users. Numerous website owners have seen a positive increase in their traffic by improving their website load time.

Video Sitemaps - If you have video on your site, you want to make sure you include those raw video files in your sitemap. This way, Google can index them as they may not be able to find them otherwise.

This will help ensure that your videos are getting the traffic they deserve from Google video search.
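For reference, a hedged sketch of what a single video sitemap entry can look like (all URLs and text below are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>http://www.example.com/videos/widgets.html</loc>
        <video:video>
          <video:thumbnail_loc>http://www.example.com/thumbs/widgets.jpg</video:thumbnail_loc>
          <video:title>Making blue widgets</video:title>
          <video:description>A short clip showing how our widgets are made.</video:description>
          <video:content_loc>http://www.example.com/video/widgets.mp4</video:content_loc>
        </video:video>
      </url>
    </urlset>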

Conclusion

If you aren’t making use of Google Webmaster Tools, you should start doing so now. The reason it’s worth using is because it will help guide you and tell you what to do if you want to improve your search engine traffic.