Saturday, June 18, 2011

SEO Tutorial


Create unique, accurate page titles

Indicate page titles by using title tags

A title tag tells both users and search engines what the topic of a particular page is. The <title> tag should be placed within the <head> tag of the HTML document. Ideally, you should create a unique title for each page on your site.
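As a minimal sketch of where the tag goes (the site name and title wording below are invented for illustration, using a hypothetical baseball card shop):

    <html>
    <head>
    <title>Brandon's Baseball Cards - Buy Cards, Baseball News, Card Prices</title>
    </head>
    <body>
    ...page content...
    </body>
    </html>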

Page title contents are displayed in search results

If your document appears in a search results page, the contents of the title tag will usually appear in the first line of the results (if you're unfamiliar with the different parts of a Google search result, you might want to check out the anatomy of a search result video by Google engineer Matt Cutts, and this helpful diagram of a Google search results page). Words in the title are bolded if they appear in the user's search query. This can help users recognize if the page is likely to be relevant to their search.

The title for your homepage can list the name of your website/business and could include other bits of important information like the physical location of the business or maybe a few of its main focuses or offerings.

Accurately describe the page's content. Avoid:
choosing a title that has no relation to the content on the page
using default or vague titles like "Untitled" or "New Page 1"

Create unique title tags for each page. Each of your pages should ideally have a unique title tag, which helps Google know how the page is distinct from the others on your site. Avoid:
using a single title tag across all of your site's pages or a large group of pages

Use brief, but descriptive titles. Titles can be both short and informative. If the title is too long, Google will show only a portion of it in the search result. Avoid:
using extremely lengthy titles that are unhelpful to users
stuffing unneeded keywords in your title tags

Make use of the “description” meta tag

Summaries can be defined for each page

A page’s description meta tag gives Google and other search engines a summary of what the page is about. Whereas a page’s title may be a few words or a phrase, a page’s description meta tag might be a sentence or two or a short paragraph. Google Webmaster Tools provides a handy content analysis section that’ll tell you about any description meta tags that are either too short, long, or duplicated too many times (the same information is also shown for <title> tags). Like the <title> tag, the description meta tag is placed within the <head> tag of your HTML document.
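For example, a sketch of how such a tag might sit alongside the title in the <head> (the description text itself is made up for illustration):

    <head>
    <title>Brandon's Baseball Cards - Buy Cards, Baseball News, Card Prices</title>
    <meta name="description" content="Brandon's Baseball Cards provides a large selection of vintage and modern baseball cards for sale. We also offer daily baseball news and events.">
    </head>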

What are the merits of description meta tags?

Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Alternatively, Google might use your site's description in the Open Directory Project if your site is listed there (learn how to prevent search engines from displaying ODP data). Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has an informative post on improving snippets with better description meta tags. Words in the snippet are bolded when they appear in the user's query. This gives the user clues about whether the content on the page matches what he or she is looking for. The same applies to deeper pages: a snippet can also be drawn from the description meta tag of a deeper page (which ideally has its own unique description meta tag) containing an article.

Accurately summarize the page’s content

Avoid:
writing a description meta tag that has no relation to the content on the page
using generic descriptions like "This is a web page" or "Page about baseball cards"
filling the description with only keywords
copying and pasting the entire content of the document into the description meta tag
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result.

Use unique descriptions for each page. Avoid:
using a single description meta tag across all of your site's pages or a large group of pages
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (e.g. searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn’t feasible. In this case, you could automatically generate description meta tags based on each page’s content.

Improve the structure of your URLs

Simple-to-understand URLs will convey content information easily

Creating descriptive categories and filenames for the documents on your website not only helps you keep your site better organized, but can also lead to better crawling of your documents by search engines. It also creates easier, "friendlier" URLs for those who want to link to your content. Visitors may be intimidated by extremely long and cryptic URLs that contain few recognizable words. Such URLs can be confusing and unfriendly: users would have a hard time reciting them from memory or creating a link to them. Users may also believe that a portion of the URL is unnecessary, especially if it shows many unrecognizable parameters; they might leave off a part, breaking the link.
Some users might link to your page using the URL of that page as the anchor text. If your URL contains relevant words, this provides users and search engines with more information about the page than an ID or oddly named parameter would.

URLs are displayed in search results

Lastly, remember that the URL to a document is displayed as part of a search result in Google, below the document's title and snippet. Like the title and snippet, words in the URL on the search result appear in bold if they appear in the user's query. A URL on our domain for a page containing an article about the rarest baseball cards, built from recognizable words, might appeal to a search user more than an ID number like "www.brandonsbaseballcards.com/article/102125/" would.
Google is good at crawling all types of URL structures, even if they're quite complex, but spending the time to make your URLs as simple as possible for both users and search engines can help. Some webmasters try to achieve this by rewriting their dynamic URLs to static ones; while Google is fine with this, we'd like to note that this is an advanced procedure and, if done incorrectly, could cause crawling issues with your site. To learn even more about good URL structure, we recommend this Webmaster Help Center page on creating Google-friendly URLs.

Use words in URLs

Avoid:
using lengthy URLs with unnecessary parameters and session IDs
choosing generic page names like "page1.html"
using excessive keywords like "baseball-cards-baseball-cards-baseballcards.htm"
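To make the contrast concrete, here are two hypothetical URLs for the same article; both paths are invented purely for illustration:

    Hard to read:  http://www.brandonsbaseballcards.com/folder1/22447478/x2/14032015.html
    Friendlier:    http://www.brandonsbaseballcards.com/articles/ten-rarest-baseball-cards.html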

Create a simple directory structure

Use a directory structure that organizes your content well and makes it easy for visitors to know where they are on your site. Try using your directory structure to indicate the type of content found at that URL.
URLs with words that are relevant to your site's content and structure are friendlier for visitors navigating your site. Visitors remember them better and might be more willing to link to them.

Provide one version of a URL to reach a document. To prevent users from linking to one version of a URL and others linking to a different version (this could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from non-preferred URLs to the dominant URL is a good solution. You may also use the rel="canonical" link element if you cannot redirect.
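As a sketch, the rel="canonical" link element goes in the <head> of the duplicate (non-preferred) page and points at the URL you want to consolidate on; the URL below is illustrative:

    <head>
    <link rel="canonical" href="http://www.brandonsbaseballcards.com/articles/ten-rarest-baseball-cards.html">
    </head>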

Make your site easier to navigate

Navigation is very important for search engines
The navigation of a website is important in helping visitors quickly find the content they want. It can also help search engines understand what content the webmaster thinks is important. Although Google’s search results are provided at a page level, Google also likes to have a sense of what role a page plays in the bigger picture of the site.
Plan out your navigation based on your homepage
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (e.g. root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
Ensure more convenience for users by using 'breadcrumb lists'
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, left-most link and list the more specific sections out to the right.
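A breadcrumb is usually just a row of plain links; a minimal sketch might look like the snippet below (the page names are invented for illustration):

    <!-- most general page first, more specific sections to the right -->
    <a href="/">Brandon's Baseball Cards</a> &gt; <a href="/articles/">Articles</a> &gt; Ten Rarest Baseball Cards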

Allow for the possibility of a part of the URL being removed

Consider what happens when a user removes part of your URL - Some users might navigate your site in odd ways, and you should anticipate this. For example, instead of using the breadcrumb links on the page, a user might drop off a part of the URL in the hopes of finding more general content. He or she might be visiting http://www.brandonsbaseballcards.com/news/2010/upcoming-baseball-card-shows.htm, but then enter http://www.brandonsbaseballcards.com/news/2010/ into the browser's address bar, believing that this will show all news from 2010. Is your site prepared to show content in this situation, or will it give the user a 404 ("page not found") error? What about moving up a directory level to http://www.brandonsbaseballcards.com/news/?

Prepare two sitemaps: one for users, one for search engines

A site map (lower-case) is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it's mainly aimed at human visitors.

An XML Sitemap (upper-case) file, which you can submit through Google's Webmaster Tools, makes it easier for Google to discover the pages on your site. Using a Sitemap file is also one way (though not guaranteed) to tell Google which version of a URL you'd prefer as the canonical one (e.g. http://brandonsbaseballcards.com/ or http://www.brandonsbaseballcards.com/; more on what's a preferred domain). Google helped create the open source Sitemap Generator Script to help you create a Sitemap file for your site. To learn more about Sitemaps, the Webmaster Help Center provides a useful guide to Sitemap files.
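For reference, a minimal XML Sitemap listing just two pages might look like the sketch below; the URLs are illustrative, and the Sitemaps protocol also defines optional tags such as lastmod and priority:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.brandonsbaseballcards.com/</loc>
      </url>
      <url>
        <loc>http://www.brandonsbaseballcards.com/articles/ten-rarest-baseball-cards.html</loc>
      </url>
    </urlset>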

Offer quality content and services

Interesting sites will increase their recognition on their own
Creating compelling and useful content will likely influence your website more than any of the other factors discussed here. Users know good content when they see it and will likely want to direct other users to it. This could be through blog posts, social media services, email, forums, or other means. Organic or word-of-mouth buzz is what helps build your site's reputation with both users and Google, and it rarely comes without quality content.

Anticipate differences in users' understanding of your topic and offer unique, exclusive content

Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time baseball fan might search for [nlcs], an acronym for the National League Championship Series, while a new fan might use a more general query like [baseball playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google AdWords provides a handy Keyword Tool that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Webmaster Tools provides you with the top search queries your site appears for and the ones that led the most users to your site.

Consider creating a new, useful service that no other site offers. You could also write an original piece of research, break an exciting news story, or leverage your unique user base. Other sites may lack the resources or expertise to do these things.

Write easy-to-read text

Users enjoy content that is well written and easy to follow.

Stay organized around the topic

It's always beneficial to organize your content so that visitors have a good sense of where one content topic begins and another ends. Breaking your content up into logical chunks or divisions helps users find the content they want faster.

Create fresh, unique content

New content will not only keep your existing visitor base coming back, but also bring in new visitors.

Create content primarily for your users, not search engines

Designing your site around your visitors' needs while making sure your site is easily accessible to search engines usually produces positive results.

Write better anchor text

Suitable anchor text makes it easy to convey the content being linked to

Anchor text is the clickable text that users will see as a result of a link, and is placed within the anchor tag <a href="…"></a>. This text tells users and Google something about the page you're linking to. Links on your page may be internal, pointing to other pages on your site, or external, leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
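For instance, compare a vague link with a descriptive one; the target URL and wording below are made up for illustration:

    <!-- vague: says nothing about the destination -->
    <a href="http://www.brandonsbaseballcards.com/articles/ten-rarest-baseball-cards.html">Click here</a>
    <!-- descriptive: tells users and Google what the linked page is about -->
    <a href="http://www.brandonsbaseballcards.com/articles/ten-rarest-baseball-cards.html">ten rarest baseball cards</a>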

Optimize your use of images

Image-related information can be provided by using the "alt" attribute

Images may seem like a straightforward component of your site, but you can optimize your use of them. All images can have a distinct filename and "alt" attribute, both of which you should take advantage of. The "alt" attribute allows you to specify alternative text for the image if it cannot be displayed for some reason.
Why use this attribute? If a user is viewing your site on a browser that doesn't support images, or is using alternative technologies, such as a screen reader, the contents of the alt attribute provide information about the picture. Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
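A quick sketch of both cases, a plain image and an image used as a link; the filenames and alt text are invented for illustration:

    <!-- descriptive filename and alt text -->
    <img src="/images/vintage-baseball-card.jpg" alt="Vintage baseball card in mint condition">
    <!-- image used as a link: the alt text acts much like anchor text -->
    <a href="/articles/ten-rarest-baseball-cards.html"><img src="/images/rarest-cards-thumb.jpg" alt="Ten rarest baseball cards"></a>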

Store files in specialized directories and manage them using common file formats

Instead of having image files spread out in numerous directories and subdirectories across your domain, consider consolidating your images into a single directory (e.g. brandonsbaseballcards.com/images/). This simplifies the path to your images.

Use commonly supported filetypes - Most browsers support JPEG, GIF, PNG, and BMP image formats. It's also a good idea to have the extension of your filename match the filetype.

Use heading tags appropriately

Use heading tags to emphasize important text
Heading tags (not to be confused with the <head> HTML tag or HTTP headers) are used to present structure on the page to users. There are six sizes of heading tags, beginning with <h1>, the most important, and ending with <h6>, the least important. Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
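As a rough sketch, a page's headings might be nested like this (the section names are invented for illustration):

    <h1>Brandon's Baseball Cards</h1>
      <h2>News</h2>
        <h3>Upcoming card shows</h3>
      <h2>Articles</h2>
        <h3>Ten rarest baseball cards</h3>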

Make effective use of robots.txt

Restrict crawling where it's not needed with robots.txt.
A "robots.txt" file tells search engines whether they can access and therefore crawl parts of your site. This file, which must be named "robots.txt", is placed in the root directory of your site.
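A minimal robots.txt might look like the sketch below, which asks all crawlers to skip two hypothetical directories; the directory names are only an example:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /search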
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Webmaster Tools has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.

There are a handful of other ways to prevent content from appearing in search results, such as adding "NOINDEX" to your robots meta tag, using .htaccess to password-protect directories, and using Google Webmaster Tools to remove content that has already been crawled. Google engineer Matt Cutts walks through the caveats of each URL blocking method in a helpful video.
You shouldn’t feel comfortable using robots.txt to block sensitive or confidential material. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don’t acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don’t want seen. Encrypting the content or password-protecting it with .htaccess are more secure alternatives.

Be aware of rel=”nofollow” for links

Combat comment spam with “nofollow”.
Setting the value of the "rel" attribute of a link to "nofollow" will tell Google that certain links on your site shouldn't be followed or pass your page's reputation to the pages linked to. You nofollow a link by adding rel="nofollow" inside the link's anchor tag.
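A sketch of what that looks like in practice (the URL and anchor text are placeholders):

    <a href="http://www.example.com/" rel="nofollow">Anchor text here</a>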
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you’re not giving your page’s hard-earned reputation to a spammy site.
Automatically add "nofollow" to comment columns and message boards. Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guestbooks, forums, shoutboards, referrer listings, etc. If you're willing to vouch for links added by third parties (e.g. if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, like using CAPTCHAs and turning on comment moderation.
Another use of nofollow is when you're writing content and wish to reference a website, but don't want to pass your reputation on to it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.

Lastly, if you're interested in nofollowing all of the links on a page, you can use "nofollow" in your robots meta tag, which is placed inside the <head> tag of that page's HTML. The Webmaster Central Blog provides a helpful post on using the robots meta tag. This method is written as <meta name="robots" content="nofollow">.

Notify Google of mobile sites

Configure mobile sites so that they can be indexed accurately.
It seems the world is going mobile, with many people using mobile phones on a daily basis, and a large user base searching on Google's mobile search page. However, as a webmaster, running a mobile site and tapping into the mobile search audience isn't easy. Mobile sites not only use a different format from normal desktop sites, but the management methods and expertise required are also quite different. This results in a variety of new challenges. While many mobile sites were designed with mobile viewing in mind, they weren't designed to be search friendly.

Verify that your mobile site is indexed by Google

If your web site doesn't show up in the results of a Google mobile search even when using the site: operator, it may be that your site has one or both of the following issues:

1. Googlebot may not be able to find your site. Googlebot must crawl your site before it can be included in our search index. If you just created the site, we may not yet be aware of it. If that's the case, create a Mobile Sitemap and submit it to Google to inform us of the site's existence. A Mobile Sitemap can be submitted using Google Webmaster Tools, just like a standard Sitemap.
2. Googlebot may not be able to access your site. Some mobile sites refuse access to anything but mobile phones, making it impossible for Googlebot to access the site, and therefore making the site unsearchable. Our crawler for mobile sites is "Googlebot-Mobile". If you'd like your site crawled, please allow any User-agent including "Googlebot-Mobile" to access your site. You should also be aware that Google may change its User-agent information at any time without notice, so we don't recommend checking whether the User-agent exactly matches "Googlebot-Mobile" (the current User-agent). Instead, check whether the User-agent header contains the string "Googlebot-Mobile". You can also use DNS Lookups to verify Googlebot.

Promote your website in the right ways

About increasing backlinks with the intention of increasing the value of the site
While most of the links to your site will be gained gradually, as people discover your content through search or other ways and link to it, Google understands that you'd like to let others know about the hard work you've put into your content. Effectively promoting your new content will lead to faster discovery by those who are interested in the same subject. As with most points covered in this document, taking these recommendations to an extreme could actually harm the reputation of your site.
Master making announcements via blogs and being recognized online. A blog post on your own site letting your visitor base know that you added something new is a great way to get the word out about new content or services. Other webmasters who follow your site or RSS feed could pick the story up as well.

Putting effort into the offline promotion of your company or site can also be rewarding. For example, if you have a business site, make sure its URL is listed on your business cards, letterhead, posters, etc. You could also send out recurring newsletters to clients through the mail letting them know about new content on the company's website.

If you run a local business, adding its information to Google Places will help you reach customers on Google Maps and web search. The Webmaster Help Center has more tips on promoting your local business.

Make use of free webmaster tools

Make Googlebot crawling smoother by using Webmaster Tools.
Major search engines, including Google, provide free tools for webmasters. Google’s Webmaster Tools help webmasters better control how Google interacts with their websites and get useful information from Google about their site. Using Webmaster Tools won’t help your site get preferential treatment; however, it can help you identify issues that, if addressed, can help your site perform better in search results. With the service, webmasters can:
see which parts of a site Googlebot had problems crawling
notify us of an XML Sitemap file
analyze and generate robots.txt files
remove URLs already crawled by Googlebot
specify your preferred domain
identify issues with title and description meta tags
understand the top searches used to reach a site
get a glimpse at how Googlebot sees pages
remove unwanted sitelinks that Google may use in results
receive notification of quality guideline violations and request a site reconsideration

High-level analysis is possible via Google Analytics and Website Optimizer.

If you've improved the crawling and indexing of your site using Google Webmaster Tools or other services, you're probably curious about the traffic coming to your site. Web analytics programs like Google Analytics are a valuable source of insight for this. You can use these to:
get insight into how users reach and behave on your site
discover the most popular content on your site
measure the impact of optimizations you make to your site - e.g. did changing those title and description meta tags improve traffic from search engines?

Yahoo! (Yahoo! Site Explorer) and Microsoft (Bing Webmaster Tools) also offer free tools for webmasters. For advanced users, the information an analytics package provides, combined with data from your server log files, can provide even more comprehensive information about how visitors are interacting with your documents (such as additional keywords that searchers might use to find your site). Lastly, Google offers another tool called Google Website Optimizer that allows you to run experiments to find what on-page changes will produce the best conversion rates with visitors. This, in combination with Google Analytics and Google Webmaster Tools (see our video on using the "Google Trifecta"), is a powerful way to begin improving your site.

Google Webmaster Help Forum

Google Webmaster Central Blog
http://googlewebmastercentral.blogspot.com – Frequent posts by Googlers on how to improve your website.

Google Webmaster Help Center
http://www.google.com/support/webmasters/ – Filled with in-depth documentation on webmaster-related issues.

Google Webmaster Tools
https://www.google.com/webmasters/tools/ – Optimize how Google interacts with your website.

Google Webmaster Guidelines
http://www.google.com/webmasters/guidelines.html – Design, content, technical, and quality guidelines from Google.

Google Analytics
http://www.google.com/analytics/ – Find the source of your visitors, what they're viewing, and benchmark changes.

Google Website Optimizer
http://www.google.com/websiteoptimizer/ – Run experiments on your pages to see what will work and what won't.

Hiring an SEO
http://www.google.com/support/webmasters/bin/answer.py?answer= 5 91 – If you don't want to go it alone, these tips should help you choose an SEO company.


SEO Tutorial - Search Engine Optimization Starter Guide.
