Search Engine Optimization

Search Engine Optimization Miami

[tabbed tabs="Keyword Selection | Keyword Density | Redirects | Google Guidelines | Cloaking | Hidden"]

Keyword Selection for SEO

I called in the expert for this article. Mikkel has spent many an hour learning and discovering how the search engine world works. Here are his thoughts on choosing keywords for your home page…

Most Internet users rely on search engines like Google™, Yahoo!®, MSN® Search, AOL® Search, etc. when searching for content on the Internet. This means that if you want to claim your share of the booming Internet market, you must ensure that the search engines are directing visitors to your Web site. In other words, you need to “search-engine optimize” your custom home page. The search engine optimization process begins with choosing the keywords for your site.

Researching Your Keywords

Keywords are words and phrases that describe the content of a Web page. The keywords must match the search terms Internet users are typing into search engines when looking for the type of content offered by your Web site. The proper keywords for your website should reflect the products and services you are selling and your particular niche market.

So how do you determine the right keywords for your website? The first thing you need to do is to put yourself in the Internet users’ place and imagine which search terms they would be using when looking for a deal on the products you are offering. Those words/phrases might include such terms as “domains,” “domain registration,” “Web hosting,” “e-mail accounts,” etc., depending on your product selection. And don’t forget keywords specific for your niche market.

Another good place to start your keyword research is to take a look at your competition. Simply look at Web sites of your main competitors on the Internet and take a close look at the keywords used in Title tags, Meta tags, page copy, etc. To learn who your main competitors are, simply go to one of the top search engines and type in the keywords you are considering for your website. The top results returned by the search engine will reveal the nature of your competition.

The following general rules should be kept in mind as you define the keywords for your website.

Be specific — generic keywords, such as "domains," "hosting" or "e-mail," could be featured on literally millions of Web sites. More specific words and phrases, like "Web Design in Miami," will significantly narrow the number of Web pages that rely on the same keywords. Generic keywords may be necessary, but should be combined with more specific phrases.

Be intuitive — the keywords must reflect words and phrases that potential customers would actually use when searching for Web content. Highly technical terms generally should be replaced with commonly-used ones.

Consider popularity — a keyword’s popularity is an indication of how many search engine users have searched for it. In theory, the more popular your keywords are, the more likely you are to attract customers. However, popular keywords generally mean tougher competition for the top search engine rankings. Your keywords, therefore, should hit the right balance between being popular enough to generate significant traffic to your site and being rare enough to actually allow you a chance of securing your site’s top ranking for them. A number of tools, including Wordtracker (http://www.wordtracker.com/) and the free Yahoo! Search Marketing Keyword Selector (http://inventory.overture.com/d/searchinventory/suggestion/), allow you to easily research the popularity of potential keywords for your site.

Narrowing Down Your List

Write down the words and phrases that come to mind as you perform your research. Then, having determined their popularity, boil the list down to two or three keywords that you can optimize your content for. These keywords should appear frequently in your Web pages’ Title and Meta tags, page copy, image Alt attributes, heading text, link anchor text, etc. A longer list of keywords can be compiled and used in the Keywords Meta tag on your Web pages.
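As a sketch of the placements listed above, here is what those spots might look like in a page's markup. The keyword phrase, file names, and copy are hypothetical examples, not taken from any real site:

```html
<head>
  <!-- Title tag: lead with the primary keyword phrase -->
  <title>Web Design in Miami | Custom Sites for Local Businesses</title>
  <!-- Meta description and Keywords Meta tag -->
  <meta name="description" content="Custom web design in Miami for small businesses.">
  <meta name="keywords" content="web design, Miami web design, custom websites">
</head>
<body>
  <!-- Heading text -->
  <h1>Web Design in Miami</h1>
  <!-- Link anchor text and page copy -->
  <p>Browse our <a href="/portfolio/">Miami web design portfolio</a> for recent work.</p>
  <!-- Image Alt attribute -->
  <img src="storefront.jpg" alt="Miami web design studio storefront">
</body>
```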

The keyword selection process for your Web site is a crucial step in your quest for prosperity. Choosing and using the optimal keywords may well define the difference between turning your Web site into a virtual hotspot and allowing it to remain in relative obscurity.

Keyword Density for SEO

Keyword Density is the percentage of keywords compared to the rest of the text in your Web page. This metric is important because it gives you a tool to compare a Web page or site to that of similar pages with higher rankings. You can see how your use of keywords compares to theirs.

If you see a comparable keyword density between sites, chances are their higher ranking is due to inbound links and/or inherited page rank. If their keyword density is higher than yours, there’s a good chance you can increase your ranking with some careful keyword placement for organic search optimization.
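The percentage can be computed directly. Below is a minimal sketch in Python, assuming "density" means occurrences of the keyword phrase (counting each word of the phrase) divided by the page's total word count; the sample page copy is invented for illustration:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the page's words accounted for by the keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count every position where the full phrase appears in sequence.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words) if words else 0.0

# Hypothetical page copy for a Miami web design site.
page_copy = ("Web design in Miami. Our Miami web design team builds "
             "search-engine-friendly sites for the Miami market.")

print(round(keyword_density(page_copy, "web design"), 1))  # 22.2
print(round(keyword_density(page_copy, "miami"), 1))       # 16.7
```

Run your important pages through a calculation like this, then run the top-ranked pages for the same keywords, and compare the numbers.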

Use one of the following free keyword cloud tools to check your site’s actual keyword density:

Visual Results

These tools use font size and bold face to give you a quick visual sense of which words have the highest density on the pages you check (without the actual statistics).

Keyword Cloud from webconfs: http://www.webconfs.com/keyword-density-checker.php

Keyword Density Checker from iwebtool: http://www.iwebtool.com/keyword_density

Statistical Results

These tools give you the actual number of occurrences, density percentages, and other key metrics by keyword.

Keyword Density Tool from SEO Tools™: http://www.seochat.com/seo-tools/keyword-density/

Keyword Density from Link Vendor: http://www.linkvendor.com/seo-tools/keyword-density.html

Compare Two Sites

Use this tool to see how your site compares to another.

Keyword Density Analyzer from KeywordDensity.com: http://www.keyworddensity.com/

What do you do with the information once you have it?

* Get a good understanding of what keywords are strongest on your site. You might be surprised that your organic content is pointing search engines in a direction you didn’t expect. Evaluate that and decide if you need to modify your copy and tags/titles, or if it’s something you should use to your advantage and build upon.

* Take a new look at the competition. Try searches on the top keywords in the major search engines and see who shows up. Take a look at their sites and see if they truly are competition. Review their offers to see how they compare to yours. Also be sure to read the search engine results for their site compared to yours…whose is more compelling? How can you change your copy to better grab the potential customer's attention if your results were to come up side-by-side?

* Compare keyword density with your known top competitors. This might give you an idea who’s more likely to come out on top in the major search engines (of course, keyword density is only one factor in the mix – don’t forget that incoming links, overall relevancy, etc., also determine page rank). Use the results to help prioritize the copy you need to tweak in the future, or maybe set your goals to build some new content pages that will make you a stronger competitor in those areas.

* Target online directories where you can submit your site and increase incoming links. To ensure you get approved for inclusion, have a blurb that clearly ties your site to that directory.

302 Redirects

The "302" refers to the HTTP status codes that are returned to your browser when you request a page. For example, a 404 page is called a "404" because web servers return a status code of 404 to indicate that a requested page wasn't found. The difference between a 301 and a 302 is that a 301 status code means that a page has permanently moved to a new location, while a 302 status code means that a page has temporarily moved to a new location. For example, if you try to fetch the page http://cpccci.com/ and the web server says "That's a 301. The new location is http://www.cpccci.com/" then the web server is saying "That URL you requested? It's moved permanently to the new location I'm giving you."

An off-domain 302 is a redirect from one domain A.com to another domain B.com that is claimed to be temporary; that is, the web server on A.com could always change its mind and start showing content on A.com again. The vast majority of the time a search engine receives an off-domain 302 redirect, the right thing to do is to crawl/index/return the destination page (in the example we mentioned, it would be B.com). In fact, if a search engine did that 100% of the time, you would never have to worry about "hijacking" — that is, content from B.com returned with an A.com URL. Google is moving to a set of heuristics that return the destination page more than 99% of the time. Why not 100% of the time? Most search engines reserve the right to make exceptions when they think the source page will be better for users, though they do so only rarely.
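You can observe these status codes yourself. The sketch below uses only the Python standard library, spinning up a throwaway local server whose paths are hypothetical; a real check would simply point at your own URLs:

```python
# Sketch: observe 301 (permanent) vs. 302 (temporary) redirect codes.
import http.server
import threading
import urllib.error
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/moved-permanently":
            self.send_response(301)                 # "moved for good"
            self.send_header("Location", "/new-home")
        elif self.path == "/moved-temporarily":
            self.send_response(302)                 # "moved for now"
            self.send_header("Location", "/new-home")
        else:
            self.send_response(200)
            self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, *args):                   # keep the demo quiet
        pass

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None                                 # refuse to follow redirects

def status_of(url):
    """Return the raw status code of a URL without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url).status
    except urllib.error.HTTPError as err:           # 3xx surfaces as an "error"
        return err.code

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_port

perm = status_of(base + "/moved-permanently")
temp = status_of(base + "/moved-temporarily")
home = status_of(base + "/")
server.shutdown()
print(perm, temp, home)   # 301 302 200
```

The distinction matters for SEO because a 301 tells search engines to index (and credit) the destination URL, while a 302 invites them to keep the original URL in the index.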

Google Webmaster Guidelines


Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the “Quality Guidelines,” which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise penalized. If a site has been penalized, it may no longer show up in results on Google.com or on any of Google’s partner sites.

* Design, content, and technical guidelines
* Quality guidelines

When your site is ready:

* Have other relevant sites link to yours.
* Submit it to Google at http://www.google.com/addurl.html.
* Submit a Sitemap as part of our Google Webmaster Tools. Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages.
* Make sure all the sites that should know about your pages are aware your site is online.
* Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites.

Design and content guidelines

* Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
* Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
* Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
* Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
* Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images.
* Make sure that your TITLE tags and ALT attributes are descriptive and accurate.
* Check for broken links and correct HTML.
* If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
* Keep the links on a given page to a reasonable number (fewer than 100).

Technical guidelines

* Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
* Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
* Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
* Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://www.robotstxt.org/wc/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.
* If your company buys a content management system, make sure that the system can export your content so that search engine spiders can crawl your site.
* Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
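As a sketch of the robots.txt advice above (the directory and bot names below are hypothetical examples, not paths from any real site):

```text
# Allow all crawlers, but keep them out of auto-generated
# search-results pages and a temporary directory.
User-agent: *
Disallow: /search-results/
Disallow: /tmp/

# Example of blocking one specific crawler entirely.
User-agent: BadBot
Disallow: /
```

The file lives at the root of your domain (e.g. http://example.com/robots.txt); an empty Disallow line would permit everything.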

Quality guidelines

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It’s not safe to assume that just because a specific deceptive technique isn’t included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

If you believe that another site is abusing Google’s quality guidelines, please report that site at https://www.google.com/webmasters/tools/spamreport. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts.

Quality guidelines – basic principles

* Make pages primarily for users, not for search engines. Don't deceive your users or present different content to search engines than you display to users, which is commonly referred to as "cloaking."
* Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"
* Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.
* Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.

Quality guidelines – specific guidelines

* Avoid hidden text or hidden links.
* Don't use cloaking or sneaky redirects.
* Don't send automated queries to Google.
* Don't load pages with irrelevant keywords.
* Don't create multiple pages, subdomains, or domains with substantially duplicate content.
* Don't create pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware.
* Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.
* If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

If you determine that your site doesn’t meet these guidelines, you can modify your site so that it does and then submit your site for reconsideration.

From Google Webmaster Guidelines: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769

Cloaking, sneaky Javascript redirects, and doorway pages

Cloaking

Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.

Some examples of cloaking include:

* Serving a page of HTML text to search engines, while showing a page of images or Flash to users.
* Serving different content to search engines than to users.

If your site contains elements that aren’t crawlable by search engines (such as rich media files other than Flash, JavaScript, or images), you shouldn’t provide cloaked content to search engines. Rather, you should consider visitors to your site who are unable to view these elements as well. For instance:

* Provide alt text that describes images for visitors with screen readers or images turned off in their browsers.
* Provide the textual contents of JavaScript in a noscript tag.

Ensure that you provide the same content in both elements (for instance, provide the same text in the JavaScript as in the noscript tag). Including substantially different content in the alternate element may cause Google to take action on the site.
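For instance, the "same content in both elements" rule might look like this in markup (the sale copy is a made-up example):

```html
<script type="text/javascript">
  // Content written into the page for visitors with JavaScript enabled.
  document.write("Spring sale: all design packages 20% off through March 31.");
</script>
<noscript>
  <!-- The identical content, visible to search engines and no-JS visitors. -->
  Spring sale: all design packages 20% off through March 31.
</noscript>
```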

Sneaky JavaScript redirects

When Googlebot indexes a page containing JavaScript, it will index that page but it cannot follow or index any links hidden in the JavaScript itself. Use of JavaScript is an entirely legitimate web practice. However, use of JavaScript with the intent to deceive search engines is not. For instance, placing different text in JavaScript than in a noscript tag violates our webmaster guidelines because it displays different content for users (who see the JavaScript-based text) than for search engines (which see the noscript-based text). Along those lines, it violates the webmaster guidelines to embed a link in JavaScript that redirects the user to a different page with the intent to show the user a different page than the search engine sees. When a redirect link is embedded in JavaScript, the search engine indexes the original page rather than following the link, whereas users are taken to the redirect target. Like cloaking, this practice is deceptive because it displays different content to users and to Googlebot, and can take a visitor somewhere other than where they intended to go.

Note that placement of links within JavaScript is alone not deceptive. When examining JavaScript on your site to ensure your site adheres to our guidelines, consider the intent.

Keep in mind that since search engines generally can’t access the contents of JavaScript, legitimate links within JavaScript will likely be inaccessible to them (as well as to visitors without Javascript-enabled browsers). You might instead keep links outside of JavaScript or replicate them in a noscript tag.

Doorway pages

Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination.

Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users, and are in violation of our webmaster guidelines.

Google's aim is to give our users the most valuable and relevant search results. Therefore, we frown on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the ones they selected, and that provide content solely for the benefit of search engines. Google may take action on doorway sites and other sites making use of these deceptive practices, including removing these sites from the Google index.

If your site has been removed from our search results, review our webmaster guidelines for more information. Once you’ve made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.

If you’d like to discuss this with Google, or have ideas for how we can better communicate with you about it, please post in our Webmaster Help Group. From Google Webmaster Guidelines: http://www.google.com/support/webmasters/bin/answer.py?answer=66355

Hidden text and links

Hiding text or links in your content can cause your site to be perceived as untrustworthy since it presents information to search engines differently than to visitors. Text (such as excessive keywords) can be hidden in several ways, including:

* Using white text on a white background
* Including text behind an image
* Using CSS to hide text
* Setting the font size to 0

Hidden links are links that are intended to be crawled by Googlebot, but are unreadable to humans because:

* The link consists of hidden text (for example, the text color and background color are identical).
* CSS has been used to make tiny hyperlinks, as little as one pixel high.
* The link is hidden in a small character – for example, a hyphen in the middle of a paragraph.

If your site is perceived to contain hidden text and links that are deceptive in intent, your site may be removed from the Google index, and will not appear in search results pages. When evaluating your site to see if it includes hidden text or links, look for anything that’s not easily viewable by visitors of your site. Are any text or links there solely for search engines rather than visitors?

If you're using text to try to describe something search engines can't access – for example, JavaScript, images, or Flash files – remember that many human visitors using screen readers, mobile browsers, browsers without plug-ins, and slow connections will not be able to view that content either. Using descriptive text for these items will improve the accessibility of your site. You can test accessibility by turning off JavaScript, Flash, and images in your browser, or by using a text-only browser such as Lynx. Some tips on making your site accessible include:

* Images: Use the alt attribute to provide descriptive text. In addition, we recommend using a human-readable caption and descriptive text around the image.
* JavaScript: Place the same content from the JavaScript in a noscript tag. If you use this method, ensure the contents are exactly the same as what is contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser.
* Videos: Include descriptive text about the video in HTML. You might also consider providing transcripts.

If you do find hidden text or links on your site, either remove them or, if they are relevant for your site’s visitors, make them easily viewable. If your site has been removed from our search results, review our webmaster guidelines for more information. Once you’ve made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.

If you’d like to discuss this with Google, or have ideas for how we can better communicate with you about it, please post in our webmaster discussion forum. From Google Webmaster Guidelines: http://www.google.com/support/webmasters/bin/answer.py?answer=66353

[/tabbed]
