Dec 20, 2009

Iframes, Please Make Way for SEO Poisoning

If a hacker managed to break into your blog or website, what could they possibly do? They could insert malicious iframes or JavaScript code into your Web pages, or even attempt to steal some data. But most likely they would "search engine optimize" your website. Can this be true? Let me explain.

Search engine optimization (SEO) is a collection of techniques used to achieve higher search rankings for a given website. "Black hat SEO" is the method of using unethical SEO techniques in order to obtain a higher search ranking. These techniques include things like keyword stuffing, cloaking, and link farming, which are used to "game" the search engine algorithms.

So what does a hacker gain from all this? Why would a hacker help you achieve a higher search engine ranking? Quite the contrary; he is helping himself.

What the hacker actually does is add numerous additional Web pages to your website. Let’s call each of these additional pages "fake" Web pages. Each fake page is based on a popular search topic and has content related to that topic. Most often the content is stolen from legitimate sites and feeds. The hacker also uses topic-related keywords in the URLs of these fake pages. Each of these fake Web pages is added without the website owner’s knowledge or consent. For example, if you own the site example.com, the hacker might add virtual pages such as:

• example.com/?ohio-voting-results
• example.com/?atlanta-mayoral-race-results
• example.com/?dancing-with-the-stars
• example.com/?nicole-narain

All of these fake pages would then redirect to content stolen from some reputable site related to the keyword.


Now, if a legitimate user were to search for one of these keywords, he or she would encounter a reference to this fake Web page in the search engine results. The keywords in the URL, the keywords in the title, and the relevant content would all cause some search engine algorithms to place this Web page high in the results. In other words, the fake page has "gamed" the search engine algorithm into believing that it has content relevant to what is being searched for.


But what does the hacker gain from getting a legitimate user to visit this fake Web page? After all, the visitor would simply be reading relevant information based on what was searched for. Well, not really. The hacker changes the Web server's configuration so that it recognizes when a user reaches the fake page by following a link from a search engine results page; that user is then redirected to a fake antivirus or misleading application Web page, which is different from what the search engine spider actually sees. This is known as cloaking.


Cloaking is a black hat SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. Search engine crawlers spider through links in order to find and index Web pages. So when the search engine spider visits this page, it is presented with relevant information related to the search topic. In fact, the relevant keywords in the URL, title, and content usually give the fake Web page a higher ranking in the search engine results. However, when a user visits the fake page from a search engine results page, he or she is redirected to a fake scan Web page.

There are many different ways to achieve cloaking. One popular method is to look at the User-Agent string in the HTTP request. Search engine crawlers use specific strings in the user-agent field of the HTTP header. Using this, the Web server can serve a different page to the crawler. The referrer field can be used to ascertain if the user is coming from a search engine result page, and in that case redirect them to a fake scan website serving misleading applications.
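To make that concrete, here is a minimal sketch of the kind of server-side check being described, written as PHP an attacker might drop into a compromised page. The file names, redirect URL, and the exact crawler and referrer strings are illustrative assumptions, not code taken from a real attack kit:

    <?php
    // Rough check: does the visitor look like a search engine crawler?
    function looks_like_crawler($userAgent) {
        foreach (array('Googlebot', 'Slurp', 'msnbot', 'bingbot') as $sig) {
            if (stripos($userAgent, $sig) !== false) {
                return true;
            }
        }
        return false;
    }

    // Did the visitor arrive by clicking a search engine result?
    function came_from_search($referrer) {
        return (bool) preg_match('/(google|yahoo|bing|live)\./i', $referrer);
    }

    $userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    $referrer  = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

    if (looks_like_crawler($userAgent)) {
        include 'fake-topic-page.html';      // keyword-rich content, so the page ranks
    } elseif (came_from_search($referrer)) {
        header('Location: http://fake-av-scanner.example/scan.php');  // real user from a SERP: send to the scam page
        exit;
    } else {
        include 'original-page.html';        // anyone else (e.g., the site owner) sees nothing unusual
    }
    ?>

One practical consequence: to spot this kind of compromise on your own site, fetch your pages both with a normal browser and with a crawler-style User-Agent (or a search engine referrer) and compare the responses.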


So how do these websites get picked up by the search engine crawlers? There are several ways to do this. One can manually submit a website to search engine crawlers. Also, crawlers can spider through links, so a reference link on one website can get your website crawled and indexed. Additionally, many crawlers use sitemaps provided by the website owners in order to find all the pages on the website. Search engine advertising programs can also be used for getting indexed.

Search engines attribute importance to links to a website that exist on other websites. These links are called "backlinks" and indicate the popularity of a website. Backlinks will also get a website crawled and indexed as well as increase the page rank.

A "link farm" is a group of websites that have links to other websites in that group. Apart from other factors that contribute to a website gaining a good ranking, backlinks play a vital role—and a link farm provides a website with many back links. In fact, there are services such as Link Farm Evolution and SENuke available online, which allow for the creation of thousands of backlinks for a website.

Recently we came across a link farm for a group of fake pages that were serving up misleading applications. The link farm allows these fake pages to be indexed and therefore increases their page rankings.

[Screenshot: the link farm in question]

Shown above is a snapshot of the link farm in question. You can see that each link is related to a recent real-world event, and each link ends with a keyword related to that event. All of these pages were created on legitimate websites that were hacked to serve these virtual pages.

Although the tag used to hide this block would prevent the links from being visible to the user, they are still visible to search engines. In addition, a normal user may never see these links, even in the HTML source, because this code is only served if the request was made by a search engine crawler. Shown below is a screenshot of another similar campaign:

[Screenshot: another similar link farm campaign]

So, you have now read how black hat SEO techniques are effectively employed to redirect victims from search engine results to fake antivirus websites. The following diagram gives a good visual summary of the typical sequence of events:

[Diagram: how a hacked site, a link farm, and cloaking combine in an SEO poisoning attack]


1. The hacker hacks a site to serve legitimate content to a search spider and malicious content to users.
2. The hacker creates a link farm to the hacked site to be picked up by a search spider.
3. A search spider crawls the link farm.
4. The hacked site appears in the search results.
5. A user clicks on the search result link leading to the hacked site, which redirects to the malicious page.

Canonical URL Tag

The announcement from Yahoo!, Live & Google that they will be supporting a new "canonical url tag" to help webmasters and site owners eliminate self-created duplicate content in the index is, in my opinion, the biggest change to SEO best practices since the emergence of Sitemaps. It's rare that we cover search engine announcements or "news items" here on SEOmoz, as this blog is devoted more towards tactics than breaking headlines, but this certainly demands attention and requires quick education.

To help new and experienced SEOs better understand this tag, I've created the following Q+A (please feel free to print, email & share with developers, webmasters and others who need to quickly ramp up on this issue):

How Does it Operate?

The tag is part of the HTML head of a web page, the same section where you'd find the title element and meta description tag. In fact, this tag isn't new; like nofollow, it simply uses a new rel parameter. For example:

<link rel="canonical" href="http://www.seomoz.org/blog" />

This would tell Yahoo!, Live & Google that the page in question should be treated as though it were a copy of the URL www.seomoz.org/blog and that all of the link & content metrics the engines apply should technically flow back to that URL.


The Canonical URL tag is similar in many ways to a 301 redirect from an SEO perspective. In essence, you're telling the engines that multiple pages should be considered as one (which a 301 does), without actually redirecting visitors to the new URL (often saving your dev staff considerable heartache). There are some differences, though (a quick side-by-side sketch follows the list below):

  • Whereas a 301 redirect re-points all traffic (bots and human visitors), the Canonical URL tag is just for engines, meaning you can still separately track visitors to the unique URL versions.
  • A 301 is a much stronger signal that multiple pages have a single, canonical source. While the engines are certainly planning to support this new tag and trust the intent of site owners, there will be limitations. Content analysis and other algorithmic metrics will be applied to ensure that a site owner hasn't mistakenly or manipulatively applied the tag, and we certainly expect to see mistaken use of the tag, resulting in the engines maintaining those separate URLs in their indices (meaning site owners would experience the same problems noted below).
  • 301s carry cross-domain functionality, meaning you can redirect a page at domain1.com to domain2.com and carry over those search engine metrics. This is NOT THE CASE with the Canonical URL tag, which operates exclusively on a single root domain (it will carry over across subfolders and subdomains).
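As a rough side-by-side of the first difference above, compare the two approaches. Both snippets are sketches with made-up URLs, shown as two alternative files:

    <?php
    // Alternative 1: a 301 redirect - every visitor, bot or human,
    // is sent to the canonical URL and never sees the duplicate page.
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.example.com/blog');
    exit;
    ?>

    <!-- Alternative 2: the canonical tag - humans keep browsing this duplicate
         URL (so it can still be tracked separately), while engines are asked to
         credit http://www.example.com/blog with the link and content metrics. -->
    <head>
      <link rel="canonical" href="http://www.example.com/blog" />
    </head>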

Over time, I expect we'll see more differences, but since this tag is so new, it will be several months before SEOs have amassed good evidence about how this tag's application operates. Previous rollouts like nofollow, sitemaps and webmaster tools platforms have all had modifications in their implementation after launch, and there's no reason to doubt that this will, too.

How, When & Where Should SEOs Use This Tag?

In the past, many sites have encountered issues with multiple versions of the same content on different URLs. This creates three big problems:

  1. Search engines don't know which version(s) to include/exclude from their indices
  2. Search engines don't know whether to direct the link metrics (trust, authority, anchor text, link juice, etc.) to one page, or keep it separated between multiple versions
  3. Search engines don't know which version(s) to rank for query results

When this happens, site owners suffer rankings and traffic losses and engines suffer lowered relevancy. Thus, in order to fix these problems, we, as SEOs and webmasters, can start applying the new Canonical URL tag whenever any of the following scenarios arise:

• Canonical URL Issues for Categories

• Canonical URLs for Print Versions

• Canonical URLs for Session IDs

While these examples above represent some common applications, there are certainly others, and in many cases, they'll be very unique to each site. Talk with your internal SEOs or SEO consultants to help determine whether, how & where to apply this tag.
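For instance, in the session ID scenario, a product page reached through a tracking URL could point engines at its clean version (a hypothetical example, not taken from the announcements):

    <!-- Served at http://www.example.com/product.php?id=123&sessionid=a1b2c3 -->
    <head>
      <title>Example Product</title>
      <!-- Ask engines to consolidate metrics on the session-free URL -->
      <link rel="canonical" href="http://www.example.com/product.php?id=123" />
    </head>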

What Information Have the Engines Provided About the Canonical URL Tag?

Quite a bit, actually. Check out a few important quotes from Google:

Is rel="canonical" a hint or a directive?
It's a hint that we honor strongly. We'll take your preference into account, in conjunction with other signals, when calculating the most relevant page to display in search results.

Can I use a relative path to specify the canonical?
Yes, relative paths are recognized as expected with the <link> tag. Also, if you include a <base> link in your document, relative paths will resolve according to the base URL.

Is it okay if the canonical is not an exact duplicate of the content?
We allow slight differences, e.g., in the sort order of a table of products. We also recognize that we may crawl the canonical and the duplicate pages at different points in time, so we may occasionally see different versions of your content. All of that is okay with us.

What if the rel="canonical" returns a 404?
We'll continue to index your content and use a heuristic to find a canonical, but we recommend that you specify existent URLs as canonicals.

What if the rel="canonical" hasn't yet been indexed?
Like all public content on the web, we strive to discover and crawl a designated canonical URL quickly. As soon as we index it, we'll immediately reconsider the rel="canonical" hint.

Can rel="canonical" be a redirect?
Yes, you can specify a URL that redirects as a canonical URL. Google will then process the redirect as usual and try to index it.

What if I have contradictory rel="canonical" designations?
Our algorithm is lenient: We can follow canonical chains, but we strongly recommend that you update links to point to a single canonical page to ensure optimal canonicalization results.

from Yahoo!:

• The URL paths in the tag can be absolute or relative, though we recommend using absolute paths to avoid any chance of errors.

• A tag can only point to a canonical URL form within the same domain and not across domains. For example, a tag on http://test.example.com can point to a URL on http://www.example.com but not on http://yahoo.com or any other domain.

• The tag will be treated similarly to a 301 redirect, in terms of transferring link references and other effects to the canonical form of the page.

• We will use the tag information as provided, but we’ll also use algorithmic mechanisms to avoid situations where we think the tag was not used as intended. For example, if the canonical form is non-existent, returns an error or a 404, or if the content on the source and target was substantially distinct and unique, the canonical link may be considered erroneous and deferred.

• The tag is transitive. That is, if URL A marks B as canonical, and B marks C as canonical, we’ll treat C as canonical for both A and B, though we will break infinite chains and other issues.
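To picture that transitivity with hypothetical URLs on a single site:

    <!-- On http://www.example.com/page-a.html -->
    <link rel="canonical" href="http://www.example.com/page-b.html" />

    <!-- On http://www.example.com/page-b.html -->
    <link rel="canonical" href="http://www.example.com/page-c.html" />

By Yahoo!'s description, page-c.html would be treated as the canonical for both A and B, though, as Google's answer above suggests, it's better to point every duplicate directly at a single canonical page than to rely on chains.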

and from Live/MSN:

  • This tag will be interpreted as a hint by Live Search, not as a command. We'll evaluate this in the context of all the other information we know about the website and try and make the best determination of the canonical URL. This will help us handle any potential implementation errors or abuse of this tag.
  • You can use relative or absolute URLs in the “href” attribute of the link tag.
  • The page and the URL in the “href” attribute must be on the same domain. For example, if the page is found on “http://mysite.com/default.aspx”, and the ”href” attribute in the link tag points to “http://mysite2.com”, the tag will be invalid and ignored.
    • However, the “href” attribute can point to a different subdomain. For example, if the page is found on “http://mysite.com/default.aspx” and the “href” attribute in the link tag points to “http://www.mysite.com”, the tag will be considered valid.
  • Live Search expects to implement support for this feature sometime in the near future.

What Questions Still Linger?

A few things remain somewhat murky around the Canonical URL tag's features and results. These include:

  • The degree to which the tag will be trusted by the various engines - will it only work if the content is 100% duplicate 100% of the time? Is there some flexibility on the content differences? How much?
  • Will this pass 100% of the link juice from a given page to another? More or less than a 301 redirect does now? Note that Google's official representative from the web spam team, Matt Cutts, said today that it passes link juice akin to a 301 redirect but also noted (when SEOmoz's own Gillian Muessig asked specifically) that "it loses no more juice than a 301," which suggests that there is some fractional loss when either of these are applied.
  • The extent of the tag's application on non-English language versions of the engines. Will different levels of content/duplicate analysis and country/language-specific issues apply?
  • Will the engines all treat this in precisely the same fashion? This seems unlikely, as they'd need to share content/link analysis algorithms to do that. Expect anecdotal (and possibly statistical) data in the future suggesting that there are disparities in interpretation between the engines.
  • Yahoo! strongly recommends using absolute paths for this (and, although we've yet to implement it, SEOmoz does as well, based on potential pitfalls with relative URLs), but the other engines are more agnostic - we'll see what the standard recommendations become.
  • Yahoo! also mentions the properties are transitive (which is great news for anyone who's had to do multiple URL re-architectures over time), but it's not clear whether the other engines support this.
  • Live/MSN appears to have not yet implemented support for the tag, so we'll see when they formally begin adoption.
  • Are the engines OK with SEOs applying this for affiliate links to help re-route link juice? We'd heard at SMX East from a panel of engineers that using 301s for this was OK, so I'm assuming it is, but many SEOs are still skeptical as to whether the engines consider affiliate links as natural or not.

Sep 23, 2009

Traffic Generation Techniques

Everyone wants to know how to generate more website traffic; without it, you can't run a successful business. There are many techniques with which you can easily drive traffic to your website. I want to explain some high-impact traffic generation techniques so that you can implement them to minimize time spent and maximize results.

1- Social Bookmarking

2- Social Networking

-MySpace

-FaceBook

3- Web 2.0 Pages

4- Article Submission

5- Press Release

6- Forum Posting

7- Blog Commenting

8- Directory Submission

9- Classified Ads Posting

10- Regional Traffic

More….

The main goal of this traffic generation is to maximize your search engine rankings so that your website is displayed at the top of the search engines for your main targeted keyword.

Implement all the strategies mentioned above once you are done with the on-page optimization I have been discussing so far; at a minimum, make sure you have covered all the factors I have mentioned.

Jul 6, 2009

How to avoid duplicate pages and content on websites

Duplicate pages add no real content to the web and do nothing to improve the quality of search engine output. By offering the same document over and over, search engines would deliver a poor product with no variety. For this reason, duplicate pages are often filtered out. This is the exact reason Yahoo! dislikes many affiliate websites.

Duplicate content and pages are a major problem for search engine optimization. Even if search engines like Google do not always penalize you for duplicate content, they still have to choose one of the URLs or pages to show in search results and ignore all the other duplicates.

If the problem is related to www versus non-www pages, the URLs in the generated XML sitemap will be either with or without www, which may differ from the paths indexed by search engines such as Google.

To avoid duplicate content, use the canonical URL tag in WordPress. This is the definition given in Google's Webmaster Central Blog: a format that allows you to publicly specify your preferred version of a URL. If your site has identical or vastly similar content that's accessible through multiple URLs, this format provides you with more control over the URL returned in search results. It also helps to make sure that properties such as link popularity are consolidated to your preferred version.

It is very easy to specify the canonical URL of a page – all you have to do is add one line in the head section, as shown below.

The purpose of this tag is to prevent the problem of duplicate content.

The best thing about this tag is that all three major search engines (Google, Yahoo! and MS Live) support this initiative.
header.php

Go to the theme folder in WordPress (wp-content/themes/your-theme-name/) and open the file header.php in your favorite editor. Now find the closing </head> line and add the following code just before it:

    <link rel="canonical" href="<?php echo get_permalink(); ?>" />

Now the end of the head section should look something like this:

    <link rel="canonical" href="<?php echo get_permalink(); ?>" />
    </head>

That’s it - you are done! Save the file and upload it to your server.

This fix will add the canonical URL tag to all the posts and pages of your blog. If you want to see a blog that has implemented this fix, take a look at the source of this page.
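If you prefer to be a little more selective, here is a variant of the same idea (a sketch, assuming the standard WordPress template tags is_singular() and get_permalink()) that only prints the tag on single posts and pages and leaves archive and search pages alone:

    <?php if ( is_singular() ) : ?>
    <link rel="canonical" href="<?php echo get_permalink(); ?>" />
    <?php endif; ?>

It goes in header.php inside the head section, just like the line above.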

What is a canonical URL?

If you already know what a canonical URL is, you might be surprised at how few webmasters have ever heard of one. If you spend time in Google Groups you will find people scratching their heads and not getting a clear answer as to how to troubleshoot and resolve this troublesome issue in Google.


Jul 2, 2009

PR Update and SEO Updates

Search engines change their techniques day by day, so every new day brings SEO updates for webmasters. Search engines like Google revise their algorithms every couple of weeks, which is why a PR update sometimes brings unexpected and shocking results.
Almost everybody searches for information through a search engine, most probably Google, and obviously everyone wants to reach relevant data instead of wandering around different sites. Search engines help by sorting sites so that the most relevant ones come up in the top results. To bring their sites into those top results, webmasters use different techniques; doing this in a legitimate way, guided by SEO updates, is called white hat SEO.

Search engines use different algorithms in order to sort data in better and faster ways. The most successful and popular of these algorithms was developed at Stanford University by Larry Page and Sergey Brin and was named PageRank. The prototype search engine built on it was launched in 1998 under the name “Google”.

The PageRank algorithm is based on votes. For example, a link from site X to site Y is counted as a vote from site X for site Y. The site with more votes, that is, more backlinks, is considered more important and gets a higher PageRank. One interesting fact is that Google assesses the importance of a site through the sites from which it gets its backlinks. Every backlink is a vote, but different votes have different values: a backlink from a site that itself has a good PageRank and healthy backlinks will do more for your site's ranking at the next PR update.
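To make the voting idea concrete, here is a toy run of the published PageRank formula over a made-up three-site link graph, using the usual 0.85 damping factor. It is only an illustration of the original paper's iteration, not how Google computes PageRank today:

    <?php
    // Toy link graph: X links to Y and Z, Y links to Z, Z links to X.
    $links = array(
        'X' => array('Y', 'Z'),
        'Y' => array('Z'),
        'Z' => array('X'),
    );

    $d     = 0.85;                              // damping factor
    $pages = array_keys($links);
    $n     = count($pages);
    $pr    = array_fill_keys($pages, 1 / $n);   // start with equal rank

    // Repeat the "voting" for a fixed number of rounds; real implementations
    // iterate until the values stop changing.
    for ($i = 0; $i < 20; $i++) {
        $next = array_fill_keys($pages, (1 - $d) / $n);
        foreach ($links as $from => $targets) {
            $share = $pr[$from] / count($targets);   // each outlink carries an equal share of the vote
            foreach ($targets as $to) {
                $next[$to] += $d * $share;
            }
        }
        $pr = $next;
    }

    print_r($pr);   // Z ends up highest: it collects votes from both X and Y
    ?>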

So more and more backlinks are not the only way of optimizing your site for a high search engine ranking. If you want to see your site with a good PageRank after the next PR update, don't go for backlinks alone; go for quality backlinks. Besides backlinks, search engines consider many other factors when allocating PageRank, so webmasters should keep an eye on SEO updates in order to understand how search engine algorithms work. Sometimes sites with healthy backlinks fail to get a good PageRank during a PR update, and Google penalizes sites that acquire backlinks using black hat techniques. Also, if you host several sites in the same niche on the same hosting IP, it may affect the Google PageRank of your sites.

In short, search engine optimization is not rocket science; rather, it is a continuously changing field in which you can get guidance through SEO updates, or find your own way by following search engine guidelines. It needs something beyond expertise. People working in this widespread field of search engine optimization love to get useful SEO updates.

Jun 5, 2009

Microsoft's Bing goes live in preview (Update 2)

The buzz last week and building overnight (on Twitter) was that Microsoft's new search product, Bing, was going live early.


This morning, users of Live.com as well as MSN search got a little surprise ... two days early.

Ahead of an upcoming multimedia advertising blitz valued at 80-100 million dollars, Bing is now live in most time zones in 'preview mode'.

Bing Travel, born out of Microsoft's Farecast acquisition, is also now live, as are other new members of the 'Bing family', including Bing Cashback.

The Bing preview video (using Silverlight) had been up for the past three days. Search experts and those just wondering how Bing will affect the search landscape can now test the product without invitation and, no doubt, we'll see a few thousand articles posted throughout the day.

Bing is being promoted not as an 'Internet search engine' but as a decision-making product. Users with specific questions will be able, if all goes well for Microsoft, to type in a question or phrase and receive relevant links in return.

With a few memories of 'Ask Jeeves' or maybe Yahoo Answers ... Bing is here.

Update 1: The Bing Community is also now active and getting busy along with a forum for feedback and questions. A few blogs within the community: Bing Maps, Bing Travel, Bing Developer, and the new Webmaster Community in Bing.

May 29, 2009

Sender Time Zone : New in Gmail Labs

Gmail grows day by day, adding more features in Labs and promoting the ones that win users' hearts. On the night of April 8th, 2009, Gmail released a new Labs feature called "Sender Time Zone". It is a nice feature: using it, you can find out what time it is in your contact's country.

Don't get confused. Say you have friends or clients worldwide and you receive an important mail from one of them. You would like to discuss the issue with him or her immediately and get it resolved, but if it is early morning for your client or friend, it is bad form to disturb them in a deep sleep. With this feature, simply look at the sender's time zone (as shown in the image) and call if it's the right time, or else wait a few hours.


If all your friends are local, it won't be of much use to you, because your time zone and theirs are the same. Professionals and global consultants who have a worldwide network and interact with it often will find this feature very useful.

But there is one catch. If you are in India and your system is set to US time, it may report your time incorrectly, so it is better to set your system clock to your own time zone. Of course, most people set their PC's time zone correctly anyway; I just want to make you aware of this.

May 28, 2009

Google Flash SEO

In case you missed their recent announcement about Flash, Google and Adobe have teamed up on a new algorithm to index text content in Flash. As a result, Googlebot now indexes “textual content in SWF files of all kinds” and extracts URLs embedded in Flash. On July 1, 2008, Google rolled out another update designed to assist Googlebot in its ability to traverse simple JavaScript, like SWFObject. While the full impact is not yet known, these technologies will redefine how Flash sites are created, constructed, designed and, as a result, optimized.

Before discussing the results of my case studies and how to optimize an actual scenario Flash site, it’s important to understand some crucial background information.

Prior to the introduction of Google’s Flash algorithm, Googlebot crawled only (X)HTML architectures without executing JavaScript to access text content and/or URLs in Flash. These new capabilities raise a number of questions about how Google handles, crawls and indexes content in Flash. Does Google index all text content in Flash? Does Google associate text content in Flash with the correct parent URL? Does Googlebot crawl links containing “pound signs” in URLs? Can Flash files have PageRank? How does Google treat underlying interlinked (X)HTML structures of corresponding text content pages? What about “Progressive Enhancement” techniques? To answer these and other questions, I’ve been testing the effectiveness of “Google’s Flash algorithm” since its inception to find out what it means for current design practices.

Googlebot Flash Update Case Studies

Before looking at how to optimize an actual scenario site, it’s important to establish the new “laws of the land,” so to speak. For that reason, I’ve conducted a number of experiments with various sites but have included only a few case studies below. These case studies lay a foundation in terms of understanding how Googlebot now interacts with Flash since Google’s new algorithm for Flash was introduced. For demonstration purposes, I’ve used Google’s example from Google Webmaster Central Blog.

Google Flash Update Case Study #1:

Google Flash Content Association with Parent URLs

Experiment: To determine if Google associates text content embedded in Flash inside an (X)HTML page with the correct “parent” URL as a single entity.

Hypothesis: Google currently still does not associate text content in Flash with the correct parent URL or as a single entity.

Background: According to Google’s Official Webmaster Blog, “If the Flash file is embedded in HTML (as many of the Flash files we find are), its content is associated with the parent URL and indexed as single entity.” To support their claim, Google posted the following image: [Image: Google Flash SERP]

Procedure: To test their claim, I used Google’s example query [nasa deep impact animation].

Result:
[Image: Google SERP for the test query]

Conclusion:
Since the introduction of support for SWFObject in July, Google hasn’t associated text content in Flash with the correct parent URL or as a single entity. More often than not, either the Flash URL or both the Flash and parent URL are indexed.

*For test validity, I’ve monitored Google SERPs (search engine results pages) for the above query daily over a period of 60 days. I’ve also monitored Google’s SERPs for the following queries over the same period of time: [deep impact amy walsh] and [deep impact impact with comet Tempel 1].

Google Flash Update Case Study #2:

Google Flash File PageRank

Experiment: To determine if Flash files can accrue PageRank.

Hypothesis: Flash files can accrue PageRank.

Background: In a recent interview with Eric Enge, Maile Ohye mentioned that links in Flash function as regular links and, therefore, can pass PageRank. If links in Flash can pass PageRank, it seems they could also accrue PageRank.

Procedure: Again using Google’s example, I visited both the parent and child URLs and recorded their “Toolbar” PageRank.

Results:
[Image: Toolbar PageRank of the parent (X)HTML URL and the Flash file URL]

The (X)HTML parent URL page has a “Toolbar PageRank” of 7 while the Flash file URL (.swf) page has a “Toolbar PageRank” of 6.

Conclusion:
Flash files can accrue PageRank independent of their own parent URLs.

*Note: The illustration in this case study shows both the parent and child URLs indexed as unique individual entities in Google’s SERPs (search engine results pages). This further supports the findings in Case Study #1.

Google Flash Update Case Study #3:

Googlebot #anchor (fragment identifier) URL Extraction

Experiment: To determine how Googlebot handles URLs containing #anchors (fragment identifiers).

Hypothesis: Googlebot ignores #anchors (fragment identifiers) in URLs and, as a result, extracts only URLs preceding #anchors (fragment identifiers) in Flash embedded links.

Background: According to Google’s own JohnMu, “When we find URLs with an anchor attached like that (http://domain.com/page#anchor) we generally ignore the anchor part, since it is not relevant when fetching the contents of a URL.” While this convention is commonly used for playhead control in Flash sites, a URL with a fragment identifier refers to the same page, as defined by the W3C.

Procedure: To test the experiment, I used Google’s “inurl:” operator to search for instances where Google had indexed a URL containing a pound sign. The queries I used were [inurl:#] and [inurl:&#35;].

Result: No results found.

Conclusion:
Google doesn’t index URLs containing #anchors (fragment identifiers) in Flash per W3C Guidelines.

Google Flash Update Case Study #4:

Google Flash Text Translation

Experiment: To determine if Google can translate text content in Flash.

Hypothesis: Google can not translate text content in Flash.

Background: “Google’s mission is to organize the world’s information and make it universally accessible and useful.” To some, “universal accessibility” would imply translation.

Procedure: To test the experiment, I used Google’s translation tool to translate the case study example into French, Spanish, Chinese, Arabic and Russian.

Result: No results found.

Conclusion:
Currently, Google doesn’t seem to support translations of text content in Flash.

Googlebot Flash Interaction Scenario

Avenues for optimizing Flash differ, but the final destination remains the same in terms of organic search engine optimization. The scenario below reveals the basics of how Flash sites are typically optimized. In addition to the description, I included images to help further illustrate the dynamics involved.

[Diagram: a Flash file layered over an interlinked (X)HTML structure via SWFObject]

“SEO for Flash” is simple in theory; embed an entire “site” within a Flash file and layer that Flash file over an interlinked (X)HTML structure of corresponding content pages via JavaScript. Thanks to the JavaScript (SWFObject), users with Flash enabled see Flash, while users without Flash enabled (previously Googlebot) receive the underlying interlinked (X)HTML version of content pages. In order to control the Flash presentation for users with Flash, URLs with #anchors are embedded to create the illusion of “seamless transitions” between “virtual pages” within the Flash file.
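In markup, the layering usually looks something like the sketch below. The page and file names are hypothetical; the embed call follows the SWFObject 2.x dynamic-publishing pattern, where the div holds the plain (X)HTML content and swfobject.embedSWF() swaps in the Flash movie only when a capable plugin is present:

    <head>
      <script type="text/javascript" src="swfobject.js"></script>
      <script type="text/javascript">
        // Replace the div below with photos.swf for visitors with Flash 9 or newer.
        // Visitors (and, previously, Googlebot) without Flash keep the (X)HTML.
        swfobject.embedSWF("photos.swf", "content", "800", "600", "9.0.0");
      </script>
    </head>
    <body>
      <div id="content">
        <h1>Photo gallery</h1>
        <p>Plain (X)HTML version of the gallery, with real text and crawlable links.</p>
        <a href="contact.html">Contact</a>
      </div>
    </body>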

(A second iteration of this same technique draws the text content seen both in the Flash presentation and in the underlying (X)HTML from the server. However, Google does not treat external content resources loaded by a Flash file as part of that file: “If your Flash file loads an HTML file, an XML file, another SWF file, etc., Google will separately index that resource, but it will not yet be considered a part of the content in your Flash file.”)

While the scenario above seems fairly simple, understanding how each element is “digested” by Googlebot is a bit more complex. Here are a few issues to be aware of when optimizing Flash sites in light of the new Flash algorithm.

“Progressive Enhancement”

As illustrated by Case Study #1, Googlebot traverses simple JavaScript, like SWFObject and, as a result, it completely circumnavigates text content provided via “Progressive Enhancement” in most cases.

Flash in SERPs

As Case Study #1 and #2 illustrate, Google may not associate text content in Flash with the appropriate parent URL and/or as a single entity. This makes it possible for users without Flash-enabled browsers and/or devices (iPhone) to access Flash files directly from Google’s SERPs. This issue can result in a bad user experience.

Links

As illustrated by Case Study #3, Googlebot ignores pound signs (#anchors / fragment identifiers) in URLs. As a result, Googlebot interprets URLs containing pound signs as different URLs with different content than intended. (After this issue was pointed out to the creators of SWFAddress, Asual added the “Copy link to clipboard” option to the footer of pages in their SEO example.) This issue is further complicated by the introduction of Google’s new algorithm for Flash, the support for simple JavaScript and the possibility of Flash files being indexed in Google search engine results pages.

If a user posts a link to domain.com/photos.html#/contact.html, Googlebot will only see and index the content at domain.com/photos.html.

PageRank / Keyword Thinning

As illustrated by Case Study #2, Flash files can now accrue PageRank independent of their own parent URLs. As a result of this issue, PageRank thinning is likely to occur, because PageRank is divided between the parent URLs and actual URL. The percentage of thinning is likely to increase in proportion to the quantity of underlying (X)HTML pages containing “the flash file.”

Similar to PageRank, keyword relevancy may not be allocated to the intended URL. When keyword relevancy that is intended for one URL is instead allocated to another URL, thinning occurs.

Translation

As Case Study #4 illustrates, Google doesn’t seem to translate text content in Flash files, especially when text is supplied by a server or some other third party source.

Google SEO for Flash

Before talking about SEO for Flash, it’s important to define what that really means. Ask a Flash guy what SEO for Flash means and he might say something like “indexed content” or “indexation.” Ask an SEO guy what SEO for Flash means and he might say something like “ranking top 10 for text content in Flash.” As you can see, there are two different definitions and, therefore, two totally different expectations at work here. To an SEO, indexed content is the starting point where SEO for Flash begins. Simply “being indexed” is better defined as “Search Engine Friendly” (SEF). The introduction of Google’s Flash algorithm means most Flash sites are, by default, search engine friendly assuming text content resides within the Flash files and isn’t in a vector format.

Organic search engine optimization (SEO), unlike search engine friendliness (SEF), depends heavily on “meta data,” not just “meta tags.” Lots of information can be gleaned from (X)HTML by search engines via TITLE elements, ALT attributes, images, headers (H1, H2, H3, H4…), internal link structure, fonts, link popularity, relationships, site categories, subdivisions and sections. Engines rely on these elements for meta data as well as other informational “signals” used for rankings. “More data is good data” but only when that data is available in a digital format that’s digestible by search engines and can be translated to determine relevancy for textual queries. As Vanessa Fox recently pointed out, the lack of structural meta data in Flash is a real disadvantage.

With Google’s introduction of “Universal search” in May 2007, Flash sites were dealt a new obstacle. Universal blends results from verticals like news, images and YouTube into Google’s search results. The advent of “Universal search” is somewhat problematic for Flash sites, because Googlebot can’t extract images and/or video embedded in Flash for inclusion in Google’s “Universal” search results. To illustrate, currently Adobe.com doesn’t rank top ten in Google Images for [Adobe] or [Adobe logo].

Google Flash SEO Tips for 2009

Since “optimizing Flash” is difficult, it’s better to understand the fundamental limitations of the medium in terms of search, and to then concentrate on optimizing site design and architecture.

  • When it comes to text, “Don’t use it for something when there’s already a standard whose output can be easily parsed, easily processed, and whose openness makes its processing easier for browsers and searchbots.” - “Bergy,” Google Webmaster Central
  • Avoid text content and links in Flash
  • Don’t use text content in Flash supplied via third party file
  • If you must use text content in Flash, use sIFR
  • If you must use text content in Flash and sIFR isn’t an option, create individual Flash files laid over each corresponding (X)HTML page via SWFObject
  • “Instead of including everything in one flash file it may make sense to break the content into different flash files so you can create different HTML pages around the different ideas contained in it.” - Aaron Wall, SEOBook.com
  • When using SWFObject, consider using absolute URLs in underlying (X)HTML and Flash files
  • When using SWFObject, be sure to include “alternative” images for users without Flash
  • Avoid using text content in Flash for pages employing “seamless transitions” where URLs don’t change, or, instead, include “pound signs”
  • Provide links to important pages within Flash files using absolute URLs for users who arrive at the Flash file via Google search engine results pages
  • Consider how translation issues may impact content in Flash and investigate ways of working around these issues

May 20, 2009

Free Tips To Generate Traffic and Link Popularity

The main objective of any search engine optimization campaign is to generate traffic and attract more and more visitors using ethical strategies. Building link popularity is one of the most effective ways to make your website popular on the web and to get traffic to it. In this article you will find some important tips to generate traffic and obtain inbound links for your website:

1. Article Submission: Submitting articles to various article submission directories is an effective way to obtain inbound links and generate traffic to your website. If you are an expert in your field and can write good content, write articles related to your business and submit them to article directories. Always remember to include a resource box with a link back to your website. If your article is impressive and informative, it will be published and increase your traffic as well as your link popularity.

2. Directory Submission: Submitting your website to related web directories is an easy way to increase link popularity and generate traffic to your website.
By submitting your website to a directory you get an inbound link that increases your link popularity. Webmasters and many other people visit web directories for specific information, so it also helps to generate traffic to your website.

3. Blogging: Business blogs are also an effective way to generate traffic and obtain inbound links to your website. Create a blog related to your business and post to it regularly. Blogs are easily accessible and you can easily add or modify content. Always include a link back to your website in every post; this will help divert traffic to your website. If your blog posts are unique and informative, visitors may return to your blog from time to time for more information, which will increase traffic to your blog and your website and help increase your link popularity.

4. Online discussion forums: Online discussion forums are also an excellent way to generate targeted traffic and build link popularity for your website.

There are three ways to participate in online forums: you can post useful information related to the general topic of the forum by starting a new thread, you can post your own queries, and you can respond to requests for help. It is beneficial to take part in discussions in these forums, as they often provide a signature box where you can add a link to your website. This signature appears in each of your posts and, in this way, increases link popularity and targeted traffic to your website at no cost.

5. Press Releases: Press releases are not only a way of generating traffic and building link popularity for your site but also an effective way to get your business noticed in the media. Write well-crafted, unique content that will be useful to readers, keep it as short as feasible to maximize your efficiency and speed, and submit it to popular press release sites for publication. Do not forget to include a link back to your website in the press release.

6. Link Exchange requests: Sending link exchange requests to other relevant websites is another effective way to generate traffic and build link popularity. Focus only on sites relevant to your website, but not your direct competitors, and send them a request to exchange links with your site. Make sure you have something unique to attract other webmasters to link to your site.

Results come in the long term, so this requires a little patience. Avoid link farms and try to get links from authority websites and .edu and .gov sites, and also focus on getting links from older websites.

The tips above should be useful to you; follow them and the link popularity of and traffic to your site will surely increase day by day. If you have any suggestions or comments, feel free to share them with me.

Top SEO Tips To get higher Search engine Rankings

Getting listed in search engines is not a difficult task, but what matters is how well your site is listed. Your site can get higher rankings in all search engines if you follow ethical SEO strategies. In this article you will find some important SEO tips which can help you get higher search engine rankings:

1. Do Research on Keywords: Keywords are the weapon of your success in SEO, since search engines work from your keywords, so be careful to select the right ones. It is widely advisable to use tools such as Overture or Wordtracker to select the right keywords.

2. Work On Content: It is widely said that for websites “Content is King”. So concentrate on your web content and make sure that it provides some unique information to your visitors and is not duplicated from elsewhere. While writing content for your web pages, insert the vital keywords that relate to the products and services your company offers.

3. Proper Design: Design your site so that it clearly represents your business, and build easy navigation based on your keyword research. Your overall web design should be simple, attractive and user friendly. Make web pages that load quickly. Don’t build pages around large images; if images are necessary, use a thumbnail and add a description of the image.

4. Be Search Engine Spider Friendly: Search engines read a website's content through their spiders or crawlers. These spiders go through the site and categorize pages according to searchers' needs. Always make sure that you have HTML links in your main navigation on each page. Avoid JavaScript links and menus, Flash, and graphics-only navigation, as search engine spiders cannot read them.

5. Build incoming links to your site: Incoming links are very, very important for SEO and also possibly the hardest part of SEO to implement. One-way incoming links always carry more weight than reciprocal links. Avoid link farms and try to get links from authority websites and .edu and .gov sites, as they carry more weight than others and are preferred by search engines. Writing articles and submitting them to various article directories is also a good way to get free one-way incoming links, and it builds a good reputation at the same time.

6. Don’t try to fool the search engines: Search engine spiders are now cleverer than you think, and search engines update their algorithms from time to time, so don’t try to fool them or your site will be banned. Avoid cloaking, link farms, keyword stuffing, alt text spamming and other unethical methods.

Have a little patience and follow ethical SEO methods. It will surely give you the results you want.

May 4, 2009

What Is the Next Step After Optimizing Your Online Traffic?

As most webmasters already know, making money online is essentially about optimizing traffic and monetizing that traffic.  Optimizing traffic refers to two things:

  • getting targeted visitors to come to your site (meaning that if you sell an anti-virus software, you want people that come to your site to be looking for those) [QUALITY]; and

  • getting as many of those visitors as possible [QUANTITY].

Once you have optimized your site in terms of attracting good traffic, what should you do to make money out of it?  It depends on the purpose of your presence online (by the way, if you're not clear about what your purpose in making a website is, you had better stop and reflect on it before going on... otherwise it's like walking in the dark with no sense of direction).  People looking to make money online normally fall under three categories:

  • Webmasters looking to build good resource sites and make money out of the traffic that comes.  For instance, if you are passionate about chess, you may want to build a website discussing chess strategies.  Your main purpose is therefore to have an informational site and you will probably make money by "selling" your traffic to advertisers through various means such as pay-per-click, pay-per-impression, or pay-per-action.

  • If you are really passionate about chess, you can go a step further and build a blog that will discuss your perspective on chess.  This undertaking will involve more work since you will need to write entries on a regular basis if you want to build your readership.  Overall, however, the concept is the same as above in terms of monetizing your traffic.  You will probably make money by selling your traffic to advertisers.

  • Jumping from informational sites to sites that sell products/ services now...  If you are a manufacturer of board games, your main purpose is not to sell information per se, but to sell your product online.  If that is the case, you will monetize your traffic by selling your products and not by "selling" your traffic as is the case for informational sites.

All that being said, most webmasters learn through trials and errors.  As you will probably learn if you are serious about making money online, good webmasters and those that succeed all have at least two qualities in common.  They have lots of patience and perseverance!  So next time you feel like giving up, remember that!

Mar 24, 2009

3 Key tools of a web designer or web marketer

In order to design a successful website, you need to ensure that it is appealing to the eye of those who will be using it. The appeal of a website is critical due to the fact that you are not going to be able to make face to face contact with your actual website visitors; and the first impression they get from your website should be a memorable one. Since you cannot convince your ‘customer’ in buying your product or visiting your web page by the means of one-to-one conversation, you should aim to use your website in order to encourage sales through an impressive design. A web page or its design becomes utterly useless when web designers or web marketers do not think about their website in this fashion. Being focused on the appeal of your web page will ensure that it becomes as successful as you want it to be.

This article will list the 3 important key tools in creating an impressive website:

Tool # 1 – E-Mail

An analysis of the ways and reasons people use the internet has revealed some startling facts. Several studies have been carried out, and the results demonstrate that most people use the internet primarily to send and keep up with their email; they often describe surfing the World Wide Web as a secondary objective. From this, you can see the importance of having an email id that contains the name of your company or your web page’s domain name. Moreover, having an email account like this gives your customers or consumers the impression that your company is thorough and professional. If you use your website's domain in your email id, it conveys that you see your website as an important extension of your business.

Tool # 2 – Online Payment options

Nowadays most people prefer paying for products and services online rather than issuing a check or paying by cash. Hence, irrespective of whether you are a web page designer or a web marketer, it is highly important that you have a fair amount of knowledge on this subject. If you are a web designer, you should know how to incorporate online payment features into the websites that you design. If you are a web marketer, you should aim to have online payment features enabled on your website so that customers who use your services have the ability to pay you online.

Tool # 3 – Web page design

You need to be aware of how well and in what ways you can present your website in terms of design. Whether you are a web page designer or a web marketer, this aspect is actually the most important. You do not need to know every aspect of design, and it is not necessary to be the best web designer in the world; you just need to understand what type of design will appeal to your target customers. The key is to design a website that represents what you are trying to sell.

Although there are quite a number of tools available for a web designer and marketer, the above three are the most important and effective.

How to achieve keyword optimization?

Keyword optimization is critical when it comes to ensuring that you have a competitive website that generates a lot of traffic. Online businesses thrive in search engine listings if the website representing the business has good keyword optimization.

Keyword optimization is very important because of the following factors:

• The right keywords attract a higher number of potential customers, and many of them will actually end up buying your products or utilizing your services
• Websites that are built with good keywords earn the highest ranks in popular search engines
• By including good keywords in your website copy, you can potentially save the large amounts of money you would otherwise spend on other forms of advertising

It is a fact that web pages designed with specific keywords describing the products or services that you offer have the ability to rank higher in search engines. However, care must be taken when using keywords, as web pages stuffed with too many keywords tend to lose focus and hence earn very low ranks in search engine results.

The following points are just a few tips on finding the right keywords in order to attract your target audience:

• Try and imagine yourself in the shoes of your potential customer and think like them - Think about what you would like to find within your web site and what features or information would tempt you into purchasing your product or service.
• Try and include descriptive phrases as your keywords rather than simple ones. For instance if you are selling candles online, try and use ‘aromatic incense candles’ or ‘painted gel candles’, etc. Choose phrases that contain words that best describe your product or service.
• Make use of keyword selector tools such as the one provided by Overture at http://inventory.overture.com/d/searchinventory/suggestion/.
• Take help from your friends, relatives and colleagues in order to find out what words they will type in search engines to find the product or service that you sell.
• Use the Webster’s dictionary or thesaurus to find similar words or synonyms that describe your product or service and consider all three forms of that particular word.
• In case you already own a web site, try and incorporate a search box/form for visitors to make use of. This search box/form should find items or information within your own web site and will also give you a great insight as to the different words that people use while searching for items, products or services online.

You can also consult your website's server log to see what potential customers are searching for when they reach your site. Once you have collected a sufficient list of keywords and keyword phrases, you can then select the best among them and use those particular keywords for your website.
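As a small illustration of mining a server log for search phrases, the sketch below scans an Apache-style "combined" access log for search engine referrers and tallies the q (Google/Live) or p (Yahoo) query parameter. The log path is an assumption; adjust it for your own server:

    <?php
    // Rough sketch; assumes an Apache "combined" log format.
    $logFile = '/var/log/apache2/access.log';   // adjust for your host
    $counts  = array();

    foreach (file($logFile) as $line) {
        // The referrer is the second-to-last quoted field in the combined format.
        if (!preg_match('/"([^"]*)" "[^"]*"$/', trim($line), $m)) {
            continue;
        }
        $referrer = $m[1];
        if (!preg_match('/(google|yahoo|bing|live)\./i', $referrer)) {
            continue;
        }
        $query = parse_url($referrer, PHP_URL_QUERY);
        if (!$query) {
            continue;
        }
        parse_str($query, $params);
        $phrase = isset($params['q']) ? $params['q'] : (isset($params['p']) ? $params['p'] : '');
        if ($phrase !== '') {
            $phrase = strtolower($phrase);
            $counts[$phrase] = isset($counts[$phrase]) ? $counts[$phrase] + 1 : 1;
        }
    }

    arsort($counts);
    print_r(array_slice($counts, 0, 20, true));   // the 20 phrases searchers used most
    ?>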

Mar 4, 2009

Link Analysis Tools


Link analysis is a field which has lost a bit of its importance over the past couple of years. Google has clamped down on paid links, web directories, and article directories. At the same time more people are blogging online, and more webmasters have become aware of the field of public relations and the value of links. Together these make it much harder to catch up with a competitor if you are trying to duplicate their link building strategy.

To some degree link analysis has moved away from PageRank and anchor text toward analyzing news in the marketplace and understanding why stories are spreading (link analysis is evolving into media analysis and public relations). Here are 6 of my favorite link analysis tools:

  1. Yahoo! Site Explorer
  2. Google Blog Search
  3. Google Webmaster Central
  4. SEO Link Analysis
  5. Xenu Link Sleuth
  6. Advanced Link Manager

Yahoo! Site Explorer

Google and Microsoft generally show rather limited link data to the public. Yahoo! Site Explorer shows a much deeper and more representative sampling of inbound link data.

SEO Link Analysis

SEO Link Analysis is a free Firefox extension that adds link anchor text, PageRank score, and whether a link is nofollowed to Yahoo! Site Explorer results.


Google Blog Search

While Google tends to show only a small sample of backlinks, their Blog Search is much better at detecting new links from blogs.


Google Webmaster Central

If you verify that you own a website inside Google Webmaster Central, they will allow you to download a list of links pointing at your site. If you notice your rankings for a keyword are decent (but could be better), you could look through some of the people linking to your page and ask them to link to it using more specific anchor text.


Xenu Link Sleuth

Xenu Link Sleuth crawls websites looking for broken links. This helps you keep your website maximally clean and functional by finding any broken internal links and any references to documents that are no longer online.
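If you are curious what the core of such a check looks like, here is a toy version (the URLs are placeholders; Xenu itself also crawls pages to discover links, follows redirects, and handles timeouts):

    <?php
    // Minimal broken-link check: request each URL and look at the HTTP status line.
    $urls = array(
        'http://www.example.com/',
        'http://www.example.com/no-such-page.html',
    );

    foreach ($urls as $url) {
        $headers = @get_headers($url);   // e.g. "HTTP/1.1 200 OK" or "HTTP/1.1 404 Not Found"
        $status  = $headers ? $headers[0] : 'no response';
        $broken  = (!$headers || strpos($status, ' 404 ') !== false || strpos($status, ' 500 ') !== false);
        echo ($broken ? 'BROKEN  ' : 'OK      ') . $url . '  (' . $status . ")\n";
    }
    ?>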


Advanced Link Manager

Advanced Link Manager is a paid software package starting at $199 which allows you to automatically track new backlinks obtained by competitors. It also crawls pages to find anchor text, whether the link uses nofollow, and the PageRank score of the linking page.

We also created a free tool somewhat similar to Advanced Link Manager, named Backlink Analyzer. We have not updated our tool in a while though, and it is sorely in need of an update.

PPC Tools


As search ad networks have become less transparent, many PPC tools that did things like bid gap management have been rendered useless, while Google has built out a powerful AdWords API, an ad platform that allows you to bid per click or per conversion, and the Google AdWords Editor.

  1. Google AdWords Editor
  2. Google Website Optimizer
  3. Speed PPC
  4. PPC Tools suite
  5. PPC Analytics Tools

Google AdWords Editor

Google AdWords Editor is a free desktop tool advertisers can use to quickly edit their AdWords campaigns in bulk.

Google Website Optimizer

Google Website Optimizer is a free tool that makes it easy to test and improve your landing page and conversion process. Conversion Rate Experts offers a great overview article about how to take advantage of Website Optimizer and other conversion boosting tools.

Speed PPC

Speed PPC is a paid desktop software program which makes it easy to generate many cross referenced ad campaigns and landing pages for geo-targeted keyword combinations. At $497 Speed PPC is not cheap, but if it is beyond your means and you are a good programmer then Zipcode Guy offers a free database of U.S. cities and zip codes.

PPC Tools suite

My wife has built a free collection of useful web-based PPC tools including a keyword list generator, a keyword list cleaner, a typo generator, a keyword wrapper, an ad group generator, and an ROI calculator.

  • The keyword list generator can be good for creating a quick list of keywords to plug into a PPC account to see what variations get the most search volume, then trim back the fat after you collect some account feedback data.
  • The keyword list cleaner is good for taking a dirty keyword list and making it useful. For example, let's say you have a list of MP3 player related keywords; this tool allows you to quickly filter out any keywords related to car and automotive MP3 players (both ideas are sketched in the snippet after this list).
  • The ad group generator makes it easy to create ad groups for Google AdWords, Yahoo! Search Marketing, and Microsoft adCenter at the same time.
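For the curious, the snippet below sketches the core of both the generator and the cleaner: combine seed terms with modifiers, then drop anything matching themes you don't target. It is a simplified illustration, not the actual tools:

    <?php
    // Generator: combine modifiers with seed terms into a raw keyword list.
    $modifiers = array('cheap', 'best', 'buy', 'refurbished');
    $seeds     = array('mp3 player', 'car mp3 player');

    $keywords = array();
    foreach ($seeds as $seed) {
        $keywords[] = $seed;
        foreach ($modifiers as $mod) {
            $keywords[] = $mod . ' ' . $seed;
        }
    }

    // Cleaner: filter out unwanted themes, e.g. car/automotive MP3 players.
    $blocked = array('car', 'auto', 'automotive');
    $cleaned = array();
    foreach ($keywords as $kw) {
        $keep = true;
        foreach ($blocked as $bad) {
            if (stripos($kw, $bad) !== false) {
                $keep = false;
                break;
            }
        }
        if ($keep) {
            $cleaned[] = $kw;
        }
    }

    print_r($cleaned);   // only the non-automotive keyword variations remain
    ?>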

PPC Analytics Tools

  • Many big affiliate marketers create their own bid management software, but newer affiliates may find tracking tools like Optimize My Site and Tracking 202 useful.
  • Many enterprise level websites create custom bid management software for their company, but for enterprise level merchants who do not manage paid search in house, firms like Efficient Frontier can help with management. 

Jan 27, 2009

Keyword Research Tools

There are plenty of keyword tools available on the web, free and paid, to help in your research for developing the best keywords. I do recommend doing some homework on your words before any link building campaign. Your findings will dictate where to focus your link building efforts, and your on-page changes should be coordinated with this.

Keyword Discovery offers a paid search tool that is quite robust. They offer a free version which is limited in features but still quite usable and helpful. I have not used the paid version, but from the feature list it looks like you can really get down and dirty with your research, looking at keywords by niche and seeing the most popular searches in a particular industry.

http://www.keyworddiscovery.com/search.html

Wordtracker is the other popular paid keyword suggestion tool, and their free version is quite useful, though you can only enter one word instead of comparing multiple words. Wordtracker offers a 7-day free trial of their more robust product.

http://freekeywords.wordtracker.com/

SubmitExpress has a free tool called Keytracker that gauges the traffic of words by comparing data from those two popular keyword tools. You can enter more than one keyword, and it makes suggestions and shows the traffic for suggested or similar words as well.

http://keywords.submitexpress.com/keytracker.php

For keyword suggestions, Google offers its keyword tool for free, which will suggest keyword combinations based either on phrases you enter or on an analysis of an existing web page, and then shows the search traffic for those words. Another helpful free tool; recommended.

https://adwords.google.com/select/KeywordToolExternal

Microsoft’s adCenter offers the Keyword Forecast tool. It will compare traffic for a few keywords in graph form, and does not require you to type in letters verifying you are a human, saving time for a quick look. The data seems to be about a year old, though:

http://adlab.msn.com/Keyword-Forecast/Default.aspx

SEO Book offers a keyword research tool that compares data from many sources in one view. You can get lost in this one. You can only enter one word at a time, and it does not make the best suggestions compared to the others, but the information it correlates is very robust, linking to the different keyword tools and search engines used to populate the data and showing their suggestions and traffic numbers. Fun toy.

http://tools.seobook.com/keyword-tools/seobook/
