Tuesday, May 10, 2011

SEO: Understanding the Simulation

To understand the underlying principles of SEO, we need to look at what the internet was like and how websites became popular before Google.
When the first webpages were created, there weren't any search engines. The only way your site would be found was by telling people directly. You would build your website and tell your friends and coworkers the URL; they would visit the site, and if they liked it, they would tell their friends and coworkers. As the number of websites grew, people figured out it would be good to have some indexes to help sort through all these websites, and so the first directories were started. These directories were found in the same way as any other website: by word of mouth. Adding your site to a directory made it a little easier for a website to become popular, so webmasters started submitting their sites to the directories. As the directories became bigger, some people figured out ways to make them give more relevant results, and the idea of search engines was born. By 1995 Yahoo had figured out how to give somewhat relevant results and was on its way to becoming the most used search engine of that era.
The first search engines weren't all that different from directories in how sites were indexed. Mostly just basic algorithms were used to index the sites by category, but within their categories they were to a large extent still listed in the order they were found. As the number of sites grew this became less effective, and so the idea of SEO came into being. The original SEO was simply on-page optimization: using keywords to let the search engine (SE) know the site's categories, and the more the keywords were used, the higher the site would place in the SERPs. The number of keywords and the number of pages on the website were the largest factors. The true popularity of a site had very little, if any, effect on its placement in the SERPs of the early search engines.
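To make that keyword-counting idea concrete, here's a minimal, purely illustrative sketch in Python. The page texts and URLs are made up, and no early engine actually ran code like this; it just shows how ranking on raw keyword frequency rewards repetition rather than popularity.

```python
# Purely illustrative: early engines ranked roughly on how often the query
# terms appeared on a page, not on how popular the site actually was.
def keyword_score(page_text, query_terms):
    words = page_text.lower().split()
    return sum(words.count(term.lower()) for term in query_terms)

# Hypothetical pages -- the page that repeats the keywords most wins.
pages = {
    "joes-really-cool-website.example": "cool html tricks, cool content, and more cool html",
    "quiet-hobby-page.example": "a small page with a few html tricks",
}

query = ["cool", "html"]
ranked = sorted(pages, key=lambda url: keyword_score(pages[url], query), reverse=True)
print(ranked)
```

This is exactly why keyword stuffing worked: the score goes up every time a term is repeated, no matter whether anyone actually visits or links to the page.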
Let's use Joe's Really Cool Website as an example.
Here's how a truly popular site would typically come about.
Joe's webmaster would make a site based on an interesting topic. He would try out some really cool HTML tricks and provide some interesting content.
Joe would then email a few of his friends telling them to visit his website. He would also submit his site to directories and the early search engines.
Those friends would visit the site, decide they liked it and then tell their friends about it as well.
Some of the people finding the site would also have webpages, so they would put a link to Joe's site on their own page. Joe would also look for other sites to trade links with.
People visiting these other websites, the directories, and the search engines would follow these links to find Joe's website.
As more people visited the site, more emails would be sent out, and more websites would be including links to Joe's page.
Soon Joe's Really Cool Website was being passed around and linked to so much that just about everyone knew about it.
The reason for all the backlinking was to get real visitors to click on those links; it wasn't done to get placement in the SE. A listing in Yahoo was seen as a very important backlink, but it was still just another backlink among many to get traffic. Even with all the backlinks, Joe's Really Cool Website might not have been the top listing in the search engines. The site's placement in the search engine wasn't related to its popularity. The sites listed in the SERPs didn't correspond to which sites were becoming the most popular, but having a listing in the SERPs would give a site more traffic, which could start the popularity process for a site that had good placement. This is why keyword stuffing became so popular: at the time it was the most effective SEO technique available.
As the number of sites grew, this method of indexing was becoming unwieldy. The downside, of course, was that webmasters started stuffing keywords unrelated to the topic of the site just to get spam listings in the search engines, and the listings returned for searches were becoming less and less relevant. A better way was needed for the search engines to determine which sites were becoming popular on their own because people felt they were the most relevant. Using the number of backlinks to a site to judge which sites were more popular was the next step in providing better listings in the SERPs, but since a webmaster could just go out and trade backlinks on a lot of sites themselves, it wasn't the best indication of which sites were standing on their own.
To find a solution, a Stanford student named Larry Page developed an algorithm to determine not just which sites had the most backlinks, but which ones were linking together in the kind of natural growth that happens independently of the search engines. The PageRank algorithm was designed to find the footprint of a naturally occurring popular website, on the theory that those sites must be the most relevant sites that people want to find. Using the PageRank algorithm, Larry Page and Sergey Brin were able to design an improved search engine, which they named Google. As it turned out, they were right: the results Google provided with this popularity-based ranking system were found to be much more relevant to the people using the search engine.
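For a concrete sense of the idea, here is a minimal PageRank-style sketch in Python. It is illustrative only: the link graph is made up, and Google's production algorithm is far more involved, but it shows the core mechanism of each page passing its score along its outgoing links until the scores settle.

```python
# Simplified PageRank power iteration (illustrative only, not Google's code).
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}        # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            targets = outlinks if outlinks else pages  # dangling pages share evenly
            share = rank[page] / len(targets)
            for target in targets:
                new_rank[target] += damping * share    # pass score along each link
        rank = new_rank
    return rank

# Hypothetical link graph: Joe's site collects links from independent sites
# that also link to each other -- the natural footprint the algorithm rewards.
links = {
    "joes-site": ["friend-a"],
    "friend-a": ["joes-site", "friend-b"],
    "friend-b": ["joes-site", "friend-a"],
    "directory": ["joes-site", "friend-a", "friend-b"],
}
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

In this toy graph, Joe's site ends up on top not because it repeats keywords, but because several independently connected pages choose to link to it.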
This new approach totally changed the way webmasters needed to approach SEO. Keyword stuffing was no longer effective; Google was basing its index on popularity, not just keywords. SEO experts soon figured out that only enough keywords to let the SE know the site's topic were needed, and any more than that was getting the site penalized. How popular the site was had become more important, so simulating popularity became the focus of SEO.
To get a better understanding of the background behind Google's methods, you can get a good start by searching for PageRank, Larry Page, Sergey Brin, and Google on Wikipedia.
The goal of modern SEO is to create the appearance that your site is a naturally occurring popular website. To do that you need backlinks. Not just a lot of backlinks, but the right natural structure of backlinks. Just having a lot of backlinks only lets the SE know that the webmaster is good at getting a lot of backlinks. The PR algorithms have changed quite a bit since Google's startup, and PageRank has even become less of a factor than it originally was, but the SE is still looking for the same thing: a site in which the backlinks appear to have come from interested visitors linking to the site on their own. The search engine is looking for that popular, viral type of site like Joe's Really Cool Website.
There should be a very large number of random personal-type sites linking to each other and also linking to your site, as well as a few others. These sites should also link to some directory sites in which your site is listed. There should be many people bookmarking your site. Your site would have been noticed by some authority sites by then, so there should be many links on those sites too. Through the whole process there should be a lot of independent islands of small-scale interlinking spread over a very large scale and going very deep. This is the way the backlinking to a popular site naturally occurs, so this is the type of backlinking you are trying to simulate.
As you're building your backlinks, you should remember that this naturally occurring popularity is what you're trying to simulate. Using whitehat and blackhat methods, you can artificially create the same type of backlinks that would have occurred naturally if you really had a great site that real people find relevant. The better you're able to simulate this occurrence, the larger the chances of success. If your site actually does have good content that will interest real visitors, the simulation is likely to just jump-start the natural process. With a good, solid SEO strategy, you can skip months or even years of natural growth. In any event, if you make the simulation good enough, the Google algorithms will believe it and give you top position in the SERPs. This alone can be enough to drive massive amounts of traffic to your site.
Understanding the simulation you're trying to artificially create is the key to making your site successful.
