The Google Florida Update
I got started in the search field in 2003, and one of the things that helped get my name on the map was when I wrote about the November 14th Google Florida update in a cheeky article titled Google Sells Christmas [1]. To this day many are not certain exactly what Google changed back then, but the algorithm update seemed to hit a lot of low-level SEO techniques. Many pages that exhibited the following characteristics simply disappeared from the search results:
repetitive inbound anchor text with little diversity
heavy repetition of the keyword phrase in the page title and on the page
words in the keyword phrase appearing only in close proximity, with few occurrences of the keywords spread apart
a lack of related/supporting vocabulary in the page copy
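None of these signals were ever confirmed by Google, but they are all easy to measure. As a rough illustration only (my own simplification, not anything from Google), the Python sketch below scores a page on two of the characteristics above: how heavily a keyword phrase is repeated in a block of text such as the page title, and how little diversity there is in the inbound anchor text.

```python
from collections import Counter

def keyword_density(text, phrase):
    """Fraction of the words in `text` accounted for by repetitions of `phrase`."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - len(phrase_words) + 1)
        if words[i:i + len(phrase_words)] == phrase_words
    )
    return hits * len(phrase_words) / len(words)

def anchor_text_dominance(anchors):
    """Share of inbound links using the single most common anchor text.
    A value near 1.0 means almost every link uses identical anchor text."""
    if not anchors:
        return 0.0
    counts = Counter(a.lower().strip() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

# Hypothetical data, for illustration only.
title = "cheap widgets - cheap widgets store - buy cheap widgets"
anchors = ["cheap widgets"] * 48 + ["Acme Widget Co.", "widgets"]

print(keyword_density(title, "cheap widgets"))  # 0.6: heavy repetition in the title
print(anchor_text_dominance(anchors))           # 0.96: almost no anchor diversity
```

A page and link profile that score high on both measures look exactly like the kind of page the Florida update appeared to demote.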
The Google Florida update was the first update that made SEO complicated enough that most people could not figure out how to do it. Before that update, all you needed to do was buy and/or trade links with your target keyword in the link anchor text, and after enough repetition you stood a good chance of ranking.
Google Austin, Other Filters/Penalties/Updates/etc.
In the years since, Google has worked on creating other filters and penalties. At one point they tried so hard to stop artificial anchor text manipulation that they accidentally filtered out some brands for searches on their own official names [2].
The algorithms have grown so complex on some fronts that Google engineers do not even know about some of the filters/penalties/bugs (the difference between the three labels often being a matter of semantics). In December 2007, a lot of pages that had ranked #1 suddenly ended up ranking no better than position #6 [3] for their core target keyword (and many related keywords). When questioned about this, Matt Cutts denied the problem, acknowledging it only after he said it had already been fixed: [4]
When Barry asked me about "position 6" in late December, I said that I didn't know of anything that would cause that. But about a week or so after that, my attention was brought to something that could exhibit that behavior. We're in the process of changing the behavior; I think the change is live at some datacenters already and will be live at most data centers in the next few weeks.
Recent Structural Changes to the Search Results
Google helped change the structure of the web in January 2005 when they proposed the link rel=nofollow attribute [5]. Originally it was pitched as a way to stop blog comment spam, but by September of the same year Matt Cutts had changed his tune, to the point where you were considered a spammer if you were buying links without using rel=nofollow on them. Matt Cutts documented some of his repeated warnings on the Google Webmaster Central blog. [6]
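To make the mechanics concrete: nofollow is just a rel attribute on a link, e.g. <a href="http://example.com/" rel="nofollow">, which tells search engines not to pass link equity through that link. Below is a minimal Python sketch, using only the standard library and made-up example markup, that lists which links on a page carry the attribute and which do not.

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects each <a href=...> on a page along with whether it is nofollowed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel_values = (attrs.get("rel") or "").lower().split()
        self.links.append((attrs.get("href"), "nofollow" in rel_values))

# Hypothetical markup, for illustration only.
page = '''
<a href="http://example.com/editorial-pick">an editorially given link</a>
<a href="http://example.com/paid-review" rel="nofollow">a paid link</a>
'''

checker = NofollowChecker()
checker.feed(page)
for href, nofollowed in checker.links:
    print(href, "-> nofollow" if nofollowed else "-> followed")
```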
A bunch of allegedly "social" websites have adopted the nofollow attribute, [7] turning their users into digital sharecroppers [8] and eroding the link value [9] that came as part of being a well-known publisher who created link-worthy content.
In May 2007 Google rolled out universal search [10], which mixes select content from vertical search databases directly into the organic search results. This promoted:
Google News
YouTube videos (and other video content)
Google Product Search
Google Maps/Local
select other Google verticals, like Google Books
These three moves (rel=nofollow, social media, and universal search), coupled with over 10,000 remote quality raters [11], have made it much harder to manipulate the search results quickly and cheaply unless you have a legitimate, well-trusted site that many people vouch for. (And it does not hurt to have spent a couple of hours reading their 2003, 2005, and 2007 remote quality rater guidelines that were leaked into the SEO industry. [12])
Tracking Users Limits Need for "Random" Walk
The PageRank model is an algorithm built on a random walk of links across the web graph. But if you have enough usage data, you may not need to base your view of the web on that perspective, since you can use actual surfing data to help influence the search results. Microsoft has done research on this concept under the name BrowseRank. [13] In Internet Explorer 8, usage data is sent to Microsoft by default.
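For contrast with that usage-data approach, here is a minimal sketch of the random-walk model itself: PageRank computed by power iteration over a tiny hand-made link graph. The graph, damping factor, and iteration count are illustrative assumptions, not values from any real engine; BrowseRank-style systems effectively replace the uniform random jumps and link-following probabilities below with observed browsing behavior.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict mapping page -> list of outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:              # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A toy web graph, purely for illustration.
graph = {
    "home":  ["about", "blog"],
    "about": ["home"],
    "blog":  ["home", "about"],
}
print(pagerank(graph))  # "home" ends up with the largest share of the rank
```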
Google's Chrome browser phones home [14], and Google also has the ability to track people (and how they interact with content) through Google Accounts, Google Analytics, Google AdSense, DoubleClick, Google AdWords, Google Reader, iGoogle, FeedBurner, and YouTube.
Yesterday we launched a well-received linkbait, and the same day our rankings for our most valuable keywords were lifted in both Live and Google. Part of that may have been the new links, but I would be willing to bet some of it was caused by tens of thousands of users finding their way to our site.
Google's Eric Schmidt Offers Great SEO Advice
If you ask Matt Cutts what big SEO changes are coming up, he will tell you to "make great content" and so on...never wanting to reveal the weaknesses of their search algorithms. Eric Schmidt, on the