January 4, 2020

Search Engine Ranking Factors Update 2011

SEOmoz has updated its search engine ranking factors compilation, which it publishes every two years. We think this is essential reading for everyone in digital marketing, not just SEO specialists, since it shows where success in SEO comes from and what to brief your agency on. In 2009 the top 5 ranking factors were:
  1. Keyword focussed anchor text from external links
  2. External link popularity
  3. Diversity of external link sources
  4. Keyword use anywhere in title tags
  5. Trustworthiness of Domain based on Link Distance from Trusted Domains
Here are the key charts of ranking factors from the report [Editor's note: I use this to show that on-page keywords are relatively unimportant compared to anchor text and domain authority]:
While it is still early days for this research, the 2011 update from SEOmoz is built from a survey of 132 SEO professionals and correlation data from 10,000+ keywords. Check out slides 12 and 13 in the SlideShare presentation below:
The 2011 analysis loses some of the clarity of the 2009 analysis. Key differences to take note of in 2011 are:
  • Introduction of page level social factors (is your content shareable & good enough for people to want to share?)
  • The overall value of external links has shrunk (though new fields have been added which may skew data)
  • Domain-level brand metrics are introduced as a key factor
It's that magical time of year when all of us who foolishly assumed the mantle of clairvoyance last December check up on our abilities and repeat our arrogant presumption again. Not surprisingly, something compels me to try again, despite the odds, but I am feeling a bit whimsical tonight, so let's make a game out of the prediction practice. 
For each prediction (mine and others), we can grade using the following points system:
  • Spot On (+2) - when a prediction hits the nail on the head and the primary criteria are fulfilled
  • Partially Accurate (+1) - predictions that are in the area, but are somewhat different than reality
  • Not Completely Wrong (-1) - those that landed near the truth, but couldn't be called "correct" in any real sense
  • Off the Mark (-2) - guesses which didn't come close
The rule is - if the score is lower than +1, the blogger/industry leader/author isn't allowed to make predictions for the coming year.
So let's mark up my 8 predictions from last year and see whether new predictions are permitted:
  1. This Real Time Search Thing is Outta Here (+1) - technically, it's still around, though far less prevalent. I said "In 2010, I think this fades away. Perhaps not entirely, but we won't be seeing it for nearly as many queries with the prevalence we do today," which, on the scoring scale, probably deserves a "partially accurate."
  2. Twitter's Link Graph is the Real Deal (+2) - my guess seems oddly prescient when compared to Google + Bing's interview a few weeks back. "Google's not going to just take raw number of tweets or re-tweets. I think we're already seeing the relevance and reputation calculations in their decisions of which tweets and sources to show in the real-time results, and I expect that algorithms/metrics like PageRank, TrustRank, etc. will find their way into how Google uses the real-time data." I wonder if my luck can last.
  3. Personalized Search is Here to Stay (-1) - The title of this guess would make you think I'd got it right, but the substance is lacking. I noted that "If it's proven that you can get organic benefits by attracting PPC clickthrough, this may be the new "paid inclusion" for 2010, and could drive bid prices up massively as companies compete not only for paid listing clicks, but for the chance to earn "organic" positioning as well." Personalization bias didn't go towards brand exposure, and it actually hasn't got much stronger (apart from the localization element, which I didn't predict). Technically, it's still around, but it didn't become the juggernaut I thought it would.
  4. It's Going to Be a Two-Engine 80/20 World (+2) - Google's market share of web searches sending external traffic is likely very close to this (although Comscore reports only 66%, those numbers are heavily biased due to non-web-search "search" activity counted in the figures). A far better source would be something like StatCounter's referral data from the 15 billion pageviews/month on 3 million+ websites, which reports 81.88% for Google, and ~18% for Bing/Yahoo!. Given that Ask.com, Cuil and Yahoo! all folded their search operations this year, and Facebook/Twitter/Somebody Else Big hasn't entered the field, I'm giving this a "Spot On."
  5. Site Explorer and Linkdomain Will Disappear (+1) - Linkdomain is gone (at least in the US, and soon in most other countries), but it appears we'll still have until 2012(ish) with Site Explorer, so I'm giving this a (possibly slightly generous) rating of "partially accurate."
  6. SEO Spending Will Rise Dramatically (-1) - This one depends on the meaning of "dramatically." SEMPO's data suggests that 43% of marketers "expect" to spend more on SEO, but this is down 2% from 2009's survey. SEOmoz's own survey unfortunately doesn't compare apples to apples (we haven't asked the same question multiple years in a row and thus can't compare well). As of now, no new sources have come forward with data we're aware of (Forrester + eMarketer being the usual suspects). Thus, I'll give a "not completely wrong," since we really don't know.
  7. 2010 is the Year of Conversion Rate Optimization (-1) - Again, I'm going to say this was "not completely wrong" but it's also very tough to measure. We've had more speakers on CRO at search and marketing events of all varieties. Anecdotal reports would indicate CRO is becoming a more common and popular practice for organic marketers, but without solid numbers, it's hard to know. We can presume, however, that if there aren't lots of studies and data reports touting it, this probably wasn't "the year."
  8. More Queries Will Send Less Traffic (-1) - Given the launch of Google Instant, the personalization and localization of results, increased ranking inconsistency and more universal/vertical results in the SERPs, I'm going to say this is possibly near the mark, but not definitively correct. Google Instant, in particular, appears not to have moved the needle much on search demand and queries sending traffic. In fact, the only reason this is "not completely wrong" is due to my clever non-prediction of how many queries would send how much less traffic. :-)
Tallying the numbers, I'm seeing +6 and -4, for a total of +2, which means new predictions for 2011 are permitted. I also invite you to analyze some of the many lists of predictions for this past year here. If my calculations are correct, Mashable and TechCrunch are out of the predictions business (the latter just barely), while the New York Times' Bits Blog should continue (though, like me, they made some pretty pansy predictions).
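For anyone who wants to check the arithmetic, here is a minimal Python sketch of the scoring scheme and tally described above; the prediction labels are just made-up shorthand for the eight predictions graded above:

# Minimal sketch of the prediction-grading scheme described above.
# Grade values: Spot On (+2), Partially Accurate (+1),
# Not Completely Wrong (-1), Off the Mark (-2).
GRADES = {
    "spot_on": 2,
    "partially_accurate": 1,
    "not_completely_wrong": -1,
    "off_the_mark": -2,
}

def tally(graded_predictions):
    """Sum the grades and decide whether new predictions are permitted."""
    score = sum(GRADES[grade] for grade in graded_predictions.values())
    allowed = score >= 1  # rule: a score lower than +1 means no predictions next year
    return score, allowed

# The eight 2010 predictions as graded above (labels are made-up shorthand).
predictions_2010 = {
    "real_time_search_fades": "partially_accurate",
    "twitter_link_graph": "spot_on",
    "personalized_search": "not_completely_wrong",
    "two_engine_world": "spot_on",
    "site_explorer_linkdomain": "partially_accurate",
    "seo_spending_rises": "not_completely_wrong",
    "year_of_cro": "not_completely_wrong",
    "queries_send_less_traffic": "not_completely_wrong",
}

score, allowed = tally(predictions_2010)
print(score, allowed)  # -> 2 True, matching the +2 tally above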
Without further ado, my predictions for SEO in 2011:

#1: Someone Proves (or a Search Engine Confirms) that Clicks/Visits Influence Rankings

I'm taking a chance on this one, but I've been hearing from more and more SEOs that there's some correlation between earning clicks and moving up in the rankings. In 2011, we'll get confirmation, either through testing or an admission from an engine that click-through rate from the SERPs, visit count outside of search (or diversity of sources), or other usage-based data is in the ranking algorithm (or a method they use to help identify spam).
Clicks/Visits Matter

#2: Google Local/Maps Adds Filters and Sorting

The big reason Yelp is so much better than Google Maps/Local for finding a good local "place" isn't just the reviews (which Google aggregates from Yelp anyway). It's the filters that let me sort by features/pricing/proximity/open status/etc. Google's been playing the silly game of forcing users to choose search queries to enable rough, imperfect filtering, but 2011 is going to see the search engine shift to a model that allows at least some important filters/feature-selection.
Local Filters in Yelp

#3: Social Search Will Rise

There's power in social media search, and Google/Bing's efforts to date have been lackluster at best. I suspect in 2011, we'll see the nascent beginning of search that leverages Twitter/Facebook/LinkedIn connections to find results from your friends. It's possible this will start niche-based only (search articles your friends have shared, ala Trunk.ly), but it could also be broader - possibly something from Facebook or Twitter themselves.

#4: Rank Tracking Will Be Possible Through the Referral String

Google's been slowly growing the percentage of queries that contain the numeric position of the result in the referral data. Given how much this information means to marketers (even those who realize it's frequently not telling the whole story), and how many automated scraping requests go through each day, I'd venture to guess that Google will increase this further and maybe even add some support for it in GA (why force your engine to work harder and your impression counts to suffer unnecessarily?).
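As a rough illustration, here is a minimal Python sketch of pulling the position out of a referrer string; it assumes the position arrives in a "cd" query parameter, as it did in some Google referrer URLs at the time, and the example URL is purely hypothetical:

# Sketch: extract the ranking position from a Google organic referrer URL,
# assuming the position is passed in the "cd" parameter.
from urllib.parse import urlparse, parse_qs

def rank_from_referrer(referrer):
    """Return (query, position) from a Google referrer URL, or (None, None)."""
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None, None
    params = parse_qs(parsed.query)
    query = params.get("q", [None])[0]
    position = params.get("cd", [None])[0]
    return query, int(position) if position else None

# Hypothetical referrer string for illustration only.
ref = "http://www.google.com/search?q=rank+tracking&cd=4&hl=en"
print(rank_from_referrer(ref))  # -> ('rank tracking', 4)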

#5: Mobile Will Have a Negligible Effect on Search/SEO

For years, I've heard the prognostication that SEO and search are going to be flipped on their heads once mobile query usage takes off. I'll boldly predict that not only will mobile usage of search NOT skyrocket in 2011 on the long-awaited J-curve, but that the mobile and normal web browsing experiences will continue to merge toward a single experience, thus negating much of the need for mobile-specific sites and SEO. There'll always be mobile-related marketing opportunities in games and local (though these are hardly limited to mobile devices), but mobile SEO will pretty much just be "SEO."
Mobile SERPs

#6: Software Will Become an SEO Standard

For the decade I've been in SEO, software and tools have always been a "nice-to-have" and not a "must-have" (with the possible exception of web analytics). In 2011, I see several SEO software companies growing to critical mass based on the market's demand, possibly including: Raven Tools, Conductor's Searchlight, BrightEdge, Searchmetrics, RankAbove, DIYSEO and/or GinzaMetrics. HubSpot, while more of a CMS/holistic marketing tool, will also likely fit in this group as their SEO offerings get stronger. Oh, and SEOmoz's Web App could do pretty well, too :-)

#7: We'll Start to Move Away from the Title "SEO" to Something More All-Inclusive

For years, I've prided myself on being an SEO and embraced the title, the community, and the positives and negatives that come with it. But with the search engines expanding so far afield in the signals they consider and the verticals/media types they include, I have to face facts - SEO today calls for much more of a talented generalist than a pointed specialist. We need to be savvy about and good at so many facets of organic web marketing that to call us "SEOs" is less empowering and more limiting than in the past.
SEO Skills

Now I'd love to hear some of your predictions for 2011 and see whose scores from predicting 2010 give them the right to guess about 2011.
p.s. I didn't take the obvious "Google's going to crack down on more link spam" or "Social's going to be even more important" prediction gimmes, because I just don't think I'd respect myself tomorrow morning.

January 1, 2020

Google Updates: A Brief History of SEO from 2000-2010

Here’s an updated list of Google algorithm updates as of 2019.

The Search Engine Optimisation (SEO) industry has changed tremendously in the last ten years. (If you want to see how our company evolved in the 20 years since our 1996 founding, take a stroll down memory lane with our SEO company history infographic.) The industry is quick-moving, and ever since Matt Cutts stated that Google makes 300 to 400 changes to the algorithm each year, it's been clear that rankings change quite a bit, for a variety of reasons. For competitive queries like "car insurance" you can see changes on nearly a daily basis, as Google continues to chase relevance for users. Over the past ten years, some of these changes have had disruptive impacts not only on the top-ranking results in the SERPs, but also on traffic to the websites behind those results and the businesses behind those websites. This post will cover some of the significant Google updates that have occurred since 2000.

A Brief History of SEO

Before there was even a word for Search Engine Optimisation, webmasters would discuss their strategies for getting websites ranking on forums. Webmaster World has a great post detailing the major events in SEO prior to 2000, dating back all the way to 1995 – the era when search engines were akin to the Yellow Pages, with AAA-style listings at the top. From then on, getting your websites to rank well has been a constant game of cat-and-mouse between webmasters and the search engines: SEO was born.

At the time people were not even calling it "SEO", but they realised that they could manipulate the rankings and shared their strategies online (and presumably kept many secret as well). I highly recommend that you check out the post, as it is a fascinating read and gives you a really good appreciation for how far we have come as an industry.

Fast forward to 2000, when Google broke onto the scene with its new PageRank algorithm, and it became clear that webmasters around the world had to adapt and change.
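As a refresher on the core idea, here is a toy power-iteration sketch of PageRank over a made-up three-page link graph; this is the textbook formulation, not Google's production implementation:

# Toy power-iteration PageRank over a tiny link graph.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                share = damping * rank[page] / n
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Made-up three-page site purely for illustration.
toy_web = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
print(pagerank(toy_web))  # pages with more inbound links earn more rank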

Google Updates from 2000 to 2010

2000-2003

Between 2000 and 2003 PageRank would generally be updated monthly and rankings would fluctuate accordingly. I remember hearing Todd Friesen tell a story about long sleepless nights waiting for the updates to arrive, then panicking (and refreshing like crazy) until the new rankings resolved. Webmasters would post their findings on Webmaster World, and once the updates were complete they knew it was about a month until the next set arrived. It was in 2003 that the people at Webmaster World started naming the updates after hurricanes, with Boston, Cassandra, Dominic, Esmeralda, all the way to the infamous Florida update.

During this time SEO was pretty spammy. It was all about getting high PageRank links, or even just links from wherever and whoever you could. Footer links on high PR pages would catapult you to the top, and link farms to throw PR to your websites were easy to deploy and effective.

Florida Update – November 2003

The Florida update was the first "game changer" update, as many top-ranking sites simply disappeared from the rankings. Sheer panic erupted across the board because it seemed that Google had finally cracked down on the manipulative tactics being used to get pages to rank.

Barry Lloyd and many others theorised that the engineers at Google had invented a way to detect pages that had been over-optimised and simply removed them from the index. Ian Lurie recalls that the sites that didn't disappear were the natural, content-rich ones with good, well-written content.

Brandy Update – February 2004

The Brandy Update emphasised Latent Semantic Indexing – the idea of using synonyms on your website. Someone by the name of “GoogleGuy” (aka Matt Cutts) made an interesting point just before the update that webmasters who do not “think about search engines” generally do not bother to include word variants – and spammers can easily create doorway pages full of word variants. What does that mean? You can’t outsmart Google just by throwing keyword variations into your text – it has to be natural.

LSI is not simply opening up a thesaurus and replacing every Xth instance of "dog" with "canine". As Google crawls and indexes billions of pages, it gets a pretty good idea about word associations and what words should appear on a page. If you want to learn a bit more on the subject, read our post on Latent Semantic Indexing.
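To illustrate the underlying idea (and only the idea; this is the classic LSI technique, not a description of Google's actual system), here is a small Python sketch that runs a truncated SVD over a made-up term-document matrix so that related terms like "dog" and "canine" end up close together:

# Toy latent semantic indexing: a truncated SVD of a term-document matrix
# places terms that co-occur ("dog", "canine") close together in concept space.
import numpy as np

terms = ["dog", "canine", "puppy", "engine", "car"]
# Columns are toy documents; rows count term occurrences in each document.
term_doc = np.array([
    [2, 1, 0, 0],   # dog
    [1, 2, 0, 0],   # canine
    [1, 1, 0, 0],   # puppy
    [0, 0, 2, 1],   # engine
    [0, 0, 1, 2],   # car
], dtype=float)

U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
k = 2                            # keep only the top 2 latent "concepts"
term_vectors = U[:, :k] * s[:k]  # each row is a term in concept space

def similarity(a, b):
    va, vb = term_vectors[terms.index(a)], term_vectors[terms.index(b)]
    return float(np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb)))

print(similarity("dog", "canine"))  # high: they appear in the same documents
print(similarity("dog", "engine"))  # low: they never co-occur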

Alex Walker wrote a post over on SitePoint about the Brandy Update, highlighting the important changes he thought came with it: an increase in index size; Latent Semantic Indexing (using synonyms); grouping websites into neighbourhoods; and de-emphasising on-page elements like <h1> and <b>.

Allegra Update – February 2005

In a press release, 6S Marketing reported that Allegra was a remedy for the "Sandbox Effect" that many websites had been facing since 2004, and there were many posts over at Webmaster World that supported this theory. The forums over at Search Engine Watch mentioned that LSI factors could have been emphasised, and these were the two main changes reported about the update.

Bourbon Update – May 2005

The Bourbon Update was another big change to the algorithm focused on getting rid of spam from the index. There was an article on COMMbits stating that its purpose was to tackle duplicate content, non-thematic linking, low-quality reciprocal links, and fraternal linking. In a post on Webmaster World, Matt Cutts had a very long discussion about general updates at the time and mentioned re-inclusions of sites removed from the index, and there were additional posts about breaking out of the sandbox.

Jagger 1, Jagger 2, and Jagger 3 – October 2005 to November 2005

The Jagger updates, like most of the major updates, were released with the intention of dealing with an increasing amount of webspam. The sites tackled were scraper sites, AdSense directory sites, and pages using deceptive CSS techniques, along with another crackdown on reciprocal-linking abuse.

Before the update, Matt Cutts issued a warning about hidden text on sites, and after Jagger he invited webmasters who thought their site had been mistakenly removed for hidden text or text links to request re-inclusion. Google has several patents relating to relevancy, and it may have bumped up their importance in the algorithm. Here is a post that outlines one author's ideas on the changed ranking factors.

There was an interesting discussion over at Tech Patterns in which some webmasters noticed their rankings did not fluctuate if they used white hat tactics, and that most of the plummets in rankings hit sites reciprocal-linking and spamming their way to the top. Of course, this is one person's opinion on one forum, but it seems in line with the ultimate goal of these updates: Google wants to deliver clean, non-spammy, and useful results.

Personalized Results – June 2005

In June 2005 Google made its first mass release of Personalized Results. The purpose of this change was to shape the results shown to a user logged into their Google Account based on which websites they had visited and their previous searches. You can read more about this change over at Wikipedia.

BigDaddy – December 2005

The BigDaddy update was a software upgrade of Googlebot and affected the way Google dealt with links. Matt Cutts said that the kinds of sites affected by the update "had very low trust in the inlinks or the outlinks of that site. Examples that might cause that include excessive reciprocal links, linking to spammy neighborhoods on the web, or link buying/selling." Web Workshop reported that this is when outbound links became an important factor, as linking to spammy niches like omega-3 fish oil or ringtones could affect your rankings. Matt also mentioned the importance of having relevant links pointing to your site in order for Google to crawl more of its pages.

Reducing the Impact of Googlebombs – Jan 2007

We have all seen Googlebombs before – where a group of people try to influence the rankings for an obscure term as a joke, such as the results for "weapons of mass destruction" returning a fake 404 error page – and in January 2007 Google announced that it had tweaked the algorithm to detect them.

Universal Search – May 2007

The impact of Universal Search on Search Engine Optimisation was that the SERPs integrated material from Google’s multiple channels and opened “back-doors” to the first page. If you could get a video ranking, it could sneak to the top of the results pages of very competitive keywords. The change brought together images, videos, news, maps, and websites into a single set of results.

Real Time Search – December 2009

Real Time Search was about including fresh, topical content in the SERPs. If there is an earthquake or other major event, it makes sense that queries about that location bring up links to relevant news articles even if they did not have long-standing high quality links or other time-related signals of authority. RTS brought with it the idea of Query Deserves Freshness (QDF), as Google had to determine what queries need frequent updating. Here’s a video from Google about the change.

Vince – February 2009

The Vince update, named after the Google engineer who invented it, is known as the Google update that boosted the rankings of popular brands. Matt Cutts went on the record saying that the change wasn't about boosting brands per se, but rather about putting more weight on domain authority, trust, and reputation (which big brands generally have). This change really highlighted the importance of working towards establishing credibility and building an authoritative domain.

Caffeine – August 2009

On August 10, 2009 Google began inviting people to test its "next generation infrastructure". It finished rolling out in June 2010, and touted both "50 percent fresher results" and the largest index Google had ever collected. Google went from having layers of its index that each updated at a different rate (each requiring an entire re-crawl of the web before updates could be rolled out) to smaller portions that update on a continuous basis.
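Purely to illustrate the difference in approach (this is a toy inverted index, not Caffeine's actual architecture), here is a short Python sketch contrasting a full batch rebuild with folding in one document at a time:

# Conceptual contrast between batch re-indexing and continuous per-document
# updates, in the spirit of the Caffeine change. Toy inverted index only.
from collections import defaultdict

def build_index(corpus):
    """Batch approach: rebuild the whole inverted index from every document."""
    index = defaultdict(set)
    for doc_id, text in corpus.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def update_index(index, doc_id, text):
    """Incremental approach: fold a single new or changed document into the index."""
    for word in text.lower().split():
        index[word].add(doc_id)
    return index

# Made-up pages for illustration.
corpus = {"page1": "google caffeine update", "page2": "fresher search results"}
index = build_index(corpus)                                # old model: full rebuild
update_index(index, "page3", "continuous index updates")   # new model: per-page update
print(sorted(index["update"]))   # -> ['page1']
print(sorted(index["updates"]))  # -> ['page3']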

Mayday – May 2010

At the end of April / start of May, Google made a significant change to its algorithm, looking for higher-quality sites to surface for long-tail queries. Search Engine Land reported that the sites most hit by the change were those with many product pages without strong links pointing to them. In a Google Webmaster Help video, Matt Cutts described the change and suggested ways that people could improve the quality of their sites by asking themselves the following questions: "What sort of things can I do in terms of adding great content… [and] do people consider me an authority?"

Instant Search – September 2010

Google Instant is the most recent change to the search engine, and it is all about updating results as you type. There was an abundance of speculation that this would have huge effects on search engine optimisation, but so far those effects appear to have been exaggerated. You can read our post on Google Instant Search for more information.

In November Instant was updated with “Instant Preview” showing users an image of the page before they click through to the link.

“Decor My Eyes” Update – December 2010

A story erupted around the web this week about a website that was using bad customer service to get incoming links, which in turn supported its rankings for many competitive terms. Today, Google announced that it has tweaked its algorithm to respond to cases like this and to try to prevent them from happening in the future.

Conclusions

After looking at the changes Google has made in the past ten years, I think the biggest conclusion to draw is that you need flexibility in your approach and definitely should not focus on a single ranking factor. The algorithm is tweaked hundreds of times each year, and what works today might not work tomorrow or, in extreme cases, might even be considered spam. It's clear from Google's relentless pursuit of spam in its index that your efforts should focus on producing quality content and establishing credibility and authority by attracting natural, relevant links from authoritative sites. At Bruce Clay we break these down into Technical, Expertness and Content.

What do you think? Have I missed any big updates? Anything you would like to add? I would love to hear your thoughts in the comments.