How Has The Farmer/Panda Update Affected SEO?


If you’ve been following the recent SEO news with any interest, you’ll be aware of Google’s recent algorithm update. It was originally known as the ‘Farmer’ update (a term coined by Danny Sullivan which quickly permeated into general usage), because the update was aimed at so-called ‘content farms’. In the last few days, however, Matt Cutts confirmed in an interview with Wired that the internal name for the update at the Google offices was the ‘Panda’ (or ‘big Panda’) update – apparently named after a Google software engineer, whom we can only assume is Navneet Panda.

The update initially appeared to be focused on ensuring that sites publishing non-original content didn’t outrank high-quality sites with more in-depth, original content. Since the update was launched, however (at first in the USA, with reports suggesting it will hit the UK within weeks), it has become clear that it was aimed at any and all low-quality sites, particularly those featuring above-average levels of advertisements (even if they were Google Ads).

With the increasing amount of criticism over the quality of the results returned by Google, the search giant needed to do something to improve the quality of the sites in its index. The farmer update (I prefer this name so I’ll stick with it) is clearly a response to this criticism, and it means that sites deemed to have duplicate or low-quality content are going to see big drops, if they haven’t already. Considering some of the current link-building tactics employed by SEO engineers, this could have a distinct knock-on effect on rankings and, subsequently, on SEO practices in general.

So how has the farmer update affected the SERPs in the UK and the US? How has the process of SEO been affected? How does Google decide what constitutes a ‘low quality’ website? What signals are they taking into account? All of these are pertinent questions, particularly if you’ve witnessed substantial drops for your own (or your clients’) websites.

Low Quality Signals

The interview with Wired gave us numerous valuable insights, which have confirmed a few of our suspicions regarding the signals the algorithm is paying the most attention to. Let’s have a look at a few of these signals in more detail:

We’ve suspected for a while that particularly aggressive advertisement placement would have an adverse effect on rankings in the long run, whether those ads were Google Ads blocks, banner ads, text-link ads or affiliate link placements. This suspicion appears to have been correct, looking at the answers given by Cutts and Singhal in the Wired interview.

Cutts: There was an engineer who came up with a rigorous set of questions, everything from, “Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?” Questions along those lines.

Looking at some of the sites most affected by the update, we can see that the presence of excessive ads (defining what is classified as ‘excessive’ is difficult, so you’ll need to use your best judgement on this) does indeed appear to be used by Google’s algorithmic solution as a major indicator of a ‘low quality’ website. For a deeper analysis of the sites that were most affected, have a read of this great post on Search Engine Land.

Obviously, excessive advertisements aren’t the only low-quality signal; from what we can gather, ‘shallow’ content appears to be another important factor:

Singhal: So we did Caffeine in late 2009.  Our index grew so quickly, and we were just crawling at a much faster speed. When that happened, we basically got a lot of good fresh content, and some not so good. The problem had shifted from random gibberish, which the spam team had nicely taken care of, into somewhat more like written prose. But the content was shallow.

‘Shallow’ content is hard to judge algorithmically, which leaves us to make a few educated guesses about how it is being assessed. Cutts and Singhal disclosed that they came up with a list of questions designed to help them judge a site’s quality, including ‘would you trust this site with your credit card’, ‘would you give medicine prescribed from this site to your kids’, etc.

It’s a fair assumption, then, that trust is an important element of the farmer update. Judging such a factor algorithmically is exceptionally hard, but with a little thought we can make some potentially pertinent judgements about the factors the new algorithm is looking at.

How Can Trust Be Judged By An Algorithm?

When we look at a website, we take in a huge amount of information about it in order to make reasonably quick judgements on how trustworthy we deem the site to be: is the site design up to date? Is the information being presented in-depth, or do they simply skim over important subjects? Is the brand recognisable? Do they have a social media presence? Do they have reviews or testimonials from previous clients? Via what methods do they accept payment? Are they based in the country you’re browsing in? Do they have a phone number and physical address listed? What kind of sites do they link out to? Do they have any discernible customer services presence? What kind of products are they selling?

Whilst a lot of this comes down to personal judgement (making algorithmic judgement particularly tough), much of it could be broken down into reasonably simple indicators which, when used in combination, could be used to judge a site’s quality algorithmically. Let’s indulge in a little bit of conjecture and discuss which elements could be looked at algorithmically to judge trustworthiness (I’ve sketched out, just after the list below, one way these indicators might be combined):

  • Brand indicators
    This is a whole blog post on its own (in fact I’ll write one next week on improving your website’s brand signals), but it’s probably fair to say that, on the whole, Google prefers to rank brands above unknown websites (due to the level of trustworthiness usually associated with recognisable brands). Things like having a strong social media presence (particularly on Twitter and Facebook) can act as a strong brand indicator, as, typically, sites set up to make a quick buck rarely have a solid social media presence.
  • Social media links
    I’ve already written a post on how social media affects SEO, as well as producing an infographic on using social media to improve your overall SEO, but it’s worth mentioning here that a diverse profile of backlinks from social media accounts is likely to act as a strong indicator of popularity, and therefore a high level of trustworthiness.
  • Domain age and freshness of content
    Typically, a lot of spam sites and sites set up to sell affiliate products will be set up and left alone, with the owner usually spending their time and effort on link-building (usually in a blackhat manner) in order to rank for relevant phrases. It’s therefore reasonable to assume that an older domain which regularly features fresh, relevant content would be considered more trustworthy than a newer domain with very little in the way of fresh content.
  • Physical address and phone number
    If the algorithm can easily locate a physical address and phone number, then that’s another tick in the trustworthy column. This is particularly true if the contact information can be matched to the same information on other trustworthy sources, such as Google Places, well-regarded web directories, etc.
  • Website size and depth of content
    On the whole, sites set up with the sole purpose of ranking for key terms related to their affiliate links are usually fairly light on content, and will rarely feature more than a couple of pages. Therefore, a site with a lot of depth is more likely to be considered trustworthy. Of course, this isn’t always the case; content farms tend to be large sites with a considerable amount of content. However, that content is usually pretty shallow in nature (especially in comparison with specialist sites on the same subjects), and such sites often feature a high amount of duplicate content and aggressive ad placement.
  • External links
    Generally speaking, most low-quality content sites are built either to support advertisements or to provide a platform for affiliate sales (amongst other reasons). As a consequence, these low-quality sites rarely link out, and when they do, it’s to a product or affiliate. Higher-quality sites, such as in-depth blogs (like this one!), tend to link out much more frequently, and tend to link to sites of an equal or higher quality.
  • Inbound links
    Inbound links have always been used to judge a website’s authority, and a similar tactic will no doubt be used to help judge trust; the more trustworthy the websites linking to you, the more trustworthy your site is likely to become.
  • Level of advertisement placement
    As we’ve already discussed, the level of advertisements placed on a website is likely being used as a key indicator of both trust and quality (due in the main to the number of low-quality sites set up specifically to support advertisements).
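
Purely to illustrate the conjecture above, here’s a rough Python sketch of how a handful of these indicators might be combined into a single trust score. Every signal name, weight and value below is invented for the sake of the example; this is emphatically not how Google’s classifier works, just a simple way of showing how individually weak indicators could add up to a useful judgement.

    # Hypothetical trust signals and weights, purely for illustration.
    TRUST_WEIGHTS = {
        "brand_strength": 0.20,    # social profiles, brand mentions, brand-name searches
        "social_links": 0.15,      # diversity of links from social media accounts
        "domain_maturity": 0.10,   # domain age combined with freshness of content
        "contact_details": 0.10,   # physical address / phone number found and verifiable
        "content_depth": 0.20,     # size of the site and depth of individual pages
        "outbound_quality": 0.10,  # quality of the sites being linked out to
        "inbound_trust": 0.10,     # trustworthiness of the sites linking in
        "ad_restraint": 0.05,      # the inverse of how aggressive the ad placement is
    }

    def trust_score(signals):
        """Weighted combination of per-signal scores, each normalised to 0..1."""
        return sum(TRUST_WEIGHTS[name] * signals.get(name, 0.0) for name in TRUST_WEIGHTS)

    # A thin affiliate site versus an established specialist blog (made-up numbers).
    thin_site = {"brand_strength": 0.1, "social_links": 0.05, "domain_maturity": 0.2,
                 "contact_details": 0.0, "content_depth": 0.1, "outbound_quality": 0.2,
                 "inbound_trust": 0.1, "ad_restraint": 0.1}
    specialist_blog = {"brand_strength": 0.7, "social_links": 0.8, "domain_maturity": 0.9,
                       "contact_details": 1.0, "content_depth": 0.8, "outbound_quality": 0.9,
                       "inbound_trust": 0.7, "ad_restraint": 0.9}

    print(trust_score(thin_site))        # a low score, roughly 0.1
    print(trust_score(specialist_blog))  # a high score, roughly 0.8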

I would suggest that the algorithm is likely looking at a combination of these elements to judge the level of trust and quality of a website, and using its relative position on a trust/quality scale to rank the site accordingly. Take the following quotes from Amit Singhal and Matt Cutts:

Cutts: … we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons …

Singhal: You can imagine in a hyperspace a bunch of points, some points are red, some points are green, and in others there’s some mixture. Your job is to find a plane which says that most things on this side of the plane are red, and most of the things on that side of the plane are the opposite of red.

Admittedly this extract is rather cryptic, but it suggests that there’s a potential scale of trustworthiness, on which the new algorithm update places a website. Regardless of how trustworthy you are as a business, if the algorithm places you at the red end of the scale, you’ll see considerable drops in ranking positions, and if you’re placed at the green end, you’ll likely see dramatic improvements in your positioning.
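
To make the ‘hyperplane’ analogy a little more concrete, here’s a toy Python sketch of a linear classifier (a textbook perceptron) learning a plane that separates ‘green’ sites from ‘red’ ones. The features and training data are entirely made up; Google’s classifier will be vastly more sophisticated, but the underlying principle of finding a separating plane is the same.

    def train_perceptron(samples, labels, epochs=100, lr=0.1):
        """samples: list of feature vectors; labels: +1 (high quality) or -1 (low quality)."""
        w = [0.0] * len(samples[0])
        b = 0.0
        for _ in range(epochs):
            for x, y in zip(samples, labels):
                if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:  # misclassified
                    w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                    b += lr * y
        return w, b

    def classify(w, b, x):
        """Which side of the plane w.x + b = 0 does the site fall on?"""
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

    # Hypothetical features per site: [content depth, trust score, ad aggressiveness]
    sites = [
        [0.9, 0.8, 0.1],   # in-depth, trusted, few ads   -> 'green' (high quality)
        [0.8, 0.9, 0.2],
        [0.2, 0.1, 0.9],   # shallow, untrusted, ad-heavy -> 'red' (low quality)
        [0.1, 0.2, 0.8],
    ]
    labels = [1, 1, -1, -1]

    w, b = train_perceptron(sites, labels)
    print(classify(w, b, [0.7, 0.6, 0.3]))   # 1: this site lands on the 'green' side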

So what happens to the sites that are in the middle of the scale? That’s a pertinent question, considering it’s where the majority of legitimate website owners will find themselves. Looking again at the answers given during the Wired interview, there’s a particularly notable quote from Singhal:

Singhal: I won’t call out any site by name. However, our classifier that we built this time does a very good job of finding low-quality sites. It was more cautious with mixed-quality sites, because caution is important.

So essentially, if you’re at the bottom of the scale you’ll go down; if you’re at the top you’ll go up; and if you’re in the middle of the scale (i.e. a ‘mixed quality’ site) then you’ll more than likely stay roughly where you are.
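
Put crudely, you could express that interpretation as a simple banding of a site’s estimated quality score. The 0.35 and 0.65 cut-offs in the sketch below are plucked out of the air purely for illustration and have no official basis whatsoever.

    # Hypothetical banding of an estimated quality score (0..1); thresholds invented.
    def likely_ranking_impact(quality_score):
        if quality_score < 0.35:
            return "likely to drop"            # the 'red' end of the scale
        if quality_score > 0.65:
            return "likely to improve"         # the 'green' end of the scale
        return "likely to stay roughly put"    # 'mixed quality': treated cautiously

    print(likely_ranking_impact(0.2))   # likely to drop
    print(likely_ranking_impact(0.5))   # likely to stay roughly put
    print(likely_ranking_impact(0.8))   # likely to improve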

How Can We Use This To Improve Our SEO?

Ah yes, this is probably the bit you’re most interested in. For those looking for ways to game the system for a quick pound, you’re not going to like this bit: if you want to improve your overall SEO, then stop looking for short cuts. Google wants to place quality, relevant sites in its index – and so it should; the happier users are with their SERPs, the more likely they are to continue to use Google over the increasing number of competitors in the search field. We, for one, welcome this update, as it means SEOs are forced to provide quality to their clients, instead of gaming the system in order to rank poor-quality client sites.

So with this in mind, let’s take a look at how you can improve your SEO with the farmer update in mind (and I’ll stop moralising about white hat and black hat practices … maybe).

  • It should go without saying that you should be aiming for high-quality, in-depth and relevant content on your website – this should have been standard practice for SEOs for a long time, but it’s worth noting that it’s more true now than it has ever been. Don’t expect to rank well if you’re trying to optimise poorly written, duplicate or thin content.
  • Get rid of aggressive banner ads from your website and ensure that any ads you decide to keep never take precedence over your content. If you’re using a thin content page to support ad blocks (regardless of their source; don’t be fooled into thinking that just because they’re Google Ads you’ll be spared) and you expect that site to continue ranking for terms related to the content, you’re in for a bit of a shock.
  • Try to improve your brand indicators – as I’ve already said, I’ll provide a blog post on this in the next week to avoid this one being unreadably long, but briefly, you need to ensure your brand signals are strong. Keep your social media accounts populated and ensure they’re linked to and from your website, with the same brand information that’s featured on your website. Make sure all your directory submissions contain the same information, include an address and area-coded phone number, and keep your information consistent across all platforms. If you have staff, then make sure they have social media accounts that mention they work for you, and preferably link to you (particularly good for platforms like LinkedIn). Proper registration signals are also important: if you’re a limited company, include your company number; if you’re VAT registered, include your VAT registration number – both the footer and the contact page are good places for these. While I’m on the subject of contact pages, ensure you don’t just have a contact form; include your address, phone number and any other relevant contact information. Brand mentions will also be important, so when you’re link-building, include brand-name anchor text links as well as your keyphrase targets (this also gives your backlink profile a more ‘natural’ feel). This might also result in organic searches for your brand name, which is another important brand indicator.
  • Look to improve your levels of trust – there’s some overlap here with improving your brand signals, which is good, as it reduces the number of things you need to be concerned about. As we’ve already discussed, there are several elements that could be analysed algorithmically in order to gauge your site’s trustworthiness, so start by focusing on those. Ensure you have a physical address and phone number and a robust ‘about’ page, regularly update your content and ensure it’s in-depth and of a high quality, build high-quality links, link out to high-quality pages and cut down on your advertising.
  • Increase your social media efforts, substantially. Google (and Bing while we’re at it) have confirmed that social media links can improve your rankings and website authority, and it’s not hard to understand why. A large number of links from social media accounts (with particular reference to Twitter and Facebook), especially if they’re from a diverse range of high authority social media accounts, is a strong indicator to search engines that your content is high quality, popular and that your site is trustworthy.
  • Forget short-cuts and black-hat practices. If you’re only interested in the short game, then you’re going to seriously struggle, particularly as the Google team continue to tweak their algorithm (which they will). The problem with employing short cuts designed to game the Google algorithm is that one change can undo all your work in a matter of minutes. Instead, don’t supply content for Google, provide it for your visitors – when you’re putting together your content, don’t even consider Google – sure, they’re the market leader now, but that may not always be the case. Build your site and your business around quality of content, product and/or service. Play the long game and you’re far more likely to see benefits in organic search, particularly when these efforts are combined with legitimate and high-quality SEO practices.

*I’ve intentionally avoided discussing article distribution platforms and article spinning in this article, as I plan to write a separate post on this very issue. If you’ve got any specific questions on this, let me know in the next few days and I’ll try to include them. If you want a quick answer on whether or not they’re still worth using for SEO, the answer is a very complicated yes (just stay away from Ezine Articles and Articlebase).

Conclusion

It’s been an interesting update from Google, and one of the biggest in the last couple of years (Matt Cutts suggested it affected around 12% of queries, which is a huge number of searches). Generally, I think it’s a good move from Google, and one that will hopefully improve both the quality of organic search results and the practices of SEO engineers. A good friend of mine recently said to me, ‘I guess it’s time to go back to the old school methods, like creative link building and high-quality link-bait’. He’s spot on, and to me that’s a good thing, as relevancy and quality should never have left the forefront of the SEO engineer’s mind.

Don’t expect this to be the end of this update either; Google are constantly tweaking their algorithm and this update will be no different. I’d expect to see some more changes in the SERPs in the coming months, and I wouldn’t be surprised to see a few more unhappy webmasters and black-hat SEOs in the near future.

—————————

Post by John Pring – Follow him on Twitter (@john_pring)

