
Google Panda update: the end of poor quality content?


charlie - 6th December 2011

Google’s Panda update signals the end of poor quality content on the web. Google wants to improve people’s browsing experience, and in an online world where content is king, that means boosting the visibility of sites with better quality content. This is the premise behind the Panda update.

How to survive post-Panda? In short, your website needs to boast quality, authoritative content.

Why ‘Panda’?

Aside from their cuteness, pandas are known for eating their way through the low-level forest they inhabit. It is this characteristic that inspired the name of Google’s latest update, whose central goal is to make low-level, poor quality content as invisible as possible and to boost the visibility of better quality content.

Lower quality, or ‘filler’, content is not something search engines are keen on. Why? Because it lacks focus and useful information, and such content only pollutes search engines’ results pages – which, if enough internet users notice, damages their reputations.

While the Panda update represents good news in theory, it isn’t always plain sailing. Because so much content is being ‘cut down’, some users are being put off – and in some cases prevented – from discovering the high quality content they require.

So, how do search engines determine the quality of a website’s content?

Firstly, consider your own browsing experiences. How many times has Google brought up a seemingly relevant page in its search results, only for you to click through and discover sub-standard, irrelevant or ill-thought-out copy?

There’s a simple reason for this – pre-Panda, too many pages were designed to attract a user’s attention and earn a click, not to provide the content the reader required. More often than not, these low-quality pages featured an advert, which often served to frustrate the user further.

Central to a search engine’s functionality is indexing, which essentially involves sifting through large volumes of data and compiling a list of relevant, accurate pages. By recording which results are returned for each query, Google’s algorithm can work out how useful and relevant those results are in relation to the searched term.
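
To picture how that works, here is a deliberately simplified sketch in Python of an inverted index that ranks pages by how often a searched term appears. The toy pages, the tokeniser and the scoring are invented for illustration; a real search engine weighs far more signals than raw term counts.

```python
from collections import defaultdict

# A toy corpus of "pages" – invented purely for illustration.
pages = {
    "page-a": "panda update quality content for readers",
    "page-b": "cheap deals cheap deals cheap deals click here",
    "page-c": "how the panda update rewards quality content and quality pages",
}

# Build an inverted index: term -> {page id: number of occurrences}.
index = defaultdict(lambda: defaultdict(int))
for page_id, text in pages.items():
    for term in text.lower().split():
        index[term][page_id] += 1

def search(term):
    """Return pages containing the term, highest occurrence count first –
    a crude stand-in for a real relevance algorithm."""
    matches = index.get(term.lower(), {})
    return sorted(matches.items(), key=lambda item: item[1], reverse=True)

print(search("quality"))  # page-c (2 mentions) ranks above page-a (1 mention)
print(search("panda"))    # only the pages that mention 'panda' are listed
```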

While this system proved useful, SEO aficionados soon learnt to trick search engines, principally through keywords. The more a keyword appeared in a page, the more relevant – and, at first glance at least, focussed – that page appeared. Google couldn’t be fooled completely, though: if a keyword was included too many times, it would be regarded as ‘keyword stuffing’.

While a number of organisations have got away with this practice, it is not something that search engines have been pleased about – essentially because it compromises how effective they are. A wider consequence is that keyword stuffing has given online content a bad name – something that can be gauged by asking yourself how many times you have clicked the ‘back’ button after visiting a listed page.
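
As a rough illustration of the trick, the short Python sketch below measures keyword density in a piece of copy. The sample text and the percentage cut-off are entirely made up for the example; Google has never published the figures its systems actually use.

```python
def keyword_density(copy, keyword):
    """Share of the words in the copy that are the target keyword."""
    words = copy.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

sample_copy = ("Cheap widgets! Buy cheap widgets today, because our cheap "
               "widgets are the cheapest cheap widgets around.")

density = keyword_density(sample_copy, "cheap")
print(f"Keyword density: {density:.0%}")  # 25% – 'cheap' is every fourth word

# An arbitrary, illustrative cut-off: real ranking systems weigh many
# signals, not a single percentage.
if density > 0.05:
    print("This copy would likely be flagged as keyword stuffing")
```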

Fortunately, there are quite a few strings to a search engine’s bow. Google, for example, can record behavioural data – in particular bounce rates and how long people spend on certain pages – to build a concrete picture of how users engage with a site. If a site has a particularly high bounce rate, a search engine can comfortably assume its pages aren’t particularly relevant. The result? It falls down the listings.
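
By way of illustration, the snippet below shows how a bounce rate is typically calculated, a bounce being a visit that views just one page before leaving. The session data is invented for the example.

```python
# Each session is the list of pages one visitor viewed – invented sample data.
sessions = [
    ["/panda-update"],                       # bounce: one page, then gone
    ["/panda-update", "/services"],          # stayed and browsed
    ["/panda-update"],                       # another bounce
    ["/panda-update", "/blog", "/contact"],  # engaged visitor
]

bounces = sum(1 for pages_viewed in sessions if len(pages_viewed) == 1)
bounce_rate = bounces / len(sessions)

print(f"Bounce rate: {bounce_rate:.0%}")  # 50% – half the visits left straight away
```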

The flip side of the coin concerns sites with low bounce rates, where people spend longer browsing the pages. The likelihood is that the copy has been read and the visitor has found the information they needed. Search engines recognise this, and will rank the page higher.

Pages such as these are rewarded for recognising the true nature of the internet – as a communication platform, not simply a means of attracting the attention of potential targets. Search engines frown upon businesses – and there have certainly been plenty of them – that try to use the internet as a fast track to success. While it can be an effective promotional tool, that alone won’t guarantee success – and this is where quality content comes into play.

Relevant, thought-out content has two main benefits – firstly, you’re more likely to keep the attention of targets and earn their custom; secondly, readers may share articles with their contacts via social media platforms. The latter has a particularly positive knock-on effect, in that people are more likely to trust content they receive from their friends rather than from a seemingly anonymous marketing campaign.

A quality content campaign involves something of a seismic shift – rather than prioritising its own needs and goals, a business prioritises those of its customers. This involves not only relevant, interesting content, but also a good overall customer experience.

To conclude

The Panda update exists as a means of determining how high quality a page’s content is. This is achieved via quantifiable metrics – particularly the length of time a user spends on a page and those all-important bounce rates.

The implication for SEO strategists is that what has worked before is unlikely to work again. Businesses looking to improve their online presence in the wake of Panda have therefore been forced to re-think their approach. While this may mean minor tweaks to content, some copy is having to be entirely re-written.

Forward-thinking businesses are also aware that updates such as Panda are going to be a common theme in the years to come – which points to an interesting future for quality content.
