Over the past few years content has become a lucrative commodity on the web. However, as several big companies began making money from producing vast quantities of content, the web started to fill with literally millions of sub-standard articles designed purely to sit at the top of Google’s search results.

So bad was the trash-content epidemic that a slew of complaints started to hit Google, and not just from tech journals and geek forums. When the New York Times runs a high-profile article on the quality of your search results, you know that things are getting serious.

On top of that, Google is under increasing pressure from competing search engines like Bing, from more sophisticated social search, and from minor players such as DuckDuckGo and Blekko.

Against that backdrop, an announcement appeared on the official Google blog on 24th February.

The post indicated that poor-quality content that bears similarities to other content (such as “scraped” articles) or doesn’t add value would be pushed down the rankings in favour of more valuable material. The implication is that so-called “content farm” sites will get bumped from the rankings.

Usually, when Google tweaks their search algorithm (which they do up to 500 times per year), there’s no great announcement or fanfare. However, according to Matt Cutts, head of the Google web spam team, this change affects up to 11.8% of all search queries. It’s a big change, and it could have a noticeable effect on search engine results.

What does this mean for your site?

In essence, it means you should follow the eternal principles of good online content: be relevant, interesting and timely. Reworded, rehashed or simply copied content won’t work. Create content that matches the quality of your website design.

I wonder if we’ll all notice a big jump in the quality of our search results?