Information on Duplicate Content Issues with CMS & Blogs
Please note: since this article was first written back in January 2007, Google has made its stance on duplicate content abundantly clear; see Google's own guidelines on duplicate content for the details.
If you have not yet read page one of this article, 'Duplicate Web Site Content - One way to harm your site's rankings', please read that first.
I've already outlined the percentage of original content a page is generally recommended to have so as not to fall foul of the dreaded duplicate content filter, and the folly of reproducing other people's copyrighted material from industry publications.
Let’s take a look at two other areas where misconceptions
often occur.
Duplicate Content with Content Management Systems and Blogs
My site is managed by a CMS and there is a
certain amount of repetition from page to page. Is
this a cause for concern?
Under normal circumstances, the amount of duplication caused by a CMS (Content Management System) is minimal, typically about 10% to 15%. Here, the only cause for concern should be whether each individual page offers 'value' to its visitors in the shape of plenty of unique, informative content.
Once again, as long as you ensure that at least 75% of each page's content is original, you should have nothing to worry about, aside from the fact that many Content Management Systems are not what you might call ideal for web promotion purposes. The only other time there might be an issue is when product descriptions enter the equation.
The problem with these is that most webmasters will simply use the
manufacturer’s original description, which of
course has been used on countless other web sites before,
and will be flagged as duplicate content. To minimize this problem, it really does pay to write your own
product descriptions wherever possible.
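Search engines do not publish their thresholds, so figures like the 75% above are best treated as rules of thumb rather than hard limits. If you want a rough, do-it-yourself check of how closely your copy tracks a stock manufacturer description, a simple word-shingle comparison along these lines can help. (This is a minimal sketch; the sample texts and the four-word shingle size are illustrative assumptions, and real duplicate content filters are far more sophisticated.)

```python
# Rough estimate of how much two pieces of page copy overlap.
# Rule-of-thumb check only; real duplicate filters are more sophisticated.

def shingles(text, size=4):
    """Break text into overlapping runs of `size` words."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def overlap(page_a, page_b, size=4):
    """Jaccard similarity between the two pages' word shingles."""
    a, b = shingles(page_a, size), shingles(page_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical example: a stock manufacturer description versus your rewrite.
manufacturer = ("The XC-100 widget features a durable alloy casing, "
                "a two-year warranty and free shipping on all orders.")
your_copy = ("Our take on the XC-100 widget: a durable alloy casing, "
             "backed by a two-year warranty, with honest hands-on notes.")
print(f"Estimated overlap: {overlap(manufacturer, your_copy):.0%}")
```

If the figure comes back high, rewriting the description in your own words is the obvious fix.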
Surely the way most blog sites store and serve data must cause duplicate web site content issues?
Although on the face of it blogs might be perceived as prime targets for the old duplicate content filter, given the myriad ways in which they allow visitors to search for posts (tags, archives, top posts, etc.) and the different URLs under which the same content is displayed depending on the searcher's approach, there seems to be little in the way of duplication penalties.
Then again, since the vast majority of blogs have a fairly similar way of doing things, you would expect search engines to have made allowances for this. Even so, this is one area where trouble may well be looming on the horizon, as blog spam leads to more and more abuse of this particular medium.
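To see why blogs look risky on paper, consider how a single post can be reached via its permalink, a tag page and a monthly archive, each with its own URL. If you want to check your own blog for this, one crude approach is to group URLs by a hash of the extracted post text, as in the sketch below. (The URLs and post bodies are hypothetical stand-ins; in practice you would fetch each page and pull out the main post text first, which is the hard part.)

```python
# Group a blog's URLs by a hash of their main post text to spot the
# same content being served under several different addresses.
# URLs and bodies below are hypothetical stand-ins for fetched pages.
import hashlib
from collections import defaultdict

pages = {
    "https://example.com/2007/01/duplicate-content/": "Full post text ...",
    "https://example.com/tag/seo/duplicate-content/": "Full post text ...",
    "https://example.com/archives/2007/01/?p=42":     "Full post text ...",
    "https://example.com/top-posts/another-post/":    "A different post.",
}

groups = defaultdict(list)
for url, body in pages.items():
    digest = hashlib.sha1(body.encode("utf-8")).hexdigest()
    groups[digest].append(url)

for urls in groups.values():
    if len(urls) > 1:
        print("Same post served under several URLs:")
        for url in urls:
            print("  " + url)
```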
Can it hurt me?
The simple fact is that duplicate content can and will
harm a site’s rankings, and regaining lost search
engine rankings is usually a slow and painful process.
Once again, see Google's own guidance for how it views duplicate content these days (Nov 2009).
So, whilst spammers are still looking for ever more imaginative ways of beating the latest search engine algorithms, duplicate content filters and the assorted other means search engines use to keep their results as relevant as possible, it is still far more beneficial all round to keep your site design and web promotion firmly aimed at your human audience.