Scrapers and the Duplicate Issue

by Gary on August 7, 2008 · 11 comments

in SEO

We’ve all been warned again and again about the problems that duplicate content can bring. The truth, however, is that nowadays it’s not such a big deal anymore. This doesn’t mean you should start plagiarising other people’s stuff and simply offer duplicate content on your site. What it does mean is that you shouldn’t stress out whenever scrapers decide to “syndicate” your content for you. The reason I say this is that Google said so!

According to Sven Naumann, “you shouldn’t be very concerned about seeing negative effects on your site’s presence on Google if you notice someone scraping your content,” because Google is pretty good at determining which site is the original publisher of the content. Your site will NOT be penalised for the scraper’s work, and duplicate content WILL NOT lead to lower rankings for your website. The worst that can happen is that the section containing the duplicate content gets filtered out (meaning it would affect your rank neither negatively nor positively). And if you are recognised as the original publisher of the content, even that wouldn’t happen. To make sure you get credit for your original work, only syndicate to websites that will not only give you a byline but also link back to your site. As for the scrapers, don’t even think about them — again, Google says they’re good at figuring out which is which.
