Duplicate Content Filters to Destroy Article Marketing? In a Word, No
Copyright © February 18, 2006 by Mike Banks Valentine
Rumors are circulating that recent search engine updates are threatening to penalize web sites carrying articles by niche industry experts due to wide distribution and use of those articles on multiple sites. Not so.
Duplicate content filtering confuses everyone. It is absolutely not new and has been in effect for years, but it is constantly refined in search engine algorithms to filter out abuses. Any suggestion that search engines target article marketing as duplicate content is an understandable misunderstanding. Duplicate content filters look for abuses, not legitimate multiple uses in appropriate forums.
Duplicate content filters were first employed when people began setting up exact mirror copies of a site, without variation, on multiple domain names to increase visibility. That ridiculous method worked to increase rankings until the search engines began de-listing one of the duplicate sites of those employing this technique. Usually the older domain stayed in the index and the newer mirrored site was de-listed.
At about the same time, unethical thieves began outright stealing entire sites and placing them on new domains to rank as well as the original owner for competitive phrases. Once the traffic was there, they sent it to their own product or affiliate pages. That worked for a while, but the duplicate content filters nixed that as well and protected the original site in the rankings.
Then sites began putting up "landing pages" and "doorway pages" for SEO purposes: multiple pages on one site with nearly identical text and minor keyword swaps in headlines and body copy, built to rank well for blue widgets, red widgets, purple widgets. Nothing varied but the color or brand or, in the case of travel sites, the city and resort names. So search engines extended the duplicate content filter to catch that ruse and filter it out.
Continually refining these duplicate content filters is an ongoing effort aimed only at beating search engine sp*mmers. Search engines don't set about penalizing legitimate uses of duplicate content, such as press release distribution and reproduced articles by experts on specialized topics used widely on niche sites and blogs.
There are dozens of legitimate reasons to have the same article on multiple specialty sites, and even some good reasons within a single domain. Blogging software actually creates a duplicate page for every post, which is deposited in an archive. That blog contains duplicate content until each post rolls off the bottom of the main page. AP and Reuters news stories run on hundreds of news sites. Experts, pundits and commentators within niche industries legitimately syndicate their content to appear widely across dozens of niche sites within their industry.
Many sites now put up duplicate "printer friendly" versions of pages without penalty, but it's always a good idea to post robots metatags telling search engines not to index duplicate pages on the same domain name. Printer-friendly pages and variations on landing pages used for pay-per-click (PPC) advertising should each be tagged with <meta name="robots" content="noindex,nofollow"> so you needn't worry about being penalized.
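As a minimal sketch, a printer-friendly duplicate page might carry that tag in its head section like this (the page title and content here are hypothetical, for illustration only):

```html
<!-- Hypothetical printer-friendly version of an already-indexed article page -->
<html>
<head>
  <title>Blue Widgets Explained (Printer Friendly)</title>
  <!-- Tells search engine robots not to index this duplicate page
       and not to follow the links on it -->
  <meta name="robots" content="noindex,nofollow">
</head>
<body>
  <!-- same article text as the normal, indexed version of the page -->
</body>
</html>
```

The original indexed version of the page simply omits the robots metatag, so only one copy of the content ever enters the index.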
Articles distributed for use by other sites appear on many sites surrounded by themed content, varied site navigation and differing internal links. Articles rank well when they match the theme of the site carrying them, and the better-ranked sites usually rank higher for a given article. There is currently no penalty for using articles which appear on several sites; if there were, hundreds of major industry portals would be severely penalized.
If you search for article titles in quotes, you'll see them repeated everywhere across the web. Try a search for "Blogging Chocolate Purses" and see the extensive use of that article. I first posted it on my blog and my blog post ranks just below a major search engine portal for that article title. No penalty there, Pandia.com is just better ranked overall than my blog and they are legitimately using that article with my permission.
Article marketing is something I recommend to ALL SEO clients to gain valuable one-way inbound links. How much better is an article of 700 to 1,200 words displaying your expertise than a so-called "reciprocal link" gained by begging for it via sp*mming, er, I mean, sending mass unsolicited emails to unrelated sites? (I'm stunned that anyone still uses that technique, as it seems to me the equivalent of begging for links on street corners.)
It is inconceivable that experts writing on specialized topics will ever be penalized by search engines because many niche sites reproduce their expert advice and commentary in newsletters, web sites and blogs. Search engines would face an insurmountable problem in filtering out legitimate expertise and commentary simply because it is popular and made available for use on multiple industry blogs and niche sites.
Your articles are no less valuable to the web community because they are syndicated, and that appreciation is displayed clearly when they are used extensively across multiple web sites. Write on, article marketers.
Mike Banks Valentine blogs on Search Engine developments
and can be contacted for ethical SEO work at: http://www.seoptimism.com/SEO_Contact.htm