As RSS feeds become more and more popular, many webmasters remain clueless about their role in a search engine marketing strategy. This article discusses two use cases of RSS feeds: publishing syndicated content in the context of search engines filtering duplicate content, and a creative but labour-intensive way to build a stable and targeted stream of traffic using RSS feeds.


Index

Can Syndicated RSS Content Improve Search Engine Rankings?

Search engine crawlers tend to fall in love with frequently updated pages, but rotating foreign content is a double-edged sword when it comes to search engine rankings

Duplicate Content Filtering and Search Engine Penalties

What is a duplicate content penalty and how does a duplicate content filter work?

Do SEs Apply 'Duplicate Content Penalties' to Web Sites Publishing Syndicated RSS Feeds?

Syndicating RSS feeds is a great way to feed a web site with fresh content, but is it safe with regard to search engines filtering duplicate content?

How to Use RSS Feeds to Build Stable and Ever-growing Traffic Streams

A long term strategy to build diverse and stable traffic streams using high quality RSS feeds



Can Syndicated RSS Content Improve Search Engine Rankings?


Will fresh content pulled from RSS feeds improve your search engine ranking?

1. No, at least not significantly. RSS content is duplicate content by definition: every text snippet imported from an RSS feed can be found on other web sites too, and chances are the other sites carrying the same RSS content will outrank you. Optimizing on-page factors is pretty tricky when the page is dynamic, pulling its content from RSS feeds.

However, you will see some traffic from search engine users searching for keywords or quoted phrases found in the RSS feeds you've syndicated. Most RSS feeds phase out older content, and unless you cache the feeds forever, your page reflects these changes. So a (scanning) search engine user landing on your page will most probably not find the content s/he is interested in; you'll find those visits summarized in the '0s-30s' column of your 'visits duration' stats.

Dilution of keyword density is another factor you must take into account. Simplified, search engines determine a page's topic by its keywords (along with other on-the-page factors) and weighted inbound links (off-the-page factors). If you don't filter the content pulled from various RSS feeds to keep a page's keyword density balanced, your pages lose their initial topic sooner or later, which renders the effort put into optimizing off-the-page factors obsolete. The sketch below this list illustrates the dilution effect.

2. Yes, but indirectly and not to a great degree. If your site provides an outstanding surfing experience to your visitors, and fresh on-topic news pulled from carefully selected RSS feeds does improve a site's value, some of your recurring visitors will link to your site. Non-reciprocal inbound links do help with search engine ranking. A little-known fact: visitors bookmarking your pages can improve search engine rankings too (example).
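To make the dilution mentioned under point 1 concrete, here is a minimal sketch using a crude keyword 'density' measure on made-up text; real search engines weigh on-page factors in far more elaborate (and unpublished) ways:

```python
# Illustrative only: a crude keyword 'density' measure on made-up text.
import re

def keyword_density(text, phrase):
    """Occurrences of `phrase` per 100 words of `text`."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = " ".join(words).count(phrase.lower())
    return 100.0 * hits * len(phrase.split()) / len(words)

page = "rss feeds and search engine rankings " * 40    # on-topic copy
noise = "unrelated syndicated news of the day " * 160  # imported feed text

print(keyword_density(page, "rss feeds"))          # ~33.3 on the page alone
print(keyword_density(page + noise, "rss feeds"))  # ~6.7 after mixing in feeds
```

Four parts of off-topic feed text to one part of on-topic copy cut the measure to a fifth; that's the topical drift described above.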



Duplicate Content Filtering and Search Engine Penalties


What is a duplicate content penalty?

Hardcore spammers make use of scripts to produce zillions of fraudulent doorway pages made up of slight variations of sentences and text snippets pulled from a database. Each of these doorway pages is optimized for a particular keyword phrase, but the text content visible to search engine users is duplicated over and over. Search engines consider this approach, and similar fraudulent techniques designed to trick users into viewing useless pages, spam, and they penalize the spammers by banning their domains.

Although only unethical webmasters have to live in fear of duplicate content penalties, the term is often used as a synonym for duplicate content filtering. Filtering duplicate content stands for a set of methods search engines use to optimize their search results in the best interest of their users.

What is a duplicate content filter and how does it work?

When a search engine crawler fetches a page and finds that it's an exact duplicate of another page in the search engine's index, it updates the 'lastCrawled' attribute if the URL matches and moves on; if the URL doesn't match, the fetched page gets trashed. When comparing pages, the crawler uses a heuristic method for performance reasons, so sometimes very similar, but not identical, pages may be discarded 'by mistake'.
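The heuristics the engines actually run are not public, but a minimal sketch of one textbook technique, shingling, illustrates why near-identical pages can collide (this is an illustration under that assumption, not any engine's real algorithm):

```python
# Illustrative only: shingle-based near-duplicate detection.

def shingles(text, k=5):
    """The set of all k-word windows found in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b):
    """Jaccard overlap of two pages' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two reprints of one press release, each with only a little unique
# wording around it, score close to 1.0 and risk being folded into
# one document 'by mistake'.
```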

Otherwise the crawler puts the fetched page into a queue for indexing. During the indexing process, highly sophisticated algorithms extract the text content from templates, navigation and advertising. Now the search engine has two versions of the page: a 'full' version and a 'core text' version. Further calculations are applied to both versions, and under some circumstances the indexing process may discard the page.

If the page makes it into the search engine's index, real-time filters are applied to avoid duplicates in the context of the user's search query. Say a user searches for a press release: it makes no sense to deliver all its reprints on the SERP. Nevertheless, it happens all the time that a search engine puts reprints of an article or press release from different places on the SERP. By analyzing these obvious duplicates, you can find out how much surrounding text and how different a navigation it takes to prevent a page from being caught in a duplicate content filter.


This description of procedures used by search engines to filter duplicate content is extremely simplified.



Do SEs Apply 'Duplicate Content Penalties' to Web Sites Publishing Syndicated RSS Feeds?


Is it safe to publish RSS content?

Yes. You can even create a separate page per feed in addition to (aggregated) headline syndication on index pages. That's good for your users (readability, if you use a larger font) and does not trigger a search engine penalty. As long as you syndicate related RSS feeds which are of interest to your users, you can pretty much do what you want.

On the other hand, if you run a large network of such pages scraping content from other sources, plastered with ads and carrying no other unique content, search engines will probably do exciting things with your stuff in their index.

There is no 'duplicate content penalty' applied to 'clean' web sites. It's just that search engines filtering duplicates guess at the source and deliver that page on the SERPs. Unfortunately, sometimes the guess is weird, but that's life.

It is always a good idea to filter themes (for example by the category tag) out of several (cached) RSS feeds, creating one-topic pages which hold the audience much longer than generic pages.
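As a minimal sketch of such theme filtering, assuming the third-party Python library feedparser and placeholder feed URLs, one could keep only the entries whose category matches the page's topic:

```python
import feedparser  # third-party library: pip install feedparser

FEEDS = [  # placeholder URLs for the (cached) feeds you syndicate
    "http://example.com/feed-a.xml",
    "http://example.com/feed-b.xml",
]

def items_for_topic(topic):
    """Yield (title, link) for entries tagged with the page's topic."""
    for url in FEEDS:
        feed = feedparser.parse(url)  # in production, parse your cached copy
        for entry in feed.entries:
            tags = [t.get("term", "") for t in entry.get("tags", [])]
            if topic.lower() in (t.lower() for t in tags):
                yield entry.get("title", ""), entry.get("link", "")

for title, link in items_for_topic("search engines"):
    print(title, link)
```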

Carefully selected RSS feeds, properly used as off-site content relevant to your site or a section of your site, are safe with search engines and most probably a great service for your visitors. What's good for users usually doesn't trigger penalties. However, it's not the method of first choice to improve indexing or rankings on the SERPs.



How to Use RSS Feeds to Build Stable and Ever-growing Traffic Streams


Write summaries of your new articles and serve them in an RSS feed with links to the articles. Ensure you use different wording; don't copy text snippets from your site, to preserve the uniqueness of your content. Do not create full-text RSS feeds, protect your content instead. It does not help you when you spread your unique text content all over the Internet, but it helps others when you thoughtlessly give away your stuff for free.
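A summaries-only feed is trivial to generate; here is a minimal sketch using only Python's standard library, with placeholder titles and URLs:

```python
# Minimal sketch of a summaries-only RSS 2.0 feed (placeholder data).
import xml.etree.ElementTree as ET

articles = [  # hand-written summaries, not snippets copied from the pages
    ("How RSS Feeds Affect Rankings",
     "http://example.com/articles/rss-rankings",
     "A short, freshly worded summary linking to the full article."),
]

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Site: New Articles"
ET.SubElement(channel, "link").text = "http://example.com/"
ET.SubElement(channel, "description").text = "Summaries of new articles"

for title, link, summary in articles:
    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "link").text = link
    ET.SubElement(item, "description").text = summary  # summary only, never full text

ET.ElementTree(rss).write("feed.xml", encoding="utf-8", xml_declaration=True)
```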

Validate your RSS feed before you submit it. It helps to provide a well-maintained RSS feed; frequent updates make it more attractive for potential subscribers and for editors of RSS resource sites as well.
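A validator such as feedvalidator.org checks the feed against the full specification; as a quick pre-flight step, a plain well-formedness test with Python's standard library already catches the most common breakage, such as unescaped characters producing invalid XML:

```python
# Quick well-formedness check before submitting the feed.
import xml.etree.ElementTree as ET

try:
    root = ET.parse("feed.xml").getroot()
    assert root.tag == "rss", "root element should be <rss> for RSS 2.0"
    print("feed.xml is well-formed XML")
except ET.ParseError as exc:
    print("feed.xml is broken:", exc)
```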

Go out and search for related web sites which make use of foreign RSS feeds. Email the webmasters and offer your feed. If you have high quality content, they will probably include your feed. Even headline syndication is a great way to build stable and targeted traffic streams.

You should submit your feed to every RSS resource site out there too. Start with this list of RSS feed submit pages, then surf on. Ensure Yahoo! is aware of your RSS feed by adding it to your My Yahoo! home page, and make use of Yahoo!'s add-feed button on your pages.

Then be patient and keep up the content quality of your feed. After a few weeks you'll get the first deep inbound links along with a little traffic. A steadily growing number of deep inbound links improves your search engine rankings to a great degree. Your content pages, ranking well for lots of two- and three-word keyword phrases on the SERPs of all major search engines, will bring in more and better targeted traffic than the home page links many webmasters still prefer.

That's just one of many methods that can be part of a long-term strategy to build diverse and stable traffic streams. Don't rely on this method alone; don't put all your eggs in one basket.



Author: Sebastian
Last Update: Monday, June 27, 2005
