Creating a Generic Site-to-RSS Tool


I'll show how to use regular expressions to parse a Web page's HTML text into manageable chunks of data. That data will be converted and written as an RSS feed for the whole world to consume. Finally, I'll show how to create a generic tool that enables you to automatically generate an RSS feed from any website, given a small group of parameters. At the end of the day, we'll have a working RSS feed for .netWire.


Ah, the joys of RSS. You can get the data you need as soon as it's available, with no nagging browsers or pop-ups along the way. If only all sites had RSS feeds. What would be really nice is the ability to generate an RSS feed from any site I want. For example, .netWire is a very interesting site with lots of useful information. However, the folks maintaining it haven't thought about providing the RSS feed it so sorely needs.

So I got to thinking: “All the data on the site that's important to me seems to be arranged in an orderly and predictable manner. I should be able to parse it in a fairly easy manner and make it into an RSS feed.” So I started trying. It worked out pretty well. So well that I've come up with a way to let you do your own site scraping using a generic tool, providing it with only simple rules expressed as a single regular expression.
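To preview where this is headed, the core of such a generic tool can be sketched in a few lines. The article itself targets .NET, so this Python sketch is only an illustration; the function name, parameters, and the convention of naming the groups "title" and "link" are my own assumptions, not the article's actual implementation:

```python
import re
from xml.sax.saxutils import escape

def text_to_rss(page_text, item_pattern, feed_title, feed_link):
    """Turn a page's raw text into an RSS 2.0 skeleton using one regex
    whose named groups 'title' and 'link' describe each feed item.
    (Names and structure here are an assumed convention.)"""
    items = []
    for m in re.finditer(item_pattern, page_text):
        items.append(
            "<item><title>%s</title><link>%s</link></item>"
            % (escape(m.group("title")), escape(m.group("link")))
        )
    return (
        '<?xml version="1.0"?><rss version="2.0"><channel>'
        "<title>%s</title><link>%s</link>%s</channel></rss>"
        % (escape(feed_title), escape(feed_link), "".join(items))
    )

# Invented sample input; a real run would fetch the site's HTML first.
rss = text_to_rss(
    '<a href="/a.aspx">Hello</a>',
    r'<a href="(?P<link>[^"]+)">(?P<title>[^<]+)</a>',
    "Demo feed", "http://example.com",
)
print(rss)
```

The whole "rule set" is the single `item_pattern` argument, which is exactly the simplicity the generic tool is after.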

Planning a Site Scrape

“Site scraping” means going over a site's HTML and mining it for any relevant data; all other text is discarded. For this article, I've chosen .netWire as the site I'll be scraping, since the outcome will be useful to a great many people. In planning the scrape, I'll ignore the specifics of how I actually get the text to parse and leave that topic for the end of the article.

The first thing I did was open my Web browser on the .netWire site, right-click and select “view source.” Notepad shows me the site as my future parser sees it. This raw text is the juice I need to parse in order to get the data I'm after. To be honest, it looks quite scary. How on earth am I going to come up with an easy way to parse such an enormous amount of information without losing my head? Scrolling through the text, however, I could start to see patterns in which important text, text that was relevant to me, appeared.

There were links inside paragraphs, followed by SPANs and many more attributes. It was a nightmare to parse. Just writing all the rules for finding a specific link or title for the RSS feed I wanted to create was hard enough, but I also had lots more with which to contend: I had to find text nested inside text that was itself nested inside other text. It was hardly a job for a few hours on the weekend. So the next thing I decided to check was whether I could do the job with regular expressions.

Regex: A Powerful Scraping Tool

If you don't know what regular expressions are, there are loads of articles on the subject. You need to understand them before reading how to use them for scraping a site.

Regular expressions enable us to easily extract necessary information from text. They let us, through expressions provided as plain text, recover strings that match whatever rules we define. The data we receive back after running our expressions on a string can be as complex and as detailed as we'd like. We can even divide it into groups of matched text, with names attached to them, enabling us to easily program against the regular expression (Regex) interface (see “Practical Parsing Using Groups” for more info).
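The article works with .NET's Regex class, where named groups are written `(?<name>...)`. As a minimal language-neutral sketch, Python's `re` module supports the same idea with `(?P<name>...)`; the pattern and HTML snippet here are invented for illustration:

```python
import re

# One pattern, two named groups: 'link' and 'title'.
pattern = re.compile(r'<a href="(?P<link>[^"]+)">(?P<title>[^<]+)</a>')

html = '<a href="/news/42.aspx">New article posted</a>'
match = pattern.search(html)
if match:
    # Each group is retrieved by name, not by fragile positional index.
    print(match.group("link"))   # /news/42.aspx
    print(match.group("title"))  # New article posted
```

Retrieving matches by group name is what makes programming against the Regex interface comfortable: the code stays readable even when the pattern grows.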

Since a site is ultimately represented as plain text (be it HTML, JavaScript, or anything else), we can apply regular expressions to that text as well, enabling us to search it and filter out any irrelevant information quickly and easily.
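As a hedged illustration of that filtering (the page snippet and pattern below are invented, not .netWire's actual markup), a single expression can sweep a whole page and keep only the fragments that match its rule:

```python
import re

# Invented sample of raw page text; a real page would be fetched over HTTP.
page = """
<p><a href="/news/1.aspx"><span class="hl">First headline</span></a></p>
<script>var noise = 1;</script>
<p><a href="/news/2.aspx"><span class="hl">Second headline</span></a></p>
"""

item = re.compile(
    r'<a href="(?P<link>[^"]+)">\s*<span class="hl">(?P<title>[^<]+)</span>'
)

# finditer yields only the fragments that match the rule; scripts,
# layout markup, and every other irrelevant byte are implicitly discarded.
items = [(m.group("title"), m.group("link")) for m in item.finditer(page)]
print(items)
```

Running this prints the two (title, link) pairs and nothing else, which is the essence of a site scrape.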

About the author

Roy Osherove Israel

Roy Osherove has spent the past 6+ years developing data driven applications for various companies in Israel. He's acquired several MCP titles, written a number of articles on various .NET topic...
