Keep in mind, a DataReader's increase in speed also means it has reduced functionality, but if you don't need certain features you get from a DataSet, then there's no loss.
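To illustrate the trade-off (this is a sketch in Python using sqlite3, standing in for ADO.NET's SqlDataReader and DataSet; the table and data are made up for the example): a reader streams rows forward-only, one at a time, while a dataset loads everything into memory so you can revisit and edit it.

```python
import sqlite3

# Hypothetical sample data in an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany("INSERT INTO pages (body) VALUES (?)",
                 [("<html>page %d</html>" % i,) for i in range(3)])

# DataReader-style: forward-only cursor, one row held at a time.
# Fast and light, but you can't scroll back or edit rows in place.
for (body,) in conn.execute("SELECT body FROM pages ORDER BY id"):
    print(len(body))

# DataSet-style: pull the whole result into memory up front.
# Heavier, but the rows stay around to sort, re-read, or modify.
rows = conn.execute("SELECT body FROM pages ORDER BY id").fetchall()
print("total rows:", len(rows))
```

If all you do is walk the results once and render them, the streaming style is all you need; reach for the in-memory version only when you actually use its extra features.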
As far as speed, fastest to slowest:
1. ram
2. database server (even if it's running on a separate box)
3. xml file
Keep in mind, storing an XML file and reading it into ram is slower than querying the database, because you have to do #3 and then do #1. I once benchmarked a system I wrote (before .NET) that would read .HTML templates off the harddrive, and when I put the HTML text inside a database table, I could read the database 8x faster than a .HTML file off the HD. Of course, 8x faster than microscopic is still microscopic. I didn't use the "template in a database" code because it was seriously more complex to alter a template inside a column vs simply editing a .HTML file on the drive. It would only be an acceptable trade-off when you're dealing with millions of hits (Amazon, eBay).
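If you want to run that kind of comparison yourself, here's a minimal benchmark harness (a Python/sqlite3 sketch; the template content, table name, and iteration count are made up, and your numbers will differ from the 8x figure above depending on OS caching and your database):

```python
import os
import sqlite3
import tempfile
import timeit

template = "<html><body>Hello</body></html>" * 100

# Same template stored two ways: as a file on disk...
fd, path = tempfile.mkstemp(suffix=".html")
with os.fdopen(fd, "w") as f:
    f.write(template)

# ...and as a column in a database table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE templates (name TEXT PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO templates VALUES (?, ?)", ("home", template))

def from_disk():
    with open(path) as f:
        return f.read()

def from_db():
    return conn.execute(
        "SELECT body FROM templates WHERE name = ?", ("home",)).fetchone()[0]

# Sanity check: both paths return the identical template text.
disk_result = from_disk()
db_result = from_db()

# Time each approach over many reads.
disk_t = timeit.timeit(from_disk, number=1000)
db_t = timeit.timeit(from_db, number=1000)
print(f"disk: {disk_t:.4f}s  db: {db_t:.4f}s")

os.remove(path)
```

Either way, both paths are microscopic per read; the point is to measure before adding complexity, not to assume the database wins.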
As a beginner, don't always go the fastest route; keep things good and flexible, and don't forget the KISS rule.
Also, if you read into Yahoo's research, only 5% of the total time it takes a webpage to draw is spent on your server code; the other 95% is spent on parsing your HTML/JS/CSS, so spend your time outputting great HTML if you want speed improvements. Explaining that is far beyond the scope of this thread.