Thursday, November 22, 2007

Benchmarking Data Files

It has been relatively easy to measure the load times of system and application files using my "Prefetch File Processor" program, and the initial results are quite interesting, especially the performance of WDD itself.
But I get the feeling that I'm only measuring part of the total user experience. After all, we use applications to do work, and the work component isn't being measured at the moment. The data types that I'd like to benchmark are listed below, with a rough sketch of a test-data generator after the list:
  • Podcast files: these are usually MP3 files, with several being downloaded at once, and each feed stored in a different folder, like iTunes;
  • Email folders: these grow by small amounts each day;
  • Office documents: these are stored in multiple folders, and not all of them are current;
  • Pictures/music: varying sizes, and many files in a single folder;
  • Folders: a multi-level folder structure with varying numbers of files in each folder; and
  • Database files: grow with use, and need to be repaired and compacted from time to time.
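To make the runs repeatable, I'll need something that lays these categories out on disk before each test. Below is a rough sketch in Python of what such a generator could look like; the folder names, file counts and sizes are placeholder guesses on my part, not measurements of any real workload:

    import os
    import random

    def write_random_file(path, size_bytes, chunk=1024 * 1024):
        # Fill a file of roughly the given size with random bytes, one chunk at a time.
        with open(path, "wb") as f:
            remaining = size_bytes
            while remaining > 0:
                block = min(chunk, remaining)
                f.write(os.urandom(block))
                remaining -= block

    def generate_test_data(root):
        # Podcast feeds: a few MP3-sized files per feed, each feed in its own folder.
        for feed in range(3):
            feed_dir = os.path.join(root, "Podcasts", "Feed%02d" % feed)
            os.makedirs(feed_dir, exist_ok=True)
            for episode in range(5):
                write_random_file(os.path.join(feed_dir, "episode%02d.mp3" % episode),
                                  20 * 1024 * 1024)          # ~20 MB per episode

        # Email folder: one mailbox file that a later step will grow in small increments.
        mail_dir = os.path.join(root, "Mail")
        os.makedirs(mail_dir, exist_ok=True)
        write_random_file(os.path.join(mail_dir, "Inbox.dbx"), 50 * 1024 * 1024)

        # Office documents: smallish files scattered across several folders.
        for project in range(4):
            doc_dir = os.path.join(root, "Documents", "Project%02d" % project)
            os.makedirs(doc_dir, exist_ok=True)
            for doc in range(10):
                write_random_file(os.path.join(doc_dir, "report%02d.doc" % doc),
                                  random.randint(50, 500) * 1024)

        # Pictures/music: many files of varying size in a single folder.
        media_dir = os.path.join(root, "Media")
        os.makedirs(media_dir, exist_ok=True)
        for track in range(200):
            write_random_file(os.path.join(media_dir, "track%03d.mp3" % track),
                              random.randint(3, 8) * 1024 * 1024)

        # Database file: one large file to be grown and compacted in later runs.
        db_dir = os.path.join(root, "Database")
        os.makedirs(db_dir, exist_ok=True)
        write_random_file(os.path.join(db_dir, "store.mdb"), 100 * 1024 * 1024)

    if __name__ == "__main__":
        generate_test_data(r"C:\BenchmarkData")

The multi-level folder case is just the Documents layout nested a few levels deeper, and growing the mailbox and database files over time would be a separate step along the lines of the churn idea below.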
What other file categories have I missed? Should I write programs that constantly edit the files and keep running while the defrag program is trying its best, or a single program that generates or modifies the files and then stops to give the defrag program a chance? Your thoughts are welcome, and I'll start developing something when my workload subsides a bit further.
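For the "keep editing while the defrag program works" option, the simplest thing would probably be a background script that appends a small chunk to a random file every couple of seconds. Again, only a sketch, with made-up paths and timings:

    import os
    import random
    import time

    def churn(folder, interval_seconds=2, duration_seconds=600):
        # Keep appending small chunks to existing files so the data keeps
        # changing underneath the defrag program while it runs.
        files = [os.path.join(folder, name) for name in os.listdir(folder)]
        end = time.time() + duration_seconds
        while time.time() < end:
            target = random.choice(files)
            with open(target, "ab") as f:
                f.write(os.urandom(64 * 1024))   # roughly the size of a new email
            time.sleep(interval_seconds)

    if __name__ == "__main__":
        churn(r"C:\BenchmarkData\Mail")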
Update: after a really useful chat with the guys on the Wilders Security Forums, I have decided to use the BootVis trace utility in the next round of testing.
