Sunday, September 30, 2007

The Great Defrag Shootout: Testing Methods

My efforts at testing all those defrag utilities got mentioned in the Code Project forums. One question I found quite interesting:
John Cardinal wrote: "I am completely mystified as to how he can compare performance of defrag utilities equally. I was surprised to see that aspect listed and as far as I can see he doesn't give any methodology of testing."
So I responded to explain. The full response is copied here:

John raises a question that every programmer should ask, and it is a valid one. I don't have full testing facilities, so I made no attempt to test whether program A's defrag resulted in faster hard drive performance than program B's, but my testing method ended up being quite thorough anyway. Allow me to explain.
I'm a database programmer, using Access 97 and Microsoft SQL Server 2000. In a typical week I copy a complete SQL backup file (6GB, "SAClinic.dbk") from the production server to my laptop and save it in a compressed folder on drive D:. I then attempt to defragment it so that I can do a SQL Database Restore from that backup file.
The SQL Database is stored as c:\sql\mssql\data\SAClinic_Data.mdf and a corresponding log file. Again, because these files are large, they get stored on my laptop drive as compressed files.
Compressed files of this size fragment easily, often into a gazillion fragments, and most of the defrag programs choked on either the number of fragments or the limited free disk space remaining. I mentioned such problems in each review in "The Great Defrag Shootout". I also used contig.exe to analyse files or folders and report how many fragments they contained.
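For anyone who hasn't used Contig, the analysis side of it is a one-liner at the command prompt, something along these lines (the backup path here is just an example; the second path is where the restored data file actually lives):

    contig -a "D:\Backups\SAClinic.dbk"
    contig -a c:\sql\mssql\data\SAClinic_Data.mdf

The -a switch only analyses and reports the fragment count without moving anything, and adding -s makes it recurse into subdirectories, so you can check a whole folder of files with something like contig -a -s *.* run from inside that folder.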
The SQL Database Restore operation doesn't work properly if the files are too fragmented, so I was at the mercy of this software (and the defrag program being tested) whenever I needed to perform this operation.
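For the record, the restore itself is nothing exotic: it's a standard RESTORE DATABASE statement run from Query Analyzer or osql, roughly like this (the backup path and the logical file names shown are illustrative rather than my exact ones; RESTORE FILELISTONLY against the backup lists the real names):

    osql -E -Q "RESTORE DATABASE SAClinic FROM DISK = 'D:\Backups\SAClinic.dbk' WITH MOVE 'SAClinic_Data' TO 'c:\sql\mssql\data\SAClinic_Data.mdf', MOVE 'SAClinic_Log' TO 'c:\sql\mssql\data\SAClinic_Log.ldf'"

The -E switch uses a trusted connection and -Q runs the statement and exits; it is the fragmented state of the files, not the command itself, that makes this step fall over.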
In addition, I download and edit audio books and podcasts, so my "Audio Books" folder would get fragmented over the course of a few days. Again, it was up to the defrag program being tested to fix up this mess.
Finally, I keep all the source code for my main programming project in a 4GB encrypted volume maintained by TrueCrypt, and so was able to determine if the defrag program being tested could recognise and defrag this volume as well.
I was quite shocked at how many commercial defrag utilities were unable to cope with the fragmented files on my hard drive. Diskeeper, the most expensive utility tested, failed miserably and took over 20 minutes just to analyse the drive. Incidentally, it was another shortcoming of Diskeeper (its inability to deal with drives that get too full) that led me to look for a better defrag program in the first place.
So I wasn't comparing the "performance" of defrag utilities in the conventional sense; it was more a question of "usability" (was it easy to set up and use?) and "capability" (could it actually do the job?). I only gave a "thumbs up" to the utilities that managed to keep my laptop drive defragmented during the course of my normal working week.
During the course of the testing, some utilities were uninstalled and stayed that way; others were kept because they proved useful. My laptop now has five programs installed: the built-in Windows Disk Defragmenter, Sysinternals Contig (I still use it on those big SQL data files from time to time), Sysinternals PageDefrag (boot-time defrag of system files), JkDefrag and PerfectDisk 8.
PerfectDisk is my "program of last resort" and I call on it to sort out fragmented metadata and other tricky data. It has never let me down. I use the JkDefrag screen saver to keep my drive neat and tidy. That's how these two ended up being the "winners". I kept them installed because they are the most useful of the lot.
I hope this clarifies the method I used. I don't claim to be a professional tester, just a user with some very demanding defrag requirements.
