There are a few interesting myths and misconceptions out there about PCs and defragmentation. The first myth is that NTFS drives don't need to be defragmented. The second is that "modern" operating systems don't need to be defragmented at all. The third is that simple defragmentation on its own keeps your PC working fast.
The last one is the easiest to deal with. Recently I converted my brother's old Pentium II machine running Windows XP into a "backup server". The WDD defrag program had been run occasionally, and the drive was pretty full, so most files were not seriously fragmented.
As you can see from the "before" picture, the green files are all defragmented, but not particularly well organised. This was partly because I had just uninstalled a whole load of games and other junk from what was at one time a family PC.
Then I set to work on the drive using JkDefrag, allowing it to do its standard default defrag. Here is the result:
Given the choice between the two drive layouts, which one do you think will work more efficiently? The answer is pretty obvious. It was easy to demonstrate on the PC in question because it had little processing horsepower to begin with. By the time I had defragmented the drive, cleaned the registry with CCleaner, and then compressed it with NTREGOPT, the machine was starting to "run" instead of "walk".
Defragmentation isn't the only strategy for making a PC run faster, but it is a useful one. "Regular" defragmentation is better than "random" defragging, but how often is "regular"? It depends on your PC. If you are generating lots of files, or editing databases, a daily defrag may be required. If your files are pretty small, a monthly defrag is probably good enough.
My own rule of thumb: if the defrag takes longer than an hour you need to do it more often. If it takes less than 10 minutes, defrag less often. You don't want to waste a lot of time in order to save a little time. Do the defrag during a quiet period, such as when you're away from your desk for a meeting, or after hours. In that way you don't waste productive time.
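That rule of thumb can be sketched as a simple feedback loop. The hour and ten-minute thresholds come from the text; the halving/doubling of the interval is an illustrative assumption, not a prescribed formula.

```python
# Sketch of the defrag-scheduling rule of thumb: shorten the interval
# after a slow run, stretch it after a quick one. The 60- and 10-minute
# thresholds are from the article; the factor of 2 is an assumption.

def next_interval_days(current_days, last_run_minutes):
    """Return the number of days until the next defrag run."""
    if last_run_minutes > 60:       # over an hour: defrag more often
        return max(1, current_days // 2)
    if last_run_minutes < 10:       # under ten minutes: defrag less often
        return current_days * 2
    return current_days             # otherwise keep the current schedule

print(next_interval_days(30, 90))   # slow run: halve the interval
print(next_interval_days(30, 5))    # quick run: double the interval
```

So a monthly schedule that produces 90-minute runs would drop to roughly fortnightly, while one that finishes in five minutes would stretch to every two months.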
The other myths take longer to explain, but every drive on the planet runs the risk of fragmented files. Consider the following scenario: I install a 10GB drive on a server, and connect 2 PCs to the server. User A saves a 4GB data file to the server. The next day user B saves a 3GB file, leaving 3GB of the drive free. Now user A modifies his data file so it grows to 4.1GB. Even if he deletes the old file before saving the new one, the free space consists of a 4GB gap and a 3GB gap, unless the server moves the existing files around, which it isn't likely to do. Neither gap is big enough to hold 4.1GB, so the file will be fragmented. Now it's up to the server to sort out the fragmentation. Different operating systems will do this in different ways, but every OS has to have a solution, even Linux.
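The scenario above can be checked with a toy model of contiguous allocation. Sizes are in GB; the layout and the allocator behaviour are illustrative assumptions, not how any real filesystem works.

```python
# Toy model of the 10GB server drive from the scenario above.
# Files are (start, size) tuples in GB; free space is whatever
# is left between and after them.

def largest_gap(disk_size, files):
    """Return the biggest contiguous free gap on the disk."""
    gaps = []
    pos = 0.0
    for start, size in sorted(files):
        if start > pos:
            gaps.append(start - pos)
        pos = start + size
    if pos < disk_size:
        gaps.append(disk_size - pos)
    return max(gaps) if gaps else 0.0

DISK = 10.0

# User A has deleted his old 4GB file at the front; user B's 3GB
# file still sits where it was written, after the first 4GB.
files = [(4.0, 3.0)]

free_total = DISK - sum(size for _, size in files)
print(f"free: {free_total}GB, largest gap: {largest_gap(DISK, files)}GB")
# 7GB is free in total, but split into a 4GB gap and a 3GB gap,
# so the new 4.1GB file cannot be stored contiguously.
print("4.1GB fits contiguously?", largest_gap(DISK, files) >= 4.1)
```

The point the model makes is that total free space is not the same as usable contiguous space: 7GB free is no help when no single gap exceeds 4GB.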
1 comment:
Another common claim is that fragmentation does not affect today's 'large' high speed HDDs. The fact is they too get fragmented, fast or slowly depending upon usage. A fragmented drive naturally takes more time to seek the file fragments, which explains the slowdowns, lags or even freezes in the middle of programs. If it's not addressed, it can very well affect the productivity of systems.