After my three-part discussion on fragmentation and disk defragmentation appeared on this site, the emails from site members came pouring in. Most of them agreed with my conclusions... up to a point. Others politely (and some not so politely) disagreed with specific points I made.

A vast wealth of material resulted from these discussions, so I created this fourth installment in the series to address the key issues readers brought up, and to refine (and revise) some of my earlier findings. I still feel that most of my original points are valid, but it helps to see them in a larger context.

1. Workstations and servers have radically different disk defragmentation needs.

Several readers pointed this out to me, and it makes sense. Some servers—especially database servers and file servers—allocate and re-allocate space much more aggressively than a desktop system. My original one-size-fits-all discussion didn't take into account the fact that a highly trafficked file server will have a markedly different fragmentation profile than a desktop PC.

For most machines with very busy file systems, a third-party defragmenter that works progressively in the background or at scheduled intervals (for instance, during off-peak hours) will be a boon. Aside from the major defrag applications, I've looked into solutions such as programmer Leroy Dissinger's defrag utility Buzzsaw. This tool monitors one or more hard disk drives and defragments individual files whenever disk and CPU usage drop below a certain point. I've used it on a server that runs a database-supported Web site and have gotten very good results with it.
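To give a flavor of how this kind of opportunistic, background defragmentation works in principle, here is a minimal Python sketch. It is not how Buzzsaw itself is implemented; it simply polls CPU and disk activity (via the third-party psutil module) and hands files to a placeholder defragment_file() routine only when the machine looks idle. The threshold values and the defragment_file() helper are assumptions for illustration only; a real tool would call the Windows defragmentation APIs.

    # Conceptual sketch of an "opportunistic" defragmenter loop, in the spirit of
    # tools like Buzzsaw. Requires the third-party psutil package. The
    # defragment_file() helper is a hypothetical placeholder; a real implementation
    # would relocate file extents through the Windows defragmentation ioctls.

    import time
    import psutil

    CPU_IDLE_THRESHOLD = 10.0          # percent; assumed value for illustration
    DISK_IDLE_THRESHOLD = 512 * 1024   # bytes/sec of disk traffic; assumed value

    def system_is_idle(sample_seconds=5):
        """Return True if CPU and disk activity are both below the thresholds."""
        before = psutil.disk_io_counters()
        cpu = psutil.cpu_percent(interval=sample_seconds)   # blocks while sampling
        after = psutil.disk_io_counters()
        bytes_per_sec = (after.read_bytes + after.write_bytes
                         - before.read_bytes - before.write_bytes) / sample_seconds
        return cpu < CPU_IDLE_THRESHOLD and bytes_per_sec < DISK_IDLE_THRESHOLD

    def defragment_file(path):
        """Placeholder: a real tool would consolidate this file's extents here."""
        print(f"Would defragment {path}")

    def run(fragmented_files):
        """Work through a queue of fragmented files, one at a time, when idle."""
        while fragmented_files:
            if system_is_idle():
                defragment_file(fragmented_files.pop(0))
            else:
                time.sleep(30)   # system is busy; back off and try again later

    if __name__ == "__main__":
        run([r"C:\data\catalog.mdb"])   # hypothetical example file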

2. NTFS has measures to alleviate fragmentation, but they're far from perfect; disk defragmentation is still needed.

There's no question that fragmentation occurs on NTFS volumes; the measures NTFS takes to alleviate it help, but they are far from perfect, so it's still good to defragment an NTFS volume periodically. That said, defragmenting makes sense only when it doesn't come at the expense of existing system performance: say, once a week for a workstation, and more often for highly trafficked servers (on a schedule that keeps the defrag runs in off-peak hours).

One problem that some readers pointed out is that NTFS doesn't guard well against free-space fragmentation. In fact, one person argued that the way NTFS allocates free space can actually make things worse.

But there's no consensus on what to do about it. One approach is to move all available free space into one contiguous block whenever possible. Unfortunately, this approach is defeated instantly by the way NTFS allocates free space.

A newly created file gets placed in the first available block of free space on an NTFS partition. That free-space block needs to be big enough to hold both the file and some overage, since the file will likely change size in the future. But once the file is written and closed, any space remaining in that block is returned to the drive's pool of available free space. If multiple files get written in this fashion, a single contiguous block of free space can get turned into a series of fragmented free-space blocks.
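To see how this plays out, here is a toy Python simulation of a first-fit allocator that behaves as described above: each file is placed in the first free block large enough to hold it plus some growth headroom, and the unused headroom is handed back when the file is closed. It is a deliberate simplification, not the actual NTFS allocation algorithm, and the block sizes and the 50% overage figure are arbitrary assumptions.

    # Toy first-fit allocator that mimics the behavior described above: each new
    # file is placed in the first free block big enough to hold it plus some
    # growth headroom, and the unused headroom is returned to the free pool when
    # the file is closed. A simplification for illustration, not the real NTFS
    # allocation algorithm.

    def write_files(free_blocks, file_sizes, overage=0.5):
        """free_blocks: list of (offset, length) tuples; sizes in arbitrary units."""
        leftovers = []
        for size in file_sizes:
            want = int(size * (1 + overage))          # file plus growth headroom
            for i, (offset, length) in enumerate(free_blocks):
                if length >= want:
                    # The file occupies the front of the block...
                    free_blocks[i] = (offset + want, length - want)
                    # ...and once it is closed, the unused headroom goes back to
                    # the pool as a *separate* small free fragment.
                    leftovers.append((offset + size, want - size))
                    break
        return sorted(b for b in free_blocks + leftovers if b[1] > 0)

    # One contiguous 1000-unit block of free space ends up with small stranded
    # free fragments after just three 100-unit files are written:
    print(write_files([(0, 1000)], [100, 100, 100]))
    # -> [(100, 50), (250, 50), (400, 50), (450, 550)]
    # (a real file system would coalesce the last two adjacent blocks, but the
    # small fragments stranded between files are exactly the problem described)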

Given that, it doesn't make much sense to aggressively defragment all free space into one big pool. Better to make it available in a number of reasonably sized (perhaps 64MB or so) pools. Microsoft concurs on this point in its documentation for the defrag tool in Windows 2000, saying that the effort involved in pushing all the free space together would negate any possible performance benefit.

I would argue that fragmented free space really becomes critical only when free space on a hard disk drive becomes extremely low—i.e., when the only space available is badly fragmented free space, and the system is forced to create new files in a highly fragmented fashion. But on a large enough drive, where the free space isn't allowed to go below 30%, this should almost never be an issue. There may still be fragmentation of free space, but large enough blocks of free space will almost certainly always exist somewhere on the drive to ensure that files can be moved or newly allocated without trouble.
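If you want to keep an eye on that cushion, a couple of lines of Python (standard library only) will report the free-space percentage for a volume. The 30% threshold and the C: drive here are simply the values from the discussion above, not hard rules.

    # Quick check of the free-space percentage discussed above, using only the
    # Python standard library. The 30% threshold and the C: drive are examples.
    import shutil

    def free_space_percent(path="C:\\"):
        usage = shutil.disk_usage(path)
        return 100.0 * usage.free / usage.total

    pct = free_space_percent()
    print(f"Free space: {pct:.1f}%")
    if pct < 30.0:
        print("Below the 30% cushion; free-space fragmentation is more likely to hurt.")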

I mention the 64MB figure as an adjunct to something I saw in the defrag utility now bundled with Windows Vista. By default, this utility will only attempt to consolidate fragments smaller than 64MB. My guess is that a fragment larger than 64MB is not going to impose as much of an overhead. Even if you have a very large file (a gig or more) broken into 64MB fragments, it won't matter as much because you're never reading more than a certain amount from the file at any given time. That is, unless you're accessing several such files at once, in which case the impact of fragmentation will take a back seat to the mere fact that you're reading multiple physical files from different parts of the disk.
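To put a rough number on that intuition, here is a quick back-of-the-envelope calculation in Python. The seek time and transfer rate are assumed figures for a desktop hard drive of this era, not measurements, so treat the result as an order-of-magnitude estimate.

    # Back-of-the-envelope estimate of what 64MB fragments cost on a 1GB file.
    # The seek time and transfer rate below are rough, assumed figures for a
    # conventional hard drive; substitute your own measurements.

    FILE_SIZE_MB = 1024          # a "gig or more" file
    FRAGMENT_MB = 64             # Vista's default consolidation cutoff
    SEEK_MS = 10.0               # assumed average seek plus rotational latency
    TRANSFER_MB_PER_S = 60.0     # assumed sequential throughput

    fragments = FILE_SIZE_MB / FRAGMENT_MB                 # 16 fragments
    extra_seeks = fragments - 1                            # 15 extra head movements
    seek_penalty_s = extra_seeks * SEEK_MS / 1000.0        # about 0.15 s
    sequential_read_s = FILE_SIZE_MB / TRANSFER_MB_PER_S   # about 17 s

    print(f"Extra seek time: {seek_penalty_s:.2f} s "
          f"on a {sequential_read_s:.1f} s sequential read "
          f"({100 * seek_penalty_s / sequential_read_s:.1f}% overhead)")

On those assumptions, the 15 extra seeks add roughly 0.15 seconds to a read that already takes about 17 seconds, a penalty in the neighborhood of one percent.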

3. Third-party disk defragmentation programs have more robust feature sets than the native Windows defragmenter.

This is undeniable. The native Windows defragmenting tool has only a fraction of the features offered by many commercial defragmentation products, such as the ability to defragment system files after a reboot or to defragment the master file table (MFT) space.

But do these additional features justify themselves? That depends largely on how much the file system needs the kinds of defragmentation that only a more advanced defragger can provide. For instance, on a file system where the MFT isn't heavily trafficked (i.e., there aren't hundreds of thousands of files being created and deleted), the MFT is unlikely to become significantly fragmented in the first place, so defragmenting it won't buy you much. On an extremely busy server, it would be more useful; on a workstation with a lighter file-creation load, less so.

While the MFT zone can be defragmented offline, it cannot be shrunk or resized, and no third-party tool can do this. The only way to resize the MFT if it's been expanded is to copy all the files off a volume, format it, and copy them back on again. On the other hand, NTFS re-uses space within the MFT itself if there's no other free space to be had, and the MFT should almost never grow to be a sizable percentage of the drive's space. A tool like NTFSInfo from Sysinternals can give you details about the MFT zone on a given drive. If for some reason it has grown to an abnormally large percentage of the drive's space, that might be a sign of something else being wrong.

There are other issues that I researched but came to no firm conclusions on. One, which is tangential to defragmenting free space, is the file placement issue. Windows XP, by default, analyzes file usage every three days and tries to optimize the on-disk placement of the most commonly used files in the system. Some defragmenters (for instance, PerfectDisk from Raxco Software) work with this information to further optimize file access patterns, but it's not clear whether this really produces a lasting, quantifiable performance improvement.

(Incidentally, defragmenting the page file or Registry can be done without having to buy a separate application. For instance, the free tool PageDefrag is perfect for this sort of work.)

In conclusion, I want to emphasize and clarify three things that might have gotten lost in my discussion.

Fragmentation still exists and is a problem. I was not dismissing its impact wholesale. But that impact has been alleviated by advances in hard disk drive technology, operating system design and file system design, and it will continue to be reduced (but not eliminated) by further improvements in all of the above.

It's still a good idea to defragment regularly, but there's little point in doing it obsessively when the real-world benefits might not be measurable in any reliable way. More than once a week for a workstation seems to cross the point of diminishing returns (although there are exceptions, which I'll go into). But the investment of time and system resources required to defrag once a day doesn't pay off except in the most incremental and difficult-to-assess fashion. (One exception to this would be programs that defragment progressively and "silently," like the aforementioned Buzzsaw, which usually run when the system is idle.)

You should balance the act of defragmenting against other ameliorative actions that could be taken, such as buying a larger or faster hard disk drive or adding more memory. Drives are cheaper and larger than ever. Memory is cheaper than ever, too. Adding more memory or upgrading to a faster, higher-capacity hard disk drive will almost always yield a better performance improvement than anything you can do through software.


Disk Defragmentation Fast Guide

 Introduction
 Disk defragmentation: Performance-sapper or best practice?
 New hard disk drives reduce need for disk defragmentation
 Four steps to lessen the effect of fragmentation
 Flash memory drive defragmentation: Does it make sense?
 Three disk defragmentation issues defined

About the author: Serdar Yegulalp is editor of the Windows Power Users Newsletter, which is devoted to hints, tips, tricks, news and goodies for Windows NT, Windows 2000 and Windows XP users and administrators. He has more than 10 years of Windows experience under his belt, and contributes regularly to SearchWinComputing.com, SearchWindowsSecurity.com and SearchSQLServer.com.

This was first published in November 2006
