
Measuring the impact of Data Protection Manager 2006

Microsoft System Center Data Protection Manager 2006 optimizes disk-based backup and recovery for Windows Server 2003, but not without performance penalties. Admins will want to measure the load DPM imposes on system resources.

Microsoft System Center Data Protection Manager (DPM) 2006 is software that optimizes disk-based backup and recovery for Windows Server 2003 and related products. It shortens backup windows and provides near-instant file recovery and other sophisticated data protection services, but, in so doing, it uses resources and imposes performance penalties on protected servers.

DPM uses a combination of logging, replication and snapshots to keep track of protected data and to make copies of it at user-defined intervals. Whenever data in a protected file is changed, the changes are logged on the protected server. That information is forwarded to the Data Protection Manager server over the network on a regular schedule and stored there. The data is used to synchronize the replica of the protected file on the DPM server and, at regular intervals, a snapshot of the replica is made using Microsoft's Volume Shadow Copy Service (VSS).
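The schedule described above translates directly into network load: everything logged on the protected server since the last synchronization must cross the wire to the DPM server. As a rough back-of-the-envelope sketch (the function name and figures here are illustrative, not part of DPM itself), the average transfer per synchronization looks like this:

```python
# Sketch of the per-sync network transfer implied by DPM's schedule:
# changed bytes logged on the protected server are shipped to the
# DPM server at each synchronization. Figures are hypothetical.

def bytes_per_sync(daily_change_gb, syncs_per_day):
    """Average data moved over the network at each synchronization,
    assuming changes accrue evenly through the day."""
    return daily_change_gb * 1024**3 / syncs_per_day

# Example: 20 GB of daily change, synchronized hourly (24 syncs/day)
per_sync = bytes_per_sync(20, 24)
print(f"~{per_sync / 1024**2:.0f} MB per synchronization")
```

In practice changes rarely accrue evenly, so peak-hour transfers can run well above this average; the sketch only bounds the steady-state case.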

Obviously, this process imposes a load on system resources, including the protected server (logging), the network (synchronization) and, of course, the DPM server and associated storage. But how much of a load?

The Microsoft System Center Data Protection Manager Planning and Deployment Guide describes several techniques that can help you estimate these impacts.

The first step in calculating the load is to estimate how quickly the protected data changes each day. For a quick estimate, look at the size of your average daily incremental backup. This method is fast, but it isn't entirely accurate because of the way DPM works.

You can get a more accurate estimate by looking at the characteristics of the data you are protecting. Because DPM works at the byte level rather than the file level, the actual rate of data change may be a good deal lower than your incremental backup suggests. On the other hand, if you have a lot of small files that get overwritten during the day, your rate of change may be higher because each one of those changes will be captured by the Microsoft System Center Data Protection Manager log on the application server.

Microsoft recommends assuming a data change rate 1.5 to 2 times the rate suggested by the daily incremental backup.
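The rule of thumb above can be reduced to a one-line calculation. This sketch (the function name and sample figures are my own, not from the Planning and Deployment Guide) pads the average daily incremental backup by Microsoft's recommended 1.5x to 2x factor:

```python
# Sketch: estimate the daily data change rate from the average
# incremental backup size, padded by the 1.5-2x safety factor
# Microsoft recommends. Sample figures are illustrative.

def estimated_change_rate_gb(avg_incremental_gb, safety_factor=2.0):
    """Daily change rate DPM must capture, per Microsoft's guidance
    of 1.5 to 2 times the incremental backup size."""
    if not 1.5 <= safety_factor <= 2.0:
        raise ValueError("recommended factor is 1.5 to 2")
    return avg_incremental_gb * safety_factor

# Example: a 10 GB average daily incremental backup
low = estimated_change_rate_gb(10, 1.5)   # conservative estimate
high = estimated_change_rate_gb(10, 2.0)  # worst-case estimate
print(f"Plan for {low:.0f}-{high:.0f} GB of changed data per day")
```

Using the worst-case figure when sizing the DPM server's storage pool and the synchronization schedule leaves headroom for days when many small files are overwritten repeatedly.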


About the Author:
Rick Cook has been writing about mass storage since the days when the term meant an 80K floppy disk. The computers he learned on used ferrite cores and magnetic drums. For the last 20 years he has been a freelance writer specializing in issues related to storage and storage management.
