Does deduplication change the way we back up and restore data?
Data deduplication reduces the amount of storage space required for backups, which can benefit data protection by allowing faster and more frequent backups, faster restorations and potentially longer retention periods, within the limits of regulatory compliance requirements and corporate policy.
Deduplication can affect the backup application itself, depending on that application's approach. Backup tools that operate on block storage should work as is, because deduplicated data is preserved on the target storage device. By comparison, backup tools based on file storage generally "undo" the deduplication -- requiring significantly more space on the target storage device -- unless the tool specifically supports Windows Server 2012 R2 data deduplication. Windows Server Backup, for example, fully supports deduplication, and IT administrators can restore a complete volume or individual folders from the backup.
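The block-level vs. file-level distinction above can be sketched with a toy chunk store. This is a conceptual illustration only, assuming fixed-size chunking and SHA-256 hashes; real deduplication engines use variable-size chunks and their own metadata formats. A block-level backup would copy the compact chunk store, while a file-level backup reads each file through the file system and gets the rehydrated, full-size data:

```python
import hashlib

CHUNK_SIZE = 4  # tiny chunks for illustration; real systems use far larger chunks


def dedup_store(files):
    """Split each file into fixed-size chunks and keep one copy per unique chunk."""
    chunks = {}     # chunk hash -> chunk bytes (the "chunk store")
    manifests = {}  # filename -> ordered list of chunk hashes
    for name, data in files.items():
        hashes = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            h = hashlib.sha256(chunk).hexdigest()
            chunks.setdefault(h, chunk)  # duplicate chunks are stored only once
            hashes.append(h)
        manifests[name] = hashes
    return chunks, manifests


def rehydrate(chunks, manifests, name):
    """Read a file back through the manifest: chunks reassemble to full size."""
    return b"".join(chunks[h] for h in manifests[name])


files = {"a.txt": b"ABCDABCD", "b.txt": b"ABCDEFGH"}
chunks, manifests = dedup_store(files)

stored = sum(len(c) for c in chunks.values())   # what a block-level backup copies
logical = sum(len(d) for d in files.values())   # what a file-level backup reads
```

Here the two files share the chunk "ABCD", so the chunk store holds 8 bytes while the rehydrated files total 16 bytes -- the gap a non-dedup-aware, file-based backup tool would have to absorb on its target storage.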
Remember that deduplication doesn't operate on system and boot volumes, remote drives, encrypted files (because the data is already uniquely scrambled) or files smaller than 32 KB. This content is backed up and restored just like any conventional file.
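As an illustration only, the exclusion rules above can be expressed as a simple filter. The function name and parameters here are assumptions for the sketch, not a real Windows API:

```python
def dedup_eligible(size_bytes, is_encrypted, volume_role):
    """Illustrative filter mirroring the Windows Server 2012 R2 exclusions above."""
    if volume_role in ("system", "boot", "remote"):
        return False  # system/boot volumes and remote drives are not deduplicated
    if is_encrypted:
        return False  # encrypted data is already uniquely scrambled, so it won't dedupe
    if size_bytes < 32 * 1024:
        return False  # files smaller than 32 KB are skipped
    return True


# A large, unencrypted file on an ordinary data volume qualifies; the rest do not.
print(dedup_eligible(10 * 1024 * 1024, False, "data"))  # True
print(dedup_eligible(10 * 1024 * 1024, False, "boot"))  # False
print(dedup_eligible(10 * 1024 * 1024, True, "data"))   # False
print(dedup_eligible(16 * 1024, False, "data"))         # False
```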
Deduplication periodically runs garbage collection processes to reclaim storage chunks that are no longer in use. It's best to run a backup after each garbage collection pass to ensure that any changes to the freed storage are captured in the backup rather than allowed to age out.
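A minimal sketch of what such a garbage collection pass does, assuming a toy reference-counted chunk store (Windows Server's actual garbage collection job operates on its own chunk-store format, so this is conceptual only):

```python
class ChunkStore:
    """Toy deduplicated store: one copy per unique chunk, plus reference counts."""

    def __init__(self):
        self.chunks = {}  # chunk hash -> bytes
        self.refs = {}    # chunk hash -> number of files referencing the chunk

    def add(self, h, data):
        # Store the chunk once; later references only bump the count.
        if h not in self.chunks:
            self.chunks[h] = data
            self.refs[h] = 0
        self.refs[h] += 1

    def release(self, h):
        # A file was deleted or rewritten; the chunk may now be unused.
        self.refs[h] -= 1

    def garbage_collect(self):
        """Reclaim chunks no longer referenced by any file; return bytes freed."""
        dead = [h for h, n in self.refs.items() if n <= 0]
        freed = sum(len(self.chunks[h]) for h in dead)
        for h in dead:
            del self.chunks[h]
            del self.refs[h]
        return freed


store = ChunkStore()
store.add("h1", b"ABCD")
store.add("h1", b"ABCD")   # a second file shares the same chunk
store.add("h2", b"EFGH")
store.release("h1")        # one of the two sharing files is deleted
assert store.garbage_collect() == 0  # "h1" is still referenced, nothing reclaimed
store.release("h1")
store.release("h2")        # now both chunks are unreferenced
```

Only after the last reference goes away does garbage collection actually free the chunk -- which is why running a backup after the pass captures the reclaimed state.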
Data deduplication provides an important means of improving storage efficiency, lowering storage costs and speeding data protection processes. But its effectiveness and performance vary with the workload and the deduplication setup. IT administrators should invest the time to benchmark each storage volume before and after deduplication is applied to gauge any performance penalty, and then adjust scheduling and other options to optimize server and workload performance. Backup and restoration processes should also be tested in advance, both to understand the storage needs of deduplicated data and to allow for updates or patches to the data protection tool that improve storage use for deduplicated backups.