Dealing with data is one of the most painful parts of a Windows administrator's job. Even the smallest IT environments easily contain thousands of individual files, and each and every one is important to somebody.
For most of IT's history, we administrators have been responsible for just two aspects of file administration -- security and backup. Making sure that only the right people can access the right files is important to security, while protecting our users against their own accidental deletions is important to their sanity. But today's ever-growing landscape of regulation and sensitive information forces us to look at more than just the object that is a Word file or an Excel spreadsheet. We now find ourselves responsible for managing the contents of those objects as well.
As if watching over ten thousand permissions weren't enough, the rules of sensitive information and regulatory compliance now require us to properly handle the different kinds of data stored on our networks.
Thankfully, Microsoft understands this new responsibility. Windows Server 2008 R2 offers new technologies that introduce the concept of content administration. R2's File Classification Infrastructure (FCI) adds automation capabilities to the File Server Resource Manager (FSRM) role service. In short, FCI enables FSRM to classify the sensitivity of individual files based on their content or location, while at the same time automating the elimination of files gone stale. Both go a long way toward reducing the risk of storing sensitive business data.
First, install the File Server Resource Manager role service on your file server. This role service is a component of the File Services role. In the RTM version of Windows Server 2008, it served primarily as the new home for quota management and storage reporting. Windows Server 2008 R2 adds two new nodes to the FSRM console: one for defining file classifications and the other for creating file management tasks.
What file classification does is best understood by the way it is set up. Start by right-clicking the Classification Rules node and selecting the Create a New Rule option. A dialog box appears asking for a name and description for the new rule, as well as which volume or folder to scope. The real utility of classification comes from the rule's second tab, Classification. Here, one of two classification mechanisms can be defined for the rule: the Content Classifier or the Folder Classifier.
Consider the data you have in your environment today. Some of that data is considered "sensitive" by your HR department, your business, or some external regulatory agency. For example, your HR department likely considers personnel records to be sensitive documents. The same holds true for external regulations and personally-identifiable information (PII). And while not necessarily PII, your business might consider specific project data competition-sensitive as well.
In most organizations, a best-effort attempt to secure these types of data is made through NTFS permissions. As long as the data is stored in the right place, it is protected by the permissions assigned to its containing folder. Unfortunately, these days it is far too easy to unintentionally copy this data to an unsecured location.
For example, imagine someone from HR creates a new spreadsheet for temporary use that contains sensitive data. If they store that spreadsheet on their desktop while they're working with it, the data is no longer safe in its "correct" storage location. Over time, the inadvertent spread of sensitive data through copy-and-paste and other everyday actions can create a big pain for IT, not to mention findings for auditors.
By creating a classification rule, you are effectively drawing a line in the sand that states, "This data is sensitive, no matter where it goes." The rule does this by modifying the properties on the file itself. Thus, once a document has been classified, if it moves to a new location its properties remain with it.
Classification rules can do this in one of two ways. The first is through a Content Classifier rule. With this type of rule, you identify a search term; if that term is present within any document in the rule's scope, that document's properties are modified to include the classification. With the Folder Classifier, any document stored in a particular location is considered classified.
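FCI's actual engine stores classifications in the files' own properties, but the decision logic of the two rule types is easy to picture. The following Python sketch -- an illustration only, with hypothetical function and parameter names, not Microsoft's implementation -- shows a Folder Classifier check (is the file under a sensitive folder?) alongside a Content Classifier check (does the file contain the search term?):

```python
import os

def classify_file(path, search_term, sensitive_folders):
    """Return the set of reasons a file would be classified.

    Roughly mirrors FCI's two rule types: 'folder' for a
    Folder Classifier match, 'content' for a Content Classifier match.
    """
    labels = set()

    # Folder Classifier: any file stored under a scoped sensitive
    # folder is considered classified, regardless of its contents.
    for folder in sensitive_folders:
        if os.path.abspath(path).startswith(os.path.abspath(folder) + os.sep):
            labels.add("folder")

    # Content Classifier: classify when the search term appears
    # anywhere in the file's text (case-insensitive here).
    try:
        with open(path, "r", errors="ignore") as f:
            if search_term.lower() in f.read().lower():
                labels.add("content")
    except OSError:
        pass  # unreadable or missing files simply go unclassified

    return labels
```

Note that a single file can match both rules at once -- a salary spreadsheet sitting inside the HR share would pick up both labels, which is consistent with the idea that the classification travels with the file's contents, not just its location.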
The classification engine runs on a schedule that is configurable by you, the administrator, with reporting done at its conclusion. Once a classification pass has completed, your files' properties include the details you need to handle them with the special care their contents require.
The other exciting -- yet disturbingly powerful -- feature in FSRM is the new automation associated with file expiration. You've been in this situation before: you carve out a huge volume for "shared" and "personal" file storage. Over a period of days that we'll call "n", that storage approaches full capacity. Rather than adding expensive new storage, you humbly ask your users to remove any files they no longer need or use. The response: crickets. Well, that and a storage location that still requires an expensive expansion.
The real insidiousness in this scenario is not necessarily the consumption of storage over those "n" days. It's the fact that by some magic of storage availability, your next storage upgrade will likely reach its full capacity in somewhere around ".7n" days. In effect, the more storage you buy, the faster your users consume it!
Microsoft's second new FSRM feature helps with this problem and can be found in the innocuously labeled File Management Tasks node. Right-clicking this node and choosing to create a new file management task brings up a powerful new utility for eliminating or archiving files that no one has touched in ages. A File Management Task provides a way to run a command against every file in a storage location. The action can range from reporting on aged files to relocating them to an archival location, all the way up to outright deletion.
What's particularly useful about Microsoft's implementation of the File Management Task is that it's not designed solely around file expiry. As stated in Microsoft's Storage Team blog, "Basically, File Management Tasks are a mechanism to apply a single command to a selected set of files on a scheduled basis." The net result is that with a little crafting, you can build a solution that makes sense for your environment, whether that means notifying users about their files through email, running custom commands, creating reports on file expiry, or simply deleting files out of spite. The flexibility is yours.
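The "single command applied to a selected set of files" pattern is simple to sketch. This Python snippet -- again an illustration with made-up names, not the FSRM task engine itself -- walks a folder tree, selects files older than a cutoff, and applies whatever action you pass in, mirroring how one scheduled task definition can back a report, an archive job, or a deletion sweep:

```python
import os
import time

def expire_files(root, max_age_days, action):
    """Apply `action` (any callable taking a path) to every file under
    `root` whose last-modified time is older than `max_age_days` days.
    A rough stand-in for one pass of a scheduled File Management Task.
    """
    cutoff = time.time() - max_age_days * 86400
    matched = []
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                action(path)      # report, archive, or delete -- caller's choice
                matched.append(path)
    return matched
```

Swapping the action is all it takes to change the task's behavior: `print` gives you a report, `shutil.move` into an archive share gives you relocation, and `os.remove` gives you expiration proper.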
Designed for small, extensible for big
Yet another smart decision by Microsoft is the inclusion of extensibility points that let third parties build on these new features. While enterprise environments are likely to need more comprehensive solutions for their data storage needs, the SMB IT professionals in the room will appreciate the baseline features available free out of the box.
Microsoft suggests that additional points of expansion are built into the product for tasks such as migrating files rather than deleting them, reporting, optimizing backup SLAs, watermarking documents, and automatically applying rights management to highly sensitive documents, among others. With the release of Windows Server 2008 R2 on the horizon, expect partner companies to race to market with expanded solutions to fit environments of every size.
ABOUT THE AUTHOR
Greg Shields, Microsoft MVP, is a partner at Concentrated Technology. Get more of Greg's Jack-of-all-Trades tips and tricks at www.ConcentratedTech.com.