It looks like the SDM [System Definition Model] is becoming more real. Why do administrators need to care about it?
First of all, Visual Studio 2005 [now in second beta and due out later this year] will be the first tool to encapsulate the SDM. Here is why Visual Studio is relevant to administrators. If you look at the interactions they have with the development community today, in the best case they are sitting in meetings and putting [specifications] into documents that say, 'If you're building an application, here is what you need to know about our environment.'
Now we are taking it a step further by building this into our software tools. A person from IT can define a set of models that look like their deployment environment, put their standards into those models, and actually provide that to a developer, who can import it in code format into Visual Studio.
When developers are building their application, they can do design-time validation that lets the admin know whether the configurations that are specified will work with the application. And from a developer's standpoint, I don't have to be called in on weekends when IT discovers it doesn't work.
So the SDM is a template?
I'll use Web services as a simple example. I run and support some standard configuration for IIS [Internet Information Services] in my data center. I have certain security reasons why I choose a particular setting. I can create a model that represents what that standard configuration looks like.
Now this is built into the software tool. When you develop applications, you can test [them]. You can drop the application onto that model. It will bounce back and tell you, no, it's not deployable because you require Kerberos authentication and that's not configured on the IIS servers. Or it will say you are trying to connect to SQL and that port isn't open in the data center where you are deploying. It will tell you where constraints are being broken.
That's a way for you as an administrator to provide input into the development, and to make sure
all the applications you are going to deploy are going to be deployable. We've already delivered an
SDK [software developer kit] with Visual Studio so that people can start creating their own SDMs.
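The validation flow described above can be illustrated with a minimal sketch. This is not the actual SDM format (the real SDM is an XML-based model consumed by Visual Studio 2005); every class, field, and message below is invented for illustration:

```python
# Illustrative sketch of SDM-style design-time validation.
# All names here are hypothetical, not the real SDM schema.

class EnvironmentModel:
    """IT's model of the deployment environment and its constraints."""
    def __init__(self, auth_protocols, open_ports):
        self.auth_protocols = set(auth_protocols)  # auth methods configured on the servers
        self.open_ports = set(open_ports)          # ports open in the data center

class AppDefinition:
    """The application's declared requirements."""
    def __init__(self, required_auth, required_ports):
        self.required_auth = set(required_auth)
        self.required_ports = set(required_ports)

def validate(app, env):
    """Return a list of broken constraints; empty means deployable."""
    errors = []
    for proto in app.required_auth - env.auth_protocols:
        errors.append(f"requires {proto} authentication, not configured on the IIS servers")
    for port in app.required_ports - env.open_ports:
        errors.append(f"needs port {port}, which is not open in the data center")
    return errors

# The two examples from the interview: Kerberos not configured, SQL port closed.
env = EnvironmentModel(auth_protocols=["NTLM"], open_ports={80, 443})
app = AppDefinition(required_auth=["Kerberos"], required_ports={80, 1433})
for problem in validate(app, env):
    print("Not deployable:", problem)
```

The key design point is that the environment model, not the developer, is the source of truth: the application is checked against IT's constraints before deployment rather than after.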
Microsoft now says it will soon expand this to its manageability tools?
The core management packs we use for the next version of MOM [Microsoft Operations Manager] will be described in SDM. So now I start to do true service-level monitoring with MOM -- no longer just monitoring an individual component. Now I can model the complete service, all the application components, the servers they reside on, the network topology, the storage -- I can manage it from the service level.
So administrators will have a view of the entire service. When there is a problem, we provide probable-cause resolution that gets you close to the root.
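The probable-cause idea described above can be sketched as a walk over a dependency graph. This is a hypothetical illustration, not MOM's actual model format, and all names in it are invented:

```python
# Hypothetical sketch of service-level monitoring: the service is
# modeled as components with dependencies, and a fault is traced
# down the graph to the probable root cause.

class Component:
    def __init__(self, name, depends_on=()):
        self.name = name
        self.depends_on = list(depends_on)
        self.healthy = True

def probable_cause(component):
    """Return the deepest unhealthy component, or None if healthy."""
    if component.healthy:
        return None
    for dep in component.depends_on:
        cause = probable_cause(dep)
        if cause is not None:
            return cause
    return component  # no unhealthy dependency below: this is the root

# A service depending on an IIS server, which depends on a storage array.
storage = Component("storage array")
iis = Component("IIS server", depends_on=[storage])
service = Component("order-entry service", depends_on=[iis])

# A storage fault propagates up; all three components report unhealthy.
storage.healthy = iis.healthy = service.healthy = False
print("Probable cause:", probable_cause(service).name)  # → storage array
```

This is the difference between component-level and service-level monitoring: instead of three separate alerts, the administrator gets one alert pointing at the component closest to the root of the problem.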
On the SMS side, we will take a natural step with desired-configuration monitoring. If I have a certain configuration I want to make sure my users stay within, SMS can monitor those desktops and make sure they don't drift from that model. It can let IT admins know ahead of time that someone changed a parameter, that something impacted the supportability of that desktop.
You're making some big strides in terms of platform interoperability.
Interoperability is pretty important for administrators. We know they don't run all-Windows environments. The big proof [in Microsoft CEO Steve Ballmer's MMS 2005 keynote] was using MOM to manage a Sun [Microsystems] server using the WS-Management protocol.
Technically, all those pieces coming together will take a while and we are going to do our part with Windows Server 2003 R2 [in second beta and due out later this year]. We will ship in the platform some core capabilities, such as support for WS-Management. We need parties like Sun to manage it. We need support in our management tools. What comes together over time is an end-to-end solution.
Another interesting interoperability point was around our third-party OS support for virtual machines. We are actually working with partners to enable Linux, Solaris, etc., to be tuned to perform well in our guest [virtual machine] environment.
In his keynote, Steve Ballmer mentioned that virtualization was going to be an area where Microsoft makes some of its biggest investments. What are some of the key technologies involved there?
One area is increased investment in the core area of the hypervisor. This is a thin piece of software that performs basic resource management on the hardware. The technology has been around for a few years -- mainframes have had hypervisors for a long time. It is enabling us, Intel [Corp.] and AMD [Inc.] to introduce extensions that make it easier for a hypervisor to be written. It's another natural evolution from the OS standpoint.
The more people use virtual machines, the more they think about how those VMs will be managed. Virtualization is only as powerful as the management tools, so we are upping our investment. Customers already have tools to manage servers and desktops, and they want the ability to manage VMs in those existing tools. SMS [Service Pack 1] provides a report that shows both physical and virtual boxes, and MOM has the same ability.
Customers want to use virtualization technologies in much broader applications than previously thought. They want performance improvements, manageability improvements, support for heterogeneous environments. It's only been six months and we have a [Virtual Server 2005 SP1 beta] that actually hits on all of these customer requirements.
IT resource virtualization is a huge topic. A lot of technologies will come into play. VMs are going to be a critical part, but they won't be the only thing that solves the problem. There are storage resource virtualization technologies. We have a Windows systems resource manager. In general, the pain people are feeling is that they have a lot of hardware that is underutilized. There will be a lot of tools to bring to bear on that.
Microsoft now says it will align and promote interoperability between its Network Access Protection protocol and the Trusted Computing Group's Trusted Network Connect architecture. Cisco Systems is not part of that group. What's the latest on your discussions with Cisco on making sure that Cisco's Network Admissions Control works with NAP?
Last October, we said we were working with Cisco. I can't speak to why they are not participating in the TCG, but there are a considerable number of vendors there. We will continue that work to make sure the two are aligned. We have a good dialog going with TCG and with Cisco.
Where do the three come together? We are working on NAP to be included with Longhorn server. We will align with Cisco, and with TCG. How do TCG and the Cisco world align so customers have only one solution? We are still working on that.