If properly used, some emerging technologies and current tools could see wider adoption as a way to improve enterprise security.
SearchWinIT.com interviewed Windows experts who gave their top picks for the emerging technology and tools with the greatest long-term promise of protecting Windows systems.
Chenxi Wang, a principal analyst in security and risk management at Cambridge, Mass.-based Forrester Research Inc., cited one OS security technology and another that shores up application security.
She cited remote attestation, which is enabled by Trusted Platform Module (TPM) chip technology. The TPM chip is based on standards promoted by the Trusted Computing Group and allows security credentials such as passwords and encryption keys to be stored in hardware.
Microsoft has used TPM technology as part of Windows Vista. Remote attestation lets a user verify the hardware and software configuration -- and thus the integrity and security -- of a remote host. When visiting a Web site, for example, the technology could tell the user whether the Web server's configuration had been altered and was therefore compromised and potentially dangerous.
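The verification step at the heart of remote attestation can be illustrated with a minimal sketch. In a real TPM deployment the host reports signed PCR values; here the signing is omitted and the component names and "golden" reference digests are invented for the example -- the point is simply that the verifier compares reported measurements against known-good values.

```python
import hashlib

# Hypothetical illustration: a verifier holds "golden" measurements for a
# remote host's boot and application components and compares them against
# the digests the host reports. (A real TPM quote would carry signed PCR
# values; the signature check is omitted here for brevity.)
GOLDEN_MEASUREMENTS = {
    "bootloader": hashlib.sha256(b"bootloader v1.2").hexdigest(),
    "kernel": hashlib.sha256(b"kernel approved build").hexdigest(),
    "web_server": hashlib.sha256(b"httpd signed binary").hexdigest(),
}

def attest(reported: dict) -> bool:
    """Return True only if every reported measurement matches its golden value."""
    return all(
        reported.get(name) == digest
        for name, digest in GOLDEN_MEASUREMENTS.items()
    )

# A host whose Web server binary was tampered with fails attestation.
tampered = dict(GOLDEN_MEASUREMENTS)
tampered["web_server"] = hashlib.sha256(b"httpd trojaned binary").hexdigest()

print(attest(dict(GOLDEN_MEASUREMENTS)))  # True
print(attest(tampered))                   # False
```

The altered-Web-server scenario Wang describes corresponds to the `tampered` case: one changed binary changes one digest, and attestation fails.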
Wang also said she thinks there is much promise in proof-carrying code, which can be part of a software application and serve as proof that the application does not harbor malicious code. "This is really neat because it is hard to make the proofs, so it would be incredibly difficult to make fake proofs, but it is easy to verify if the proof is valid," she said.
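The asymmetry Wang describes -- proofs are hard to construct but cheap to verify -- can be sketched with a toy example. This is not a real proof-carrying-code system (real PCC mechanically checks a formal proof against the actual program); the certificate format below is invented, and the sketch glosses over binding the certificate to the code. It only illustrates the producer/consumer split.

```python
# Toy illustration of the proof-carrying-code idea: the code producer
# ships a "certificate" alongside the program, and the consumer runs a
# cheap mechanical checker instead of trusting the producer. Here the
# safety claim is simply that every array index used is in bounds.
def make_certificate(indices, array_len):
    """Producer side: enumerate every array index the program will touch."""
    return {"array_len": array_len, "indices": sorted(set(indices))}

def check_certificate(cert):
    """Consumer side: verifying the claim is cheap and mechanical."""
    return all(0 <= i < cert["array_len"] for i in cert["indices"])

good = make_certificate([0, 3, 7], array_len=8)
bad = make_certificate([0, 3, 9], array_len=8)   # out-of-bounds access

print(check_certificate(good))  # True
print(check_certificate(bad))   # False
```

In a genuine PCC system the checker would also verify that the certificate truthfully describes the shipped code, which is what makes forging a valid proof for malicious code so difficult.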
Peter Lindstrom, a senior analyst with the Burton Group, a research company based in Midvale, Utah, said products using network access control, or NAC, haven't really taken off yet, but he expects that within five years NAC will be in every computer.
NAC technology comes in two basic flavors -- one called Network Admission Control, made by Cisco Systems Inc., and a second called Network Access Protection, or NAP, from Microsoft -- although other vendors offer it as well. Both versions check that machines connecting to a network are not infected with malware; a noncompliant computer can be quarantined and updated before it is allowed onto the network.
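The admit-or-quarantine decision both vendors' schemes share at a conceptual level can be sketched as follows. All field names and the policy thresholds are invented for the example; neither Cisco's nor Microsoft's actual protocol is shown.

```python
# Hedged sketch of a NAC admission decision: the endpoint reports a
# health statement, and a policy server admits it, quarantines it for
# remediation, or rejects it outright. Field names are illustrative.
REQUIRED_AV_VERSION = (4, 2)  # hypothetical minimum antivirus version

def admission_decision(health: dict) -> str:
    if health.get("malware_detected"):
        return "reject"
    if (not health.get("patches_current")
            or tuple(health.get("av_version", (0, 0))) < REQUIRED_AV_VERSION):
        # Park the machine on a remediation network until it updates.
        return "quarantine"
    return "admit"

print(admission_decision({"malware_detected": False,
                          "patches_current": True,
                          "av_version": (4, 3)}))   # admit
print(admission_decision({"malware_detected": False,
                          "patches_current": False,
                          "av_version": (4, 3)}))   # quarantine
```

The quarantine branch is the piece that matters for the patch-and-update workflow described above: the machine is held aside, remediated, and only then admitted.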
"This technology in general is perfect for vulnerability management in the endpoints," Lindstrom said. While vulnerability management technology hasn't changed much in the last 15 years, he said, using network access control technologies would increase security while forcing IT shops to rethink their security posture.
Because network access control can be used broadly either at the client or server level or both, it forces IT managers to think about what they're trying to accomplish by its implementation.
And a method of delivering applications that has been around for quite a while will see much wider adoption for security purposes, according to Chris Schwartzbauer, an engineer who works in sales and marketing for Shavlik Technologies in Roseville, Minn.
Security software as a service will soon come into its own, he said, because people no longer fear downloading applications from the Web the way they once did, and the technology has become more flexible and easier to deliver.
Government regulations are forcing companies to develop rigorous system configurations. Services that create these configurations, in addition to providing a wide variety of anti-malware capabilities, will become more common as small and medium-sized enterprises face these mandates without the resources or expertise to meet them on their own, Schwartzbauer said.
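A baseline-compliance check of the kind such a hosted service might run can be sketched in a few lines. The setting names and required values below are invented for illustration and are not drawn from any real mandate or from Shavlik's products.

```python
# Illustrative sketch: compare a host's reported settings against a
# mandated configuration baseline and report the deviations. Every key
# and value here is hypothetical.
BASELINE = {
    "password_min_length": 8,
    "screen_lock_minutes": 15,
    "guest_account_enabled": False,
}

def compliance_gaps(actual: dict) -> list:
    """Return the settings that deviate from the mandated baseline."""
    return [key for key, required in BASELINE.items()
            if actual.get(key) != required]

host = {"password_min_length": 6,
        "screen_lock_minutes": 15,
        "guest_account_enabled": True}

print(compliance_gaps(host))  # ['password_min_length', 'guest_account_enabled']
```

A service provider would maintain the baseline and run the comparison, which is precisely the expertise smaller shops lack in-house.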
Virtualization is also something to watch for its potential to help secure the enterprise. Tim Grance, manager of the security technology research program at the National Institute of Standards and Technology in Gaithersburg, Md., said the increased processing power of today's integrated circuit chips means unused cycles can be diverted to virtualization. NIST is a non-regulatory federal agency within the U.S. Commerce Department.
IT departments can use that additional processing power to avoid conflicts between legacy and newer applications and to build test environments for patches, so incompatibility problems are worked out before a system is updated.
He said he expects it will become more common for IT departments to use virtualization to help integrate legacy applications without interfering with other applications and for more hardware enforcement of security measures.
And simply talking about the technology is important, because gaps remain between the policies, protections and goals enterprises want and need, and the nitty-gritty work of making them a reality, said Diana Kelley, a lead security analyst at Burton Group.
"Sometimes it's a long path from the executive big picture of technology policy decisions and the 'bits and bytes' that are necessary to make it be true," Kelley said. "It's important that the feedback loop is closed and that everyone knows why it is being done."