Hardware technology advances in the IT universe have grown at a more or less exponential rate over the past few decades (although this may be slowing noticeably in the silicon domain). This has a substantial effect on standards.
There are two types of standards: de jure and de facto.
De jure standards, agreed upon and published by the appropriate international standards bodies, may be the rule in some domains. But data processing is dominated by de facto standards, which are driven by market realities. Those de facto standards are shaped by the introduction of successful products that somehow met a need and became accepted by buyers in the market. Factors involved in such acceptance include the cost and influence of those offering the goodies.
An illustrative example of real-world standards dynamics is the nigh-universal adoption of TCP/IP despite a competing set of proposals created by industry experts and promulgated by ISO (the International Organization for Standardization). Of course, some international standards simply cast into de jure form some widely accepted de facto standard; this makes the standard none the worse.
The presence of both de facto and de jure standards makes decisions all the more difficult, leaving it up to managers to choose among a number of options offered by competing companies, none of which have a dominant position in the marketplace.
Well-established standards offer stability, but emerging standards represent a business and technology risk. To help contain this risk, companies should consider establishing some form of regular market and technology review. Of course, doing this internally requires company resources, and can prove costly. Doing it indirectly by relying on external analysts can be cheaper, but it presents another risk because such analysts can be strongly influenced by fads and hype.
The phenomenon of technology evolution and marketplace hype is well established and has been captured in a semi-formal manner by Gartner, whose analysis we summarize here.
Technologies have a lifecycle. The first phase is emergence, characterized by the existence of many players, a fair number of which are typically startups. If there is some market acceptance, the technology can attain a degree of maturity and the result is a concentration of companies offering the technology. Disappearing companies may go out of business or be absorbed by more successful players, defining the growth phase. In the final phase, saturation, the technology has become mainstream, technological advances slow remarkably and the marketplace is dominated by a rather small number of companies that compete on economies of scale rather than proprietary technological edge. This evolution is illustrated in the following diagram.
You have to make decisions about which technologies to embrace and when. A key component of such decision making is to understand what phase the technology is in. In emergence, risks are high and it is generally best to avoid selecting such technologies at all. In growth, there are fewer vendor choices but each is of lower risk than in emergence. You can use business knowledge to choose one supplier and gain, at reasonable risk, advantages in areas such as cost or performance.
However, there is another dimension to new technologies: How the technology is perceived rather than what it is capable of. Gartner captures the dynamics of technology perception with their Hype Cycle:
The moment a technology begins to show some potential, it becomes an area of interest to the marketplace, analysts and investors. The varying interests of these parties, coupled with the desire to make substantial money, drive a phased sequence of perceptions of a technology. The first phase is the ability to demonstrate some technological capabilities: the Technology Trigger. Demonstrations are often no more than laboratory experiments, and they can be fairly limited in how widely they can be applied and how long the machinery remains operational. Observers tend to take the minimal results and extrapolate enthusiastically (and in public) to future possibilities, driving the perception of the technology into the second phase: the Peak of Inflated Expectations. That then results in widespread passion and enormous investments.

As the early technology is used by early adopters inflamed by the expectations, some shortfalls are generally noted; these become bruited about, and the technology's perception slumps into the Trough of Disillusionment. If the technology has merit, it will persevere and slowly rise up through the traditional technology cycle, its perception progressing through the Slope of Enlightenment as it becomes accepted, to the Plateau of Productivity, where it is considered a commodity and accepted everywhere.
It is, therefore, important for decision makers to disentangle hype from reality and make decisions based on fact. Since technology change is an industry fundamental, we will have more to say on it later.
About the authors:
René J. Chevance is an independent consultant. He formerly worked as Chief Scientist of Bull, a European-based global IT supplier.
Pete Wilson is Chief Scientist of Kiva Design, a small consultancy and research company specializing in issues surrounding the move to multi-core computing platforms, with special emphasis on the embedded space. Prior to that, he spent seven years at Motorola/Freescale.