Cloud computing, cargo transshipment, and fruit flies. Three topics you wouldn’t expect to find in the same article. But our focus on cloud computing and its role in the enterprise has unearthed interesting parallels among them.
The history of logistics since the Industrial Revolution and the rise of the railroads is one of fits and starts around standardization. Any system of cargo transshipment seeks to minimize the handling of individual items across the modalities of transport. But it has been a slow, laborious process to remove unnecessary handling, because legacy infrastructure is costly to replace.
Consider railway gauges—the distance between the inner sides of the two parallel rails that compose a railway line. Initially, gauge decisions were made locally, and many different gauges emerged. The downside? Break of gauge—the term for the process of transferring all passengers and cargo between trains running on different gauges.
An example of break of gauge is on the Trans-Mongolian Railway, where Mongolia uses broad gauge and China uses standard gauge. According to Wikipedia, at the border, “each carriage has to be lifted in turn to have its bogies changed and the whole operation, combined with passport and customs control, can take several hours.”
Broad standardization of railway gauges has greatly reduced such inefficiencies around the world. But cargo often uses multiple transport modalities, including trains, trucks, and ships. Changing modalities was once equivalent to break of gauge. Fortunately, the rise of standards for intermodal containers has continued the trend of more efficient transport that began with gauge standardization. Movement of cargo is now “loosely coupled” with the underlying mode of transport.
Enterprise computing today is much like the early days of railways. Compute environments are purpose-built collections of technology dedicated to individual workloads. Individual pieces might be “standard,” but the overall environment is one of a kind. Cloud computing offers relief. But not as you might expect, given all the press devoted to external cloud services. Enterprises need to learn the lessons of logistics standardization and modularization. Doing so will empower IT to deliver much higher responsiveness and greater financial flexibility to the business.
Who are some of the early adopters of this approach? Here’s where fruit flies come in. Drosophila melanogaster, commonly known as the fruit fly, transformed the study of genetics back in 1910. Thomas Hunt Morgan’s research in the “fly rooms” at Columbia University elucidated many basic principles of heredity, including sex-linked inheritance, epistasis, multiple alleles, and gene mapping. What was the fruit fly’s contribution? A short, 10-day generation time and high birthrates (females lay as many as 100 eggs per day) meant researchers could track gene expression over many generations in a matter of months.
Bechtel, the engineering and construction giant, is to data centers as the field of genetics is to fruit flies. Bechtel frequently initiates large, multibillion-dollar, multiyear projects, each of which requires the provisioning of project data centers and application environments. Each effort is a “learning opportunity.”
Bechtel’s big takeaway? Traditional best practices for deploying and managing data center environments inevitably add cost and complexity and reduce business agility. Geir Ramleth, Bechtel’s CIO, recognized that cloud service providers such as Amazon.com and Google offered a new set of best practices that most enterprises weren’t adopting. With a new generation of projects always stacked up in front of it, Ramleth realized that Bechtel had an opportunity to design a new pattern of DNA for data centers that would deliver much higher business value.
This issue of the Technology Forecast covers the emerging trends associated with cloud computing. The first article describes Bechtel’s journey and introduces the Evergreen IT concept—the idea that the goal isn’t cloud computing but a new approach to the provisioning and management of IT that avoids the creation of legacy complexities and cost.
The second article takes the Evergreen IT concept down into the details of the technology that can support such an approach. We examine two broad technology domains, virtualization and data center automation, and explain their key roles in delivering Evergreen IT.
The third article describes the way forward from the CIO’s perspective. Although there will be no single path, we suggest five broad stages of transformation that most companies must complete to reach the goal of Evergreen IT.
As always, our articles are supported by in-depth interviews with leading executives and thought leaders defining the future of IT. Geir Ramleth of Bechtel shares his vision and experience of transforming the Bechtel IT organization from its legacy roots and toward the vision of Evergreen IT. Erich Clementi and Irving Wladawsky-Berger of IBM emphasize how standardization and mass customization principles will reduce complexity and industrialize the IT function. Simon Crosby of Citrix describes how virtualization creates the separation between the layers of the IT stack necessary for Evergreen IT. Kirill Sheynkman of Elastra explains how intelligent software he is building will model and automate data center operations. Russ Daniels of EDS, an HP company, shares his insights about how IT needs to move from a build-to-order culture to a configure-to-order culture. Doug Hauger of Microsoft notes how cloud computing represents a technology and business model shift.
Please visit pwc.com/techforecast to find these articles and other issues of the Technology Forecast. If you would like to receive future issues of the Technology Forecast as a PDF attachment by e-mail, you can sign up at pwc.com/techforecast/subscribe.
And as always, we welcome your feedback on this issue of the Technology Forecast and your ideas for where we should focus our research and analysis in the future.