Message from the editor

Tom DeGarmo
US Technology Consulting Leader

The right data + the right resolution = a new culture of inquiry

James Balog¹ may have more influence on the global warming debate than any scientist or politician. By using time-lapse photographic essays of shrinking glaciers, he brings art and science together to produce striking visualizations of real changes to the planet. In 60 seconds, Balog shows changes to glaciers that take place over many years—introducing forehead-slapping insight to a topic that can be as difficult to see as carbon dioxide. Part of his success can be credited to creating the right perspective. If the photographs had been taken too close to or too far away from the glaciers, the insight would have been lost. Data at the right resolution is the key.

Glaciers are immense, at times more than a mile deep. Amyloid particles, the likely cause of Alzheimer’s disease, sit at the other end of the size spectrum. Scientists’ understanding of the role of amyloid particles in Alzheimer’s has relied heavily on technologies such as scanning tunneling microscopes.² These devices generate visual data at sufficient resolution for scientists to fully explore the physical geometry of amyloid particles in relation to the brain’s neurons. Once again, data at the right resolution, together with the ability to visually understand a phenomenon, are moving science forward.

Science has long focused on data-driven understanding of phenomena. It’s called the scientific method. Enterprises also use data to understand their business outcomes and, more recently, the effectiveness and efficiency of their business processes. But because running a business is not the same as running a science experiment, there has long been a divergence between analytics as applied to science and the methods and processes that define analytics in the enterprise.

This difference has been partly a question of scale and instrumentation. Even a large science experiment (setting aside the Large Hadron Collider) will introduce sufficient control around the inquiry of interest to limit the amount of data collected and analyzed. Any large enterprise comprises tens of thousands of moving parts, from individual employees to customers to suppliers to products and services. Measuring and retaining data on all aspects of an enterprise over all relevant periods of time are still extremely challenging, even with today’s IT capacities.

But targeting the most important determinants of success in an enterprise context for greater instrumentation—often customer information—can be and is being done today. And with Moore’s Law continuing to pay dividends, this instrumentation will expand in the future. In the process, and with careful attention to the appropriate resolution of the data being collected, enterprises that have relied entirely on the art of management will increasingly blend in the science of advanced analytics. Not surprisingly, the new role emerging in the enterprise to support these efforts is often called a “data scientist.”

This issue of the Technology Forecast examines advanced analytics through this lens of increasing instrumentation. PwC’s view is that the flow of data at this new, more complete level of resolution travels in an arc beginning with big data techniques (including NoSQL and in-memory databases), through advanced statistical packages (from the traditional SPSS and SAS to open source offerings such as R), to analytic visualization tools that put interactive graphics in the control of business unit specialists. This arc is positioning the enterprise to establish a new culture of inquiry, where decisions are driven by analytical precision that rivals scientific insight.

The first article, “The third wave of customer analytics,” on page 06 reviews the impact of basic computing trends on emerging analytics technologies. Enterprises have an unprecedented opportunity to reshape how business gets done, especially when it comes to customers. The second article, “The art and science of new analytics technology,” on page 30 explores the mix of techniques involved in making the insights gained from analytics more useful, relevant, and visible. Some of these techniques are clearly in the data science realm, while others are more art than science. The third article, “Natural language processing and social media intelligence,” on page 44 reviews the many language analytics techniques in use for social media and considers how combinations of them can be most effective. The fourth article, “How CIOs can build the foundation for a data science culture,” on page 58 considers new analytics as an unusually promising opportunity for CIOs. In the best-case scenario, the IT organization can become the go-to group, and the CIO can become the true information leader again.

This issue also includes interviews with executives who are using new analytics technologies and with subject matter experts who have been at the forefront of development in this area:

  • Mike Driscoll of Metamarkets considers how NoSQL and other analytics methods are improving query speed and providing greater freedom to explore.
  • Jon Slade of the Financial Times (FT.com) discusses the benefits of cloud analytics for online ad placement and pricing.
  • Jock Mackinlay of Tableau Software describes the techniques behind interactive visualization and how more of the workforce can become engaged in analytics.
  • Ashwin Rangan of Edwards Lifesciences highlights new ways that medical devices can be instrumented and how new business models can evolve.

Please visit pwc.com/techforecast to find these articles and other issues of the Technology Forecast online. If you would like to receive future issues of this quarterly publication as a PDF attachment, you can sign up at pwc.com/techforecast/subscribe.

As always, we welcome your feedback and your ideas for future research and analysis topics to cover.

Tom DeGarmo
US Technology Consulting Leader

1 http://www.jamesbalog.com/.

2 Davide Brambilla et al., "Nanotechnologies for Alzheimer's disease: diagnosis, therapy, and safety issues," Nanomedicine: Nanotechnology, Biology and Medicine 7, no. 5 (2011): 521–540.