Guest post by Stephen Brobst, CTO for Teradata Corporation
The early days of the Internet of Things (IoT) have largely been focused on sensor enablement and collection of data. However, the real value comes not from capturing and storing data, but from taking that data and doing something with it. Value creation from the IoT comes from the Analytics of Things (AoT). In Silicon Valley, we say that there are three choices for a company in the 21st century: (1) be a data company, (2) become a data company, or (3) become extinct. There are no other choices.
AoT value creation will accrue to both B2C and B2B enterprises. Over the past twenty years, the majority of the digerati (analytically sophisticated organizations) have been found in B2C enterprises such as dot-coms of all kinds, retailers, telcos, banks, and so on. The first two waves of big data analytics, clickstream analytics (wave 1) and social media analytics (wave 2), primarily benefited enterprises focused on understanding consumer behaviors. However, IoT sensor data (wave 3) is unique in its ability to drive value for understanding consumer behaviors as well as machine behaviors.
The “quantified self” movement encourages individuals to track data about themselves and their surroundings to better manage their lives. Everything from automobiles to toothbrushes to the human body can participate in the Internet of Things. The use of telematics for improving healthcare cost and quality has huge potential for both inpatient and outpatient care. In this scenario the sensor-enabled human is the “edge device,” and deep learning algorithms will be able to monitor for events, diagnose patient conditions, and prescribe interventions. This will make doctors much more efficient and provide the basis for far more effective and personalized care pathways.
IoT is a game changer for B2B enterprises. The use of analytics on sensor data to track assets, maximize manufacturing yield, optimize logistics, and exploit many other opportunities in B2B enterprises is exploding. But this is really just the first phase of value creation for B2B enterprises. A lot of what is going on today is more internally focused – the Intranet of Things – rather than the Internet of Things. In other words, early implementations tend to focus on opportunities within the enterprise to improve efficiency and quality. This approach is very often the least-risk path to immediate return on investment. However, the true breakthroughs come when new business models are created with externally leveraged capabilities from the Analytics of Things.
An example of a breakthrough capability is the Internet of Trains initiative at Siemens. The essence of this breakthrough was to create a business model based on delivering outcomes rather than the traditional B2B selling of assets. Revenue is generated by transporting passengers rather than delivering physical trains – and margins improve significantly when analytics are used to drive efficiencies in transportation outcomes. The Sinalytics platform developed by Siemens as part of its Digital Services initiative provides an innovative way to manage a vast range of machines, from trains to turbines to medical instrumentation and beyond – all made possible by an IoT foundation. The Sinalytics platform processes nearly 20 terabytes of data per month to optimize the operation of machines across multiple lines of business. The key differentiator in the world of Analytics of Things is the ability to process enormous volumes of data, identify critical patterns and relationships in the data, and deploy algorithms for prescribing actions that drive competitive advantage. As algorithms improve, so do business margins and the ability to create new data products.
Evolving skills and analytic capabilities are required to fully exploit IoT data for value creation. The nature of IoT data places much more emphasis on real-time streaming, with time-series capture of geospatial and other measurements that fall outside the data structures typically used in legacy data warehouse implementations. The use of emerging deep learning algorithms to handle the high dimensionality of IoT data is essential for advanced implementations. An ecosystem approach will generally be the most effective deployment model. Analytic model building is most effective as a centralized enterprise capability; the resulting models are then pushed out to the edge nodes in the IoT network for execution. Exceptions are captured at the edge and pushed back to the centralized capability for re-calibration of the execution models. Learning happens much more quickly when experiences are pooled at the enterprise level, but executing the predictive and prescriptive models at the edge is necessary for obtaining the lowest possible latency and best efficiency when dealing with high volumes of real-time events.
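The central-build / edge-execute / exception-feedback loop described above can be sketched in a few lines. This is a minimal, illustrative Python sketch, not any vendor's implementation: the class names, the simple three-sigma threshold "model," and the exception queue are all hypothetical stand-ins for the enterprise model builder, the edge scoring logic, and the recalibration feedback channel.

```python
from dataclasses import dataclass, field
from statistics import mean, stdev

@dataclass
class CentralModel:
    """Enterprise-level model builder: pools readings from all edge nodes.

    Hypothetical stand-in for centralized analytic model building; the
    'model' here is just a three-sigma band around pooled sensor readings.
    """
    history: list = field(default_factory=list)

    def fit(self) -> dict:
        # Recalibrate thresholds from the pooled enterprise-wide experience.
        mu, sigma = mean(self.history), stdev(self.history)
        return {"lo": mu - 3 * sigma, "hi": mu + 3 * sigma}

@dataclass
class EdgeNode:
    """Executes the pushed-down model locally for low-latency scoring."""
    params: dict                                   # model pushed from center
    exceptions: list = field(default_factory=list)  # queued for feedback

    def score(self, reading: float) -> bool:
        # Execute the prescriptive model at the edge; no round trip needed.
        anomalous = not (self.params["lo"] <= reading <= self.params["hi"])
        if anomalous:
            # Captured at the edge, later pushed back for re-calibration.
            self.exceptions.append(reading)
        return anomalous

# Build centrally, push to the edge, execute, feed exceptions back.
central = CentralModel(history=[10.0, 10.2, 9.8, 10.1, 9.9, 10.0])
edge = EdgeNode(params=central.fit())      # model pushed out to the edge
edge.score(25.0)                           # anomalous reading, captured
edge.score(10.0)                           # normal reading, no action
central.history.extend(edge.exceptions)    # exceptions flow back
refreshed_params = central.fit()           # center re-calibrates the model
```

The design choice the sketch illustrates is the division of labor: learning pools data centrally (where experience accumulates fastest), while scoring runs at the edge (where latency matters most).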
There is a growing recognition that for enterprises to extract value from IoT data, an AoT capability is required. Maximizing value creation, however, is about far more than just the technology. New business models need to be created and new skill sets need to be acquired. Gartner has estimated that through 2018, 80% of IoT implementations will squander transformational opportunities by focusing on narrow use cases and analytics. Broader thinking requires an ecosystem approach to AoT, with data gathered and integrated across multiple sources. McKinsey estimates that 40 to 60 percent of IoT value will derive from interoperability between “things” on the Internet. The winners in the digital economy will be those that exploit these opportunities.
Stephen Brobst is the Chief Technology Officer for Teradata Corporation. Stephen performed his graduate work in Computer Science at the Massachusetts Institute of Technology, where his Master's and PhD research focused on high-performance parallel processing. He also completed an MBA with joint course and thesis work at the Harvard Business School and the MIT Sloan School of Management. Stephen is a TDWI Fellow and has been on the faculty of The Data Warehousing Institute since 1996.