Will Data Historians Die in a Wave of IIoT Disruption?

If data historian vendors want to avoid disruption, expand the user base, and deliver on the promise of IIoT use cases, solutions must bring together all types of data into a single environment that can drive next-generation applications.

Recent LNS Research survey results have shown that the Industrial Internet of Things (IIoT) is at the tipping point on the technology adoption curve, with 34 percent of companies either currently adopting or planning to adopt IIoT technology in the next year.

In consumer markets, the Internet of Things (IoT) has already disrupted incumbents, with Uber, Airbnb, and Nest commonly cited as $1B+ examples. In the industrial space, it is less clear where and how quickly disruption will occur. There are several reasons it will likely come more slowly, chief among them asset and technology refresh rates: these continue to be measured in decades, not years, with solution selection processes measured in months, not minutes.

Even so, one area of the industrial software landscape that many believe is ripe for disruption is the data historian. The data historian emerged out of the process industries in the early 1980s as an efficient way to collect and store time-series data from production. Traditionally, values like temperature, pressure, and flow were associated with physical assets, time-stamped, compressed, and stored as tags. This data was then available for analysis, reporting, and regulatory purposes.
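The tag model described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual format: the `Sample` record and the simple deadband filter are assumptions standing in for the exception/compression algorithms real historians apply per tag (such as swinging-door compression).

```python
from dataclasses import dataclass

@dataclass
class Sample:
    tag: str        # e.g. "Reactor1.Temperature" (hypothetical tag name)
    timestamp: int  # Unix epoch seconds
    value: float

def deadband_compress(samples, deadband):
    """Store a sample only if it differs from the last stored value
    by more than `deadband` -- a simplified stand-in for historian
    exception/compression filtering."""
    stored = []
    last = None
    for s in samples:
        if last is None or abs(s.value - last.value) > deadband:
            stored.append(s)
            last = s
    return stored

raw = [Sample("Reactor1.Temperature", t, v)
       for t, v in enumerate([100.0, 100.1, 100.1, 102.5, 102.6, 98.0])]
kept = deadband_compress(raw, deadband=1.0)
print(len(raw), "raw ->", len(kept), "stored")  # 6 raw -> 3 stored
```

Filtering like this is why historians can hold years of per-second data: steady-state readings collapse to a handful of stored points.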

The data volumes involved are substantial: a modest 5,000-tag installation that captures data once per second can generate roughly 1 TB per year. For this time-series workload, proprietary historian systems have proven superior to open relational databases, and the data historian market has grown continually over the past 35+ years.
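A quick back-of-envelope calculation makes the scale concrete. The bytes-per-sample figure at the end is simply what the article's 1 TB/year implies after compression, not a vendor specification:

```python
TAGS = 5_000
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# One sample per second per tag
samples_per_year = TAGS * SECONDS_PER_YEAR
print(f"{samples_per_year:,} samples/year")  # 157,680,000,000 samples/year

# Storage per sample implied by the article's 1 TB/year figure
implied_bytes_per_sample = 1e12 / samples_per_year
print(f"~{implied_bytes_per_sample:.1f} bytes/sample")  # ~6.3 bytes/sample
```

Roughly 158 billion samples a year squeezed into a terabyte explains why purpose-built compression and storage formats beat general-purpose relational schemas for this workload.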

The future may seem very bright for the data historian market, but there is disruption coming in the form of IIoT and industrial Big Data analytics.

As these systems have been rolled up from asset- or plant-specific applications into enterprise applications, the main use cases have expanded slightly but generally remained the same. Although there is undisputed incremental value in enterprise-level data historians, it falls well short of the promise of IIoT.

In a recent post on Big Data analytics in manufacturing, I argued that Big Data is just one component of the IIoT platform, and that volume and velocity are just two of Big Data's components. The third (and most important) is variety: the mix of structured, semi-structured, and unstructured data. In this view of the world, data historians provide volume and velocity, but not variety.

If data historian vendors want to avoid disruption, expand the user base, and deliver on the promise of IIoT use cases, solutions must bring together all three types of data into a single environment that can drive next-generation applications that span the value chain.

It is unlikely that the data historian will die any time soon. It is, however, highly likely that disruption is coming, which makes the real question twofold: Will the data historian be a central component of the IIoT and Big Data story? And which type of vendor is best positioned to capture future growth: a traditional pure-play data historian provider, a traditional automation provider with data historian offerings, or a disruptive IIoT provider?

If the data historian is going to take a leadership role in the IIoT platform and meet the needs of end users, providers in the space will have to develop next-generation solutions that address the following:

  • How to provide a Big Data solution that goes beyond semi-structured time-series data and includes structured transactional system data and unstructured web and machine data.
  • How to transition to a business/pricing model that is viable in a world of cheap sensors, ubiquitous connectivity, and cheap storage.
  • How to enable next-generation enterprise applications that expand the user base beyond process engineers.

And if it is a traditional vendor (pure-play or automation) that delivers a successful next-gen solution, the company must be willing to disrupt existing cash cows (highly profitable software or hardware offerings) and extend beyond its traditional comfort zone (time-series data and control applications). Both are huge demands given the conservative nature of the industry.

But even so, this isn’t impossible when you consider the financial momentum some disruptors have (Splunk reached more revenue in seven years than OSIsoft did in 35) and the executive focus of some early-moving traditional incumbents (GE CEO Jeff Immelt has bet billions on IIoT and Big Data analytics with the GE Digital reorganization).

>> Matthew Littlefield, matthew.littlefield@lnsresearch.com, is president and principal analyst of LNS Research.
