The Internet of Things (IoT) is approaching a tipping point. The IoT market is projected to grow from $33 billion in 2013 to $71 billion in 2018, according to Juniper Research. Cisco predicts that some 25 billion devices will be connected by 2015, and 50 billion by 2020. IoT will enable smarter automation that allows “things” to share information, learn about their surroundings, and auto-tune themselves to achieve optimum throughput and minimal downtime.
With everything from thermostats to pipelines connected and communicating, the question most people fixate on is: How will we be able to collect and capitalize on the resulting mass of machine data?
But before you start wondering about how to tackle all that data, it’s important to first understand how the extension of Industrial IoT beyond building automation, Wi-Fi and central data stores is raising bandwidth and connectivity challenges for telemetry systems. Remote device communications typically leverage a mixture of wireless, radio, fiber optic, satellite and telephone services. The bandwidth constraints of these technologies can significantly degrade the completeness and timeliness of the data communicated through them. Other factors, such as physical obstructions, weather and environmental conditions, can also cause a loss of connectivity and therefore a loss of data.
Industry also faces a major challenge in prioritizing the storage and communication of collected data. After all, not every device produces enough data, or sufficiently high-priority data, to warrant constant live streaming to central storage. It is therefore essential to preserve data at the point of collection. Doing so protects against data loss during connectivity outages, and it ensures that lower-priority machine data can still be compiled and communicated in batches over time.
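The store-and-forward pattern described above can be sketched in a few lines. The class below is a simplified illustration, not a production historian: the `send` callable, the priority flag and the batch threshold are all assumptions made for the example. High-priority readings are streamed immediately when a link is up; everything else is preserved locally and flushed in batches.

```python
from collections import deque

class StoreAndForwardBuffer:
    """Illustrative sketch: preserve readings at the point of collection,
    stream high-priority data when connected, batch the rest for later."""

    def __init__(self, send, batch_size=3):
        self.send = send            # assumed uplink: callable(record) -> bool
        self.batch_size = batch_size
        self.backlog = deque()      # stands in for durable local storage

    def record(self, reading, high_priority=False, connected=True):
        if high_priority and connected and self.send(reading):
            return                  # streamed live to central storage
        self.backlog.append(reading)        # preserve locally first
        if connected and len(self.backlog) >= self.batch_size:
            self.flush()

    def flush(self):
        while self.backlog:
            if not self.send(self.backlog[0]):
                break               # link dropped mid-flush; keep the rest
            self.backlog.popleft()
```

During an outage, readings simply accumulate in the backlog; once connectivity returns, a call to `flush()` drains them in order, so nothing recorded while the link was down is lost.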
Data collected at the device or sensor must traverse an increasing distance, and a growing number of paths and network elements, before it reaches enterprise storage for deep analysis. With added distance and architectural complexity come more potential points of failure or data loss, potentially limiting the ultimate value of the IoT. With these considerations in mind, it becomes easy to understand why local data storage may be the key to successfully implementing and capitalizing on the IoT. A tiered architecture is ideal to support local data collection, storage and communication—think of it as multiple mini-data centers.
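The "mini-data center" idea can be made concrete with a toy model. In the sketch below, each tier (names like "device", "site" and "enterprise" are illustrative assumptions) retains its own local copy of what it collects and forwards the record toward the center, so an outage at any link leaves the lower tiers' data intact.

```python
class Tier:
    """Illustrative sketch of a tiered collection architecture: every tier
    stores locally, then forwards upstream toward enterprise storage."""

    def __init__(self, name, upstream=None):
        self.name = name
        self.upstream = upstream    # next tier toward the enterprise center
        self.store = []             # local retention at this tier

    def ingest(self, record):
        self.store.append(record)           # collect and store locally first
        if self.upstream is not None:
            self.upstream.ingest(record)    # then forward toward the center
```

A real deployment would forward summaries or batches rather than every raw record, but the structural point is the same: each tier is a self-sufficient collection point, not a mere relay.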
Local historians that collect data at the point of creation provide an economical way to support IoT data collection across a dispersed architecture. Ideally, a local historian can buffer non-deliverable data for later transmission, support flexible delivery, and identify an alternate delivery channel when the primary one is unavailable. Plus, local historians can be tapped on-site for troubleshooting devices or performing basic operational data mining in the field.
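The buffering-plus-failover behavior described above can be sketched as follows. The channel names and the `send(record) -> bool` signature are assumptions made for the example; a real historian would back the buffer with durable on-disk storage rather than a Python list.

```python
class LocalHistorian:
    """Illustrative sketch: try delivery channels in priority order,
    buffer anything that cannot be delivered, and retry later."""

    def __init__(self, channels):
        # channels: ordered (name, send) pairs, primary first.
        self.channels = channels
        self.buffer = []            # stands in for durable local storage

    def deliver(self, record):
        for name, send in self.channels:    # primary, then alternates
            if send(record):
                return name                 # delivered on this channel
        self.buffer.append(record)          # hold until some link returns
        return None

    def retry_buffered(self):
        pending, self.buffer = self.buffer, []
        for record in pending:
            self.deliver(record)            # failures re-enter the buffer
```

If the primary radio link is down but a satellite channel is reachable, delivery fails over transparently; if every channel is down, the record waits in the buffer until `retry_buffered()` succeeds.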
Though local historians will prove important to IoT success, they will still need to feed up to centralized or enterprise data repositories for long-term storage and more in-depth analysis. Not surprisingly, we are seeing these repositories migrating to the cloud for flexibility and the ability to provide additional bandwidth as needed. The power of cloud computing will make the required storage and processing power available to handle the zettabytes of data resulting from the continuous build-out of IoT systems.
As the IoT continues to proliferate across consumer, enterprise and industrial installations, we will likely see distributed data architectures become the norm, with local historians playing a key role in this increasingly dispersed, foundational architecture. While those who insist upon a centralized architecture will likely fail or at least struggle to support and leverage the deluge of connected devices, those willing to innovate and address the challenges of a distributed architecture will be able to capitalize on the connected world.