Traditionally a sleepy, often sparsely populated part of the production floor landscape, edge computing is poised to become a hotbed of intelligence-gathering activity as manufacturers begin to architect and execute on their plans for the Industrial Internet of Things (IIoT).
The concept of edge computing has existed in some form or another for decades on the plant floor, but much of the technology—think PLCs and data historians—has been limited to collecting data and providing visibility into the operations of a single, closed system. In addition, most traditional edge devices haven’t been powerful enough to support any kind of meaningful, extended analysis, and the type of data collected—periodic temperature checks on machinery or the recording of drill bit speeds, for example—often remains untapped because it’s difficult to access, let alone pair with additional resources that could deliver deeper operational insights.
Yet a number of factors are converging to transform the role of edge computing in manufacturing. Black boxes and other connected hardware devices are exponentially increasing in processing power, connectivity capabilities and memory, creating an edge platform that can support more sophisticated data collection and analysis on a local level in lieu of having to transmit data to a centralized processing system in the cloud or an off-site data center. In addition, as production floor machinery and remote assets become interconnected as part of a global enterprise and the broader IIoT, the amount and variability of data being harvested provides an unprecedented opportunity for manufacturers to generate new business insights that would be impossible to come by with traditional plant systems.
“The edge has been there all along, but as the piece parts shrink in size, increase in performance and come down in cost, there’s an opportunity to have intelligence on the edge that is game changing,” says David Houghton, director of business development at Bright Wolf, a provider of an M2M and IoT application development and deployment platform. “While the edge has been lit up with SCADA systems, that light is only available to that particular department, not necessarily to the broader enterprise. By connecting systems, you can light up the entire enterprise and keep your finger on the pulse of an organization, driving efficiencies, improving operations and boosting safety.”
In fact, IoT-connected devices, including those on the shop floor, are collecting a burgeoning treasure trove of data—200 exabytes in 2014, expected to grow roughly eightfold by the end of the decade to surpass 1.6 zettabytes, according to research firm ABI Research. Most of the data is still being captured locally, not in the cloud, and that data has remained largely inaccessible for analytics, ABI analysts note. However, ABI research shows that advanced computing capabilities are paving the way for edge analytics that will replace gut-level decision-making with empirically driven intelligence.
ABI is forecasting the market value for IoT analytics to hit $23 billion by the end of the decade, fueling a shift from what it calls basic descriptive and diagnostic analytics like “what has happened” and “why did it happen” to more advanced intelligence that will drive predictive and prescriptive analytics. As edge analytics take root in industrial environments like manufacturing and the oil and gas sector, companies are poised to reap a number of benefits. These include reducing the amount—and subsequent cost—of network data transmission; improving security; and enabling a lower-latency response to real-time events like course correcting a piece of machinery on the production line or switching between physical assets in the field.
“The tyranny of cost, latency and security are driving the edge,” Houghton says. “Time is valuable and the network fees to pass data back and forth aren’t free. Companies need to have a more intelligent edge.”
Driving plant-floor optimization
Cisco advocates the growing importance of edge computing for IoT applications in the manufacturing and oil and gas industries, in particular. Consider that offshore oil platforms generate between 1 and 2 TB of data daily, much of it time-sensitive data related to platform production and drilling safety. Most providers in this space transmit data via satellite connections with data speeds ranging from 64 kbps to 2 Mbps, which means it could take up to 12 days to send one day’s worth of oil platform data to a central repository, not to mention the high cost of such a transmission, according to Cisco research. By leveraging edge computing and analytics as part of Cisco’s fog computing offerings, oil and gas operators could analyze data close to where it’s being collected, allowing them to quickly initiate a proper response to an equipment problem or transmit only the relevant data to a centralized repository for further analysis, Cisco officials say.
In fact, a recent Cisco survey shows business leaders increasingly recognizing the importance of edge computing/analytics as it applies to meeting key IoT business objectives. Forty percent of respondents said that most data coming off of IoT solutions will be processed at the edge of the network using intelligent devices as well as appliances. “The edge is where you handle real-time data flow and where you want to control a machine and take action,” says Paul Didier, Cisco’s manufacturing solutions architect.
On the plant floor, more potent edge computing can facilitate production line optimization and support remote and preventive maintenance. Collecting and analyzing data from multiple CNC machines on a plant floor could help determine that a particular part isn’t meeting performance standards and should be swapped out. At the same time, collecting information on voltage changes or actual energy consumption and then combining that data with similar intelligence on machines in geographically dispersed plant locations could yield patterns that would result in preventive maintenance steps or in shifting production among lines that weren’t running at full capacity. According to a Cisco/SCM World survey, responding manufacturers projected a 48 percent reduction in unplanned downtime by tapping into these kinds of systems.
“In today’s world, profit margins are thin and people want to reduce costs,” says Daniel Liu, business development manager for Moxa, a provider of industrial networking and automation equipment. “People want to implement edge computing for data acquisition and to optimize the efficiency of their production lines. The less equipment failure, the more uptime you get.”
With a more intelligent edge, raw data collected from production equipment could be analyzed locally, minimizing data transmission costs. Historically, edge devices have only had enough processing power to collect data locally, requiring them to transmit back to a remote, centralized database for further processing and analysis, Liu says. “Companies send raw PLC data through a wireless network to a central database and, if they’re sending it over a 4G or 3G cellular network, every packet costs money, and if it’s sent over a satellite connection, it costs even more,” he explains. “If you have edge processing power, you can collect the data and do the necessary aggregation and only send the processed data back to a central location. That saves a tremendous amount of money.”
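The aggregation Liu describes can be sketched in a few lines. The sampling rate, field names and readings below are illustrative assumptions, not Moxa’s actual API—the point is simply that one summary record replaces many raw samples on a metered uplink:

```python
from statistics import mean

def aggregate_window(samples):
    """Condense a window of raw sensor samples into one summary record."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": round(mean(samples), 2),
    }

# Hypothetical raw PLC temperature readings, sampled once per second.
raw = [71.2, 71.4, 70.9, 72.1, 71.8, 71.5]

summary = aggregate_window(raw)
# Six raw samples become a single record for transmission,
# cutting the bytes sent over a cellular or satellite link.
```

At a one-second sampling rate and, say, one summary per minute, the edge device sends one record where it would otherwise send sixty—the kind of reduction that adds up quickly when every packet is billed.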
Beyond cost savings, edge computing can help reduce latency and enable real-time decision-making—both critical factors to maximizing equipment uptime. If a remote wind turbine experiences a fire, for example, companies can’t afford the wait between collecting data and sending it back to the cloud for analysis before they take action. “You can’t lose that real-time capability, particularly with industrial applications where things have to happen in milliseconds,” says Matt Newton, director of technical marketing at Opto 22, a provider of automation equipment, including the SNAP PAC System, which provides edge computing capabilities. “You can’t have latency built into the Internet.”
Though SCADA and other automation systems have advanced manufacturers’ ability to solve operational problems, they tend to be domain-specific, rely heavily on proprietary technologies, and are typically bound to a silo, which limits access to the data they generate to a select few subject matter experts. Extending their reach through edge computing and IIoT enables manufacturers to bring in other data elements from additional pieces of equipment or from other resources, creating a foundation for smarter decision-making. For example, instead of logging temperature on a piece of machinery every few seconds and sending an alert when it hits 250 °C, having access to other data resources provides context, so the operator is only alerted when something goes beyond the range of normal. “Bringing disparate data together changes the role of the operator and enables better decisions vs. just having general reactions,” says Mark Bernardo, leader of professional services for America at GE Digital. “By capturing data from different sources and processes, you can do analysis and make it part of a living system used for decision support instead of just collecting and alerting on data.”
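The context-aware alerting Bernardo describes can be sketched as a check against a machine’s own recent baseline rather than a fixed 250 °C trip point. The multiplier and sample readings here are assumptions for illustration:

```python
from statistics import mean, stdev

def should_alert(reading, history, k=3.0):
    """Flag a reading only when it falls outside the machine's normal band,
    defined here as k standard deviations around the recent baseline."""
    if len(history) < 2:
        return False  # not enough context to judge what "normal" is yet
    baseline = mean(history)
    spread = stdev(history)
    return abs(reading - baseline) > k * spread

# Hypothetical recent temperature readings for one machine, in °C.
history = [248.0, 251.0, 249.5, 250.5, 250.0]

should_alert(250.8, history)  # within normal variation: no alert
should_alert(262.0, history)  # well outside the band: alert
```

A fixed-threshold rule would either fire on the harmless 250.8 °C reading or miss a genuine excursion on a machine that normally runs cooler; anchoring the alert to each machine’s observed range is what turns raw logging into decision support.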
Making the edge work
Creating an intelligent edge requires more than advanced processing horsepower and additional storage packed into a refashioned black box. Along with the hardware, an effective edge device needs to encompass connectivity capabilities so it can ingest data from myriad devices using any number of proprietary process automation protocols and open communications standards. There is also a need for pre-processing capabilities for normalizing and managing disparate data types as well as analytics capabilities fashioned for edge resources.
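That pre-processing step can be sketched as mapping each protocol’s native payload onto one shared record shape. The payload layouts below are hypothetical simplifications—real Modbus and OPC UA stacks differ—but they show why normalization has to happen before data from different sources can be compared or aggregated:

```python
from datetime import datetime, timezone

# Hypothetical payloads as they might arrive from two different protocols.
modbus_reading = {"reg": 40001, "val": 712, "scale": 0.1}   # scaled integer
opcua_reading = {"nodeId": "ns=2;s=Temp1", "value": 71.4}   # native float

def _now():
    return datetime.now(timezone.utc).isoformat()

def normalize_modbus(raw):
    """Apply the register's scaling factor and map to the common schema."""
    return {"source": "modbus", "tag": str(raw["reg"]),
            "value": raw["val"] * raw["scale"], "ts": _now()}

def normalize_opcua(raw):
    """Map an OPC UA-style value onto the same common schema."""
    return {"source": "opcua", "tag": raw["nodeId"],
            "value": float(raw["value"]), "ts": _now()}

# Both readings now share one schema and can feed the same analytics.
records = [normalize_modbus(modbus_reading), normalize_opcua(opcua_reading)]
```

Without this step, the Modbus device’s scaled integer (712 meaning 71.2 °C) and the OPC UA float would be incomparable, and any analytics built on top would have to know every source protocol’s quirks.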
Moxa’s edge computers are designed to handle a lot of that heavy lifting, enabling programmers to focus on driving analytics and business insights. The company’s embedded RISC computers feature a wide selection of wireless and cellular connectivity options and can be integrated with all standard automation protocols, including fieldbus, Modbus and SNMP. In addition, the Moxa edge platform delivers secure, encrypted Internet communications and offers an open programming platform to support custom applications.
“We bring everything together in an edge computer, including how to communicate through 4G or fieldbus and how to aggregate data so you don’t have to worry about any of that connectivity stuff,” Liu explains. “All of that is a distraction to programmers—their focus for IIoT should be on calculating and doing analysis on raw data, not dealing with fieldbus connectivity.”
ISS Connectivity sees its opportunity as helping manufacturers modernize existing assets, both in the plant and in the field, with next-generation edge computing capabilities for IIoT deployments. The company’s DeviceLynk installs an agent on any number of supported edge devices, enabling them to gather data from various field assets and industrial equipment using standard protocols. Once the data is collected, the agent pushes it to the corresponding cloud-hosted DeviceLynk application, which is used to create dashboards for actionable intelligence, according to Adam Strynadka, managing director of DeviceLynk. The firm offers a handful of dashboard and analytics solutions tuned for specific industries, including environmental monitoring, energy management, drilling management and water monitoring.
“There are a lot of things running this world—pumps, controllers and PLCs—that are 10 or 20 years old and they still have a useful life,” Strynadka says. “We’re about strapping on modern-day functionality like store and forward capabilities and security to unlock dark data and see what can be done with it. It’s the least expensive and most easily accessible way to gain value from IoT today.”
Like DeviceLynk, GE Digital doesn’t believe IIoT ultimately comes down to a choice between cloud and edge computing, but rather a melding of both. Edge capabilities come into play to filter and condense massive data sets for relevant intelligence that can then be sent to the cloud for further analytics processing. The edge is also essential for normalizing aggregated data into a common data model. “A lot of people look at this as an ‘either or’ and you need to make it an ‘and’ proposition,” Bernardo says. “The edge is meant for enabling reactive decisions and the cloud is better suited for proactive analysis.”