As talk of how the Industrial Internet of Things (IIoT) will create greater efficiency in manufacturing takes center stage in conference rooms and boardrooms across the world, operations managers have been left scratching their heads as they try to figure out how all of these “things” will work together.
Here’s the problem on the plant floor: If you want to connect a specific flowmeter to the network, for example, you have to have a driver for that fieldbus. Considering there are usually many different fieldbus flavors floating around the factory floor, connecting different devices is hard to do. On top of that, programmable logic controllers (PLCs) are still proprietary and require programming to integrate with the things they control. And communicating to the cloud is not a plug-and-play scenario—at least not yet.
Here’s the good news: This year we’ve seen industry groups collaborating to close the IIoT “communication gap” by getting separate packaging, control and communication standards to connect. There is also a whole landscape of lightweight messaging middleware emerging designed to link SCADA systems and devices via a publish-subscribe model that relies on infrastructure, not applications. In addition, data sharing platforms that couple communication and Big Data analytics offer up an IIoT framework for creating a scalable, high-performance system that maintains quality of service.
All of this serves as a springboard for what could be easy IIoT connectivity.
Still, some industry observers might look at the evolving options and think the industry is just setting itself up for a new kind of “fieldbus war” in which there are competing IIoT communication protocols. But the developers of these technologies are focused on open architectures, an approach that takes a page from the IT playbook.
“The IT world is well standardized in that I can have a PC, a Mac or Linux, and I use the same connection to the Internet to pick up a web page. It all looks the same no matter what I’m using,” says Bryan Griffen, group engineering manager at Nestlé USA and chairman of the Organization for Machine Automation and Control (OMAC). “The automation world does not have that. We have vendor-driven technology platforms that don’t always interact.”
An open architecture relies on more than just standards, however. It could also require open access to data that is stored in a non-proprietary format, and openness to connect to anything in a facility regardless of the vendor. These issues are being tackled from different angles. But regardless of the approach, the goal is the same—to take the guesswork out of IIoT implementations without reengineering any industrial equipment.
Plant floor lingua franca
Industry standards have popped up on the plant floor over the years as a way to ease integration. But they are very vertical in nature. The Packaging Machine Language (PackML), for example, developed by OMAC, is a technical standard (ISA-TR88.00.02) for consistent control of packaging machines. PackML defines machine modes, states and tag naming conventions, but it does not specify a communications protocol.
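To make the PackML machine states concrete, here is a minimal sketch of the state model the standard defines. The state and command names below come from ISA-TR88.00.02; the transition table is a simplified subset for illustration, not the full state diagram.

```python
# Simplified subset of the ISA-TR88.00.02 (PackML) state model.
# Keys are (current state, command); values are the resulting state.
# "SC" stands for "state complete," the automatic transition out of
# an acting state once its work is done.
PACKML_TRANSITIONS = {
    ("Stopped", "Reset"): "Resetting",
    ("Resetting", "SC"): "Idle",
    ("Idle", "Start"): "Starting",
    ("Starting", "SC"): "Execute",
    ("Execute", "Hold"): "Holding",
    ("Holding", "SC"): "Held",
    ("Held", "Unhold"): "Unholding",
    ("Unholding", "SC"): "Execute",
    ("Execute", "Stop"): "Stopping",
    ("Stopping", "SC"): "Stopped",
}

def next_state(state: str, command: str) -> str:
    """Return the next PackML state, or raise if the command is illegal."""
    try:
        return PACKML_TRANSITIONS[(state, command)]
    except KeyError:
        raise ValueError(f"command {command!r} not allowed in state {state!r}")
```

Because every compliant machine exposes the same states and tag names, a line controller or MES can supervise packaging machines from different vendors with one piece of logic.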
OPC Foundation’s Unified Architecture (UA), on the other hand, is an industrial interoperability framework that delivers information modeling with integrated security, access rights and machine-to-machine (M2M) communication. OPC and PLCopen—the group focused on IEC 61131-3, the global standard for industrial control programming—recently worked together to define a set of function blocks that map IEC 61131-3 controls programming to the OPC UA information communication model.
Now, the OPC, PLCopen and OMAC are collaborating to create a companion specification that creates a common way to communicate across disparate standards. This plant floor “lingua franca” will be able to get PackML information onto the network using OPC UA so that it can be picked up by machines, manufacturing execution systems (MES), the cloud and data storage. From there, that data is handed over to PLCopen to codify it into function blocks using the open programming standards for PLCs.
“I spend so much money on different systems on the shop floor that I can’t even begin to worry about getting data into the cloud. It’s too expensive because I don’t have a common protocol,” Griffen says. “That’s what OMAC, PLCopen and OPC are trying to [solve].”
It is not necessarily another protocol that the trio is creating, but rather new rules of engagement. The companion spec is a set of guidelines that describe how data is shared if you want machines to talk to each other. “The idea is not to reinvent the wheel; it is to say these standards exist, let’s use them before people just try to invent a new standard,” says John Kowal, director for B&R Industrial Automation and a member of the OMAC board of directors.
A task force has been formed to deliver the complete spec by January 2017. Taking it a step further, Kowal, who is also co-chairing a new group within the Industrial Internet Consortium (IIC) called the Smart Factory Task Group, is trying to make sure the efforts underway end up in a smart factory architecture. The consortium covers everything from agriculture to mass transit and power generation. “We have needs in manufacturing that they don’t have in other aspects of IIoT,” he says. “We deal with determinism, speed, data integrity. We want to make sure these things are taken into consideration.”
The middle man
As industry groups hustle to layer new forms of open access on top of existing industrial equipment, suppliers from the IT side are taking their place in the IIoT race by leveraging middleware messaging technologies.
Message Queuing Telemetry Transport (MQTT), for example, is a messaging protocol that defines how to transport data. MQTT is one of a handful of messaging methodologies that can be used for IIoT apps. Other messaging technologies include Hypertext Transfer Protocol (HTTP), Advanced Message Queuing Protocol (AMQP), Constrained Application Protocol (CoAP), Representational State Transfer (REST), Java Message Service (JMS) and Data Distribution Service (DDS).
Some of these are very industry- or application-specific. AMQP, for example, is a message-centric protocol for the financial sector. CoAP is a document transfer protocol designed for use with very simple electronic devices. But MQTT, created in 1999 to connect oil pipelines over unreliable satellite networks, is now a standard from the Organization for the Advancement of Structured Information Standards (OASIS).
MQTT is lightweight, open, easy to implement and ideal for constrained environments, like the factory floor, where network bandwidth is at a premium. Its publish-subscribe pattern provides one-to-many message distribution: devices publish information to a server that functions as a message broker, and the broker pushes that information to every client that has subscribed to the corresponding topic. Publishers and subscribers never connect to each other directly; they are matched only by the topics they publish and subscribe to.
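The publish-subscribe pattern described above can be sketched in a few lines. This is a toy in-process “broker,” not MQTT itself—a real deployment would use an MQTT client library (such as Eclipse Paho) against a broker like Mosquitto—but it shows how the broker, not the publisher, handles one-to-many distribution.

```python
from collections import defaultdict

# Toy in-process broker illustrating the publish-subscribe pattern:
# clients subscribe to topics, publishers never address clients directly,
# and the broker fans each message out to all matching subscribers.
class ToyBroker:
    def __init__(self):
        self._subs = defaultdict(list)  # topic -> list of subscriber callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        for cb in self._subs[topic]:  # one-to-many distribution
            cb(topic, payload)

broker = ToyBroker()
received = []
# Two independent subscribers on the same (hypothetical) topic:
broker.subscribe("plant/line1/flowmeter", lambda t, p: received.append(p))
broker.subscribe("plant/line1/flowmeter", lambda t, p: received.append(p.upper()))

# The publisher knows only the topic, not who is listening.
broker.publish("plant/line1/flowmeter", "42.7 l/min")
```

After the publish, both subscribers have received the reading—the decoupling that lets a new analytics client tap an existing data stream without touching the device that produces it.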
The key here is the decoupling of applications from endpoint devices. “We are not connected to an application,” says Arlen Nipper, president and CTO of Cirrus Link Solutions and the co-creator of MQTT. “It doesn’t matter what you are looking at—it could be Microsoft Azure, an IoT hub, an edge network device—everyone is an MQTT client connected to a server.”
MQTT was originally developed for a SCADA system, and more recently, Cirrus Link has developed MQTT modules for Inductive Automation’s Ignition SCADA system, allowing users to set up an IIoT environment that easily connects to devices without disrupting existing operations. In conjunction with the messaging transport, the company introduced its Sparkplug specification that adds contextual information through metadata that describes what is being transmitted from the device to Ignition.
So is it really that easy? If you ask David Pitzer, director of automation at Tyrion Integration (a system integration services company), the answer is “Yes.”
The integrator was working with Pros, a well testing company that conducts well inspections for oil producers. Typically, Pros goes out to the site and hooks up to the pipes to run fluid through the tester, collecting data on the flow of oil, water and gas to provide performance metrics to the client. For decades, this process was done manually with operators writing down numbers to be put into a spreadsheet.
Tyrion Integration’s Nucleus, which takes the place of cellular modem, PC and HMI, connects the Pros oil well testing equipment to the cloud via MQTT. It works in combination with the Inductive Automation Ignition gateways to let managers see the same real-time data as the operators in the oil field.
In response to Pros’ request for an automated way to collect information, Tyrion developed a cloud-based web application that provides real-time remote monitoring and control of each test unit. The system, which runs on a solar-charged battery, includes Tyrion’s Nucleus device, which takes the place of a cellular modem, PLC, industrial PC and HMI. Nucleus works in combination with the Inductive Automation Ignition gateways, which connect to a cloud-based Ignition server via MQTT. Now, managers in the office can look at the same real-time data as the operators in the field. And clients can see well tests as they are happening rather than waiting a few days for results.
Unlike bandwidth-intensive client/server setups, the MQTT publish-subscribe method moves only the necessary data. “The MQTT client doesn’t say, ‘This is all of the information I have.’ It says, ‘I’m looking for this information’ and the broker will funnel the related data back to it,” Pitzer says.
What about the REST?
While there are many options for IIoT integration, Benson Hougland, vice president at Opto 22—a maker of controllers, I/O, solid-state relays and software that link electrical, mechanical and electronic devices to networks—notes that people will migrate to three methods: OPC UA, MQTT and the RESTful application program interface (API). “OPC is in the plant, MQTT is useful in the cloud, but RESTful API is used all over the web,” he says.
“APIs allow people to quickly access data from legacy systems,” Hougland says. “Rather than going through other layers like OPC, you can make a direct call.”
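The “direct call” idea can be sketched end to end with nothing but the Python standard library: a controller exposes an I/O point as an HTTP endpoint and any client reads it with a plain GET, no OPC layer in between. The endpoint path and JSON shape here are invented for illustration (Opto 22’s actual REST API differs), and the server is a stand-in for the controller.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Stand-in for a controller that exposes its I/O over a RESTful API.
# The path "/api/v1/analog/flow1" is a hypothetical endpoint.
class IoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/v1/analog/flow1":
            body = json.dumps({"name": "flow1", "value": 42.7}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), IoHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "direct call": one GET, one JSON document, no middleware stack.
url = f"http://127.0.0.1:{server.server_port}/api/v1/analog/flow1"
reading = json.loads(urlopen(url).read())
server.shutdown()
```

Anything that speaks HTTP—a dashboard, a spreadsheet, a cloud function—can make this same call, which is why REST is attractive for pulling data out of legacy systems quickly.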
And what about feeding the IIoT analytics beast? Deciphering Big Data from all the little data the “things” generate is an important piece of the puzzle. That’s where some of these IIoT platforms, like PrismTech’s Vortex Intelligent Data Sharing Platform, come in. Vortex enables real-time data sharing between devices and machines based on Object Management Group’s (OMG) Data Distribution Service (DDS). It provides the middleware that sits between edge devices and Big Data analytics, regardless of where the data resides.
PrismTech’s use of DDS means that it is a data-centric model or, in other words, the middleware understands the context of the data and ensures all interested subscribers have the correct view.
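The data-centric distinction can be illustrated with a toy model: instead of relaying a stream of messages, the middleware keeps the latest sample for each keyed instance of a topic, so any subscriber—including one that joins late—can read the current state of the data. Real DDS adds typed topics, QoS policies and discovery; the topic and field names below are invented for illustration.

```python
# Toy sketch of DDS-style data-centric sharing: the middleware holds
# the latest sample per (topic, key) instance rather than forwarding
# an opaque message stream, so every reader sees a consistent view.
class DataSpace:
    def __init__(self):
        self._instances = {}  # (topic, key) -> latest sample

    def write(self, topic, key, sample):
        # A new sample for the same instance supersedes the old one.
        self._instances[(topic, key)] = sample

    def read(self, topic, key):
        # Any subscriber, whenever it joins, sees the current value.
        return self._instances.get((topic, key))

space = DataSpace()
space.write("WellTest", "unit-7", {"oil": 412.6, "water": 80.1})
space.write("WellTest", "unit-7", {"oil": 415.0, "water": 79.8})  # supersedes
latest = space.read("WellTest", "unit-7")
```

This is what it means for the middleware to “understand” the data: it knows which samples describe the same instance and can guarantee subscribers the correct, most recent view.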
In addition, a gateway between DDS and OPC UA is in the works, says PrismTech CTO Angelo Corsaro, explaining it will fill the gap around mapping data from one communication technology to another. It also serves as a reminder that many of these technologies for IIoT interoperability are not competing against each other. They will more likely work in combination.
“People ask, ‘If I have MQTT, why do I need OPC UA?’” says Tom Burke, president and executive director of the OPC Foundation. “Well, as an example, you can plug your digital camera into a laptop [without] the memory card and the raw communication works, but the other side has to understand what that data is in order to do something intelligent with it. OPC UA provides the syntax and the semantics for the information. It is the intelligence on top of the communication protocol.”
Industry observers agree that there is room for all of the IIoT integration options, but ultimately, it is the manufacturers using the technology who will decide the best approach.