Fog Computing: Intelligence at the Edge

Aug. 12, 2016
To achieve the business benefits promised by the Internet of Things, three core problems must first be addressed.

On the train to work, Lee opens an email on her smartphone sent from a programmable automation controller (PAC) operating a surface-mount tool at her factory. The PAC has attached a quality control report to the email that suggests changing the tool’s solder temperature.

To generate that email suggestion, the PAC had securely sent yesterday’s production data to a cloud-based analytics system to compare current and historical data for the machine. Next, it accessed the machine manufacturer’s website and obtained the latest recommended settings. Finally, the PAC built a production efficiency report with a suggested solder temperature for today’s production run that would increase yield by 7 percent over yesterday’s run.

Lee clicks a link in the email and connects to the PAC’s mobile interface over a secure, encrypted channel. She logs in and navigates to the machine’s solder temperature set point, where she enters the recommended value.

All this takes place before she gets to the office.

That PAC operating the surface-mount tool at Lee’s factory sits at the edge of the factory’s network. Systems like these at the network edge are increasingly able to leverage cloud-based resources to perform fog computing, in which computing resources are placed as needed along the path from sensor to cloud so that less data must be sent to the cloud for storage, processing and analysis. As a result, businesses can more quickly identify real opportunities for operational efficiency improvement and meaningful revenue generation.

To foster such business benefits, data from the physical world of machines and equipment must be available to the digital world of the Internet and information technology (IT) systems, quickly, easily and continuously. Successful IoT applications require operational technology (OT) professionals to make data from their systems—which monitor and control the physical world—accessible to the data processing systems of IT professionals.

Once the data is there, cognitive prognostics algorithms running on IT systems can analyze it, refining raw physical data into actionable information that can predict outcomes in real time. The results can be used to improve inventory management and predictive maintenance and reduce asset downtime.

But before such benefits can be realized, three problems need to be solved.

The Big Data problem
As we connect more devices and systems in our plants to the Internet and build out the IoT, a tremendous amount of data will be generated and transmitted—terabytes of data per second. These are volumes of data the digital world has never seen before. This is the Big Data problem.

Moving that much data onto existing network and Internet infrastructures for cloud-based analytics and centralized management will dramatically increase latency. For many industrial IoT applications, that added latency is unacceptable because real-time control and monitoring are mandatory.

For IoT to reach critical mass, intelligence must be pushed to the edge of the network. The network edge is where physical assets (things) such as sensors, actuators and circuits are connected to the network. Fog computing brings computation capabilities closer to the network edge to filter or process data and send only required data to the cloud, thereby decreasing traffic and latency on local networks and the Internet.
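The filtering described above can be sketched in a few lines. This is a minimal, hypothetical example — the deadband value and sample stream are illustrative assumptions, not part of any particular product — showing how an edge node might keep high-frequency samples local and forward only significant changes to the cloud:

```python
def filter_readings(readings, deadband=0.5):
    """Yield only readings that differ from the last forwarded value
    by more than the deadband, suppressing redundant samples."""
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > deadband:
            last_sent = value
            yield value

# A run of raw temperature samples collapses to a few meaningful updates.
raw = [20.0, 20.1, 20.2, 21.5, 21.6, 19.0, 19.1]
to_cloud = list(filter_readings(raw))
print(to_cloud)  # [20.0, 21.5, 19.0]
```

Even this trivial deadband filter cuts the upstream traffic from seven samples to three; a real fog node would layer aggregation, alarming and local control logic on top of the same idea.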

Fog computing also plays a valuable role in efficiency, security and compliance. In addition, industrial fog computing systems remain responsible for the local process control and automation tasks of traditional industrial applications.

The connectivity problem
At the network edge, OT assets like motors, pumps, relays and meters are attached to industrial equipment and machines. These assets translate what’s happening in the physical world (temperature, light, vibration, sound, motion, flow rate) into electrical signals like voltage and current, which are then interpreted by other systems to monitor and control physical equipment and machines. They were not designed to communicate with the IoT.

Assets like these rarely connect to the Internet, let alone speak or understand the protocols and languages the Internet uses, like Internet Protocol (IP) or RESTful APIs. They don’t have a built-in TCP/IP stack or a web server. And they have little or no built-in computing power for the fog computing required to filter volumes of data before sending it to the cloud.

As a result, the Internet and the things we want to connect to it aren’t communicating. There’s a disconnect between the physical world of currents and voltages and the digital world of data and communications.

The IoT architecture problem
In today’s IoT architecture, for a cloud-based server to capture data from an analog sensor, the sensor’s data must be translated to digital data. In some cases, these sensors are physically wired to controllers such as PLCs.

However, PLC hardware, software and programming languages were designed for repetitive, application-specific tasks like process control and discrete automation. They typically use proprietary protocols and languages for communication and programming, lack information security standards like encryption and authentication, and were originally designed as standalone systems without Internet connectivity in mind. Systems that use Internet-compliant communication protocols such as PCs, web servers and databases require vendor-specific and often proprietary middleware or hardware-based protocol gateways to communicate with a PLC.

OPC software is one solution, but classic OPC was built on COM and DCOM, Microsoft’s Windows-only interprocess communication technologies. Most systems and devices connecting to the IoT, such as sensors, relays, smartphones and web servers, are not Windows-based. Android smartphones, for example, run a Linux-based operating system, and Apple’s iOS derives from Unix; neither supports COM/DCOM. OPC UA (Unified Architecture) was released to address this problem, but deployments typically still rely on legacy OPC drivers built on the Windows architecture. One option is to embed an OPC UA server into IoT assets, but the fact remains that modern web servers, databases, smartphones and tablets don’t natively speak OPC UA.

PLCs, OPC servers, proprietary drivers and protocol gateways quickly become layers of complexity that require time, money and specific domain expertise to install and support. With this approach, data is converted so many times that data integrity and security can be jeopardized, and provisioning and troubleshooting are very difficult.

Multiply these issues across the billions of devices we expect to connect using the IoT and you see the communication challenge the IoT faces.

A better way
For the IoT to reach critical mass, layers of complexity must be removed from the communication process between digital systems and physical assets. Today’s Internet uses a common set of protocols, tools and routines designed to make data transportation, acquisition and analysis a seamless process. We can collect meaningful data from the huge installed base of existing things, but it requires a solution that understands both sides of the OT and IT equation, meaning that it must be able to:

  • Translate the physical world of currents and voltages (OT) into the secure RESTful APIs and JavaScript Object Notation (JSON) frames the digital world (IT) understands.
  • Process and filter mountains of data, sending only relevant, decision-ready data to the cloud.
  • Communicate using open Internet protocols such as HTTPS or MQTT over TCP/IP.
  • Provide enough processing power to maintain the closed-loop, real-time control requirements of industrial applications, along with edge computing capabilities.
  • Deliver all of the above in a package suitable for challenging industrial environments with dust, moisture, vibration, electromagnetic interference and wide temperature swings.
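The first and third capabilities above — translating raw electrical readings into JSON and moving them over open Internet protocols — can be illustrated with a short sketch. The channel name, ADC scaling factor and payload schema here are assumptions for illustration only, not a vendor’s actual API:

```python
import json
import time

def to_json_frame(channel, raw_counts, scale=0.01, units="degC"):
    """Convert a raw ADC reading (the OT side) into a JSON frame
    (the IT side). Scaling and schema are illustrative assumptions."""
    return json.dumps({
        "channel": channel,
        "value": round(raw_counts * scale, 2),
        "units": units,
        "timestamp": int(time.time()),
    })

frame = to_json_frame("solder_temp", 24250)
# Yields a string like {"channel": "solder_temp", "value": 242.5, ...}
# which could then be POSTed over HTTPS to a RESTful endpoint or
# published on an MQTT topic such as "factory/line1/solder_temp".
```

The point of the sketch is the boundary crossing: once a reading exists as a JSON frame, any standard web server, database or cloud analytics service can consume it without OPC drivers or protocol gateways.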

The IoT will fall short of its potential until the communication, security and computing technologies of the Internet find their way into computing at the edge. That’s why fog computing can be the sensor on-ramp to the IoT.

Fortunately, Internet technologies are available in some industrial systems today. And some vendors have already started bridging the gap between OT and IT by adding IoT technology like HTTPS and RESTful APIs directly into PACs.

Our shortest path to a successful IoT is to leverage the existing interoperability technologies of the Internet in industrial automation products and applications.
