A Virtual Transformation for Data Centers

Nov. 25, 2015
Like commercial IT centers before them, the data centers behind today’s factories and processing plants are embracing virtualization as security and cloud computing gain importance.

Industrial data centers are increasingly being transformed into flexible operations that provide companies with the versatility needed to meet the changing demands of automated industrial sites. These data centers are following the trends set in commercial IT centers, making virtualization a mainstream technology while focusing on security.

Cloud computing is also being tested as corporations large and small attempt to develop strategies that let them meet current demands while building for the future. That’s no small challenge, given the relatively rapid changes in commercial computing and the far more conservative pace in automation systems.

“IT staffs expect lifecycles of three to five years, while industrial systems typically run 10 to 15 to 20 years,” says Christopher Di Biase, senior consultant for network and security services at Rockwell Automation. “By virtualizing, you can pull your old apps onto new hardware. If IT goes to Windows 10, you can pull programs forward and gain a lot of time. Most industrial people want to run a program 10 or 11 years before they think about changing anything.”

For many companies, the move to virtualization comes when they’re getting ready to update. Often, industrial data centers grow somewhat randomly when new equipment is added to provide a new function. But eventually, as global competitiveness rises, maximizing the efficiency of the factory floor infrastructure becomes more important.


Virtues of virtualization

Companies have a choice between leveraging their existing corporate data centers or establishing operations that are dedicated to the industrial group’s computing needs. Most of the time, companies that opt for the latter approach are using virtualization, deploying smaller versions of configurations that have proven effective in recent years for many IT departments.

“People are no longer asking whether they should virtualize, but how they should do it,” Di Biase says. “One big part of that is that automation facilities are starting to build true data centers instead of cobbling equipment together.”

The benefits can be significant. A primary factor is that high-performance servers can run several programs at once. When companies adopt the virtualization model, they can use a few servers to run a broad range of programs instead of adding a new server every time a feature or function is added. It’s easier to add or upgrade software than to add a new system.
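To make that consolidation concrete, the short Python sketch below lists the guest machines sharing a single physical server, along with their vCPU and memory allocations. It is only an illustration, not drawn from any vendor quoted here; it assumes the open-source libvirt-python bindings on a KVM/QEMU host, and the connection URI is a placeholder.

# Sketch: enumerate the virtual machines consolidated onto one host.
# Assumes the libvirt-python package and a local KVM/QEMU hypervisor;
# the connection URI is a placeholder for whatever host is being managed.
import libvirt

def list_guests(uri="qemu:///system"):
    conn = libvirt.open(uri)  # connect to the hypervisor
    try:
        for dom in conn.listAllDomains():
            state, max_mem_kib, _, vcpus, _ = dom.info()
            running = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "stopped"
            print(f"{dom.name():<30} {vcpus} vCPU  {max_mem_kib // 1024} MiB  ({running})")
    finally:
        conn.close()

if __name__ == "__main__":
    list_guests()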

“Commissioning will be quicker, usually 15-20 percent faster,” says Douglas Bellin, senior manager of global industries at Cisco Systems. “Uptime will increase and getting back up and running will be faster. On the floor, you’ll need less equipment. You don’t need to have cold spares sitting around.”

Once companies decide that it’s time to update their data centers and move to virtualization, they move boldly, suppliers say. Starting out slowly with plans to expand can be more difficult than jumping in with a strategy supported with equipment that meets or exceeds current requirements.

“Most people who decide to start using virtualization buy what they need—three, four or five servers—instead of buying one or two and expanding later,” Di Biase says. “They’ve learned. A lot of people struggled with Ethernet when they tried to do a little at a time instead of making a complete transition.”

Data storage is an essential element. In many industries, it’s vital for documenting each step in the manufacturing process; in others, it’s critical for keeping track of parameters that improve efficiency and set maintenance schedules. Many strategies now link storage and computing modules.

“A lot of people are moving to software-defined storage, taking what was a separate storage appliance and attaching it to a server,” Bellin says. “When the storage array is part of the server, the hardware architecture is simplified. Storage and computing power can be scaled together.”

Many requirements

Speed and bandwidth requirements are being pushed to new heights with the proliferation of IP-enabled devices, from phones to security cameras. Today, data centers receive input from handheld devices used for inventory control and from machines that supply diagnostic data, while also processing feeds from energy management systems and security cameras, to name a few. Often, that means it’s time to upgrade the computing architecture.

Data centers need to have the right access control in place, not just on the room itself, but also at the cabinet level. Source: Belden

“These changes require a data center to be more dynamic and flat in architecture, for example by reducing the number of routing and switching layers between devices, in order to keep latency to a minimum and increase speed,” says Mike Peterson, Belden’s product line manager for data centers. “Doing this requires scalable infrastructure and systems that can grow over time.”

However, while networking demands are soaring, storage isn’t expected to follow suit. The megabytes of data generated by most equipment are minimal in an era of terabyte disk drives.

“There will be more data moving across the network, but I don’t see an explosion in data storage requirements,” Bellin says. “There will be smaller increases in data; all data points can be accessed, stored and analyzed.”

To serve and protect

When data centers handle more of the company’s vital information, protecting it will become more important. Though cybersecurity is getting a lot of attention, physical security and personnel training are also critical elements.

“Within the data center, you need to ensure that you have the right access control in place—not just on the room itself, but also at the cabinet level,” Peterson says. “When there is an issue with the factory network, you might have a combination of factory control engineers, IT engineers and facilities personnel needing access to the space. This increases the potential for human error and puts your enterprise and industrial networks at risk.”

Whether corporations alter their data centers or not, they’re going to have to take steps to improve security. Any vulnerabilities in the industrial operations make it possible for attackers to pirate formulas or view production data. Sophisticated extortionists can also hold companies hostage if they can alter operating parameters.

Most industrial networks are, at some point, linked to corporate architectures, so problems on one side can flow into the other. That poses many design challenges for networking developers. They must create industrial links that support the never-ending reconfiguration of manufacturing cells and upgrading of equipment on the factory floor while improving security.

“When the factory floor is connected to the same network as your enterprise, you need to implement security and access methodologies that work for both,” Peterson says. “There is also a need to keep traffic separate from enterprise and manufacturing—or at least managed. On top of all that, your data center needs to be flexible to handle the growth and changes occurring in the manufacturing environment.”

The tight links between IT and industrial groups were highlighted recently when Honeywell partnered with Intel Security so that Intel’s McAfee technologies could augment the offering within Honeywell’s Industrial Cyber Security Solutions. Intel’s Enterprise Security Manager and Next Generation Firewall are among the tools that will help industrial groups protect networks and keep security features up to date. Constant updating has become a critical concern throughout the industrial environment.

“When companies use virtualization, it’s easier to keep up with patches from Microsoft, Linux or wherever,” Bellin says. “That can also help them implement security, since they’re protecting a single box, not multiple boxes.”
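One practice that commonly pairs with that centralized patching, sketched below purely as an illustration rather than as a procedure described by Bellin, is to snapshot every guest on the host before pushing operating system updates so a bad patch can be rolled back. The sketch again assumes libvirt-python on a KVM/QEMU host with snapshot-capable guests.

# Illustrative sketch: take a point-in-time snapshot of each guest before patching,
# so updates applied to the consolidated host can be rolled back if needed.
# Assumes libvirt-python and snapshot-capable (e.g. qcow2-backed) guests.
import libvirt

SNAPSHOT_XML = "<domainsnapshot><name>pre-patch</name></domainsnapshot>"

def snapshot_all_guests(uri="qemu:///system"):
    conn = libvirt.open(uri)
    try:
        for dom in conn.listAllDomains():
            dom.snapshotCreateXML(SNAPSHOT_XML, 0)  # rollback target for this guest
            print(f"snapshotted {dom.name()} before patching")
    finally:
        conn.close()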

The number of pathways into facilities is increasing as more companies let employees use tablets and smartphones. These handhelds give operators more freedom, but they also bring the potential for hackers or disgruntled employees to steal corporate secrets. Companies can limit their exposure by only allowing these devices to display data.

“If you put apps on tablets or smartphones, it’s hard to prevent data from leaving the four walls,” Bellin says. “If you virtualize, data only exists on the server, so you can keep it from leaving the facility. The tablet or phone only serves as a display.”

Cloudy outlook

While data centers evolve to take advantage of the dramatic improvements in computing horsepower, there’s also growing interest in leveraging the cloud. Suppliers are taking various steps to let industrial teams leverage remote computing power and reduce their capital costs and maintenance expenses.

A number of announcements earlier this year highlight industry’s trend toward cloud computing. GE, for example, recently implemented a Global Discovery Server based on Part 12 of the OPC UA specification, which helps industrial groups link to the Internet and use the cloud. GE’s Proficy Global Discovery Server automatically detects OPC UA devices and connects to them, greatly reducing setup and configuration steps.
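As a rough sketch of what that discovery step looks like from a client’s point of view, the Python snippet below sends a FindServers request to a discovery server and prints what it reports. It uses the open-source python-opcua package rather than GE’s own tooling, and the discovery URL is a placeholder for whatever Local or Global Discovery Server sits on the plant network.

# Sketch of OPC UA server discovery, assuming the open-source python-opcua package.
# The discovery URL is a placeholder for a discovery server on the plant network.
from opcua import Client

DISCOVERY_URL = "opc.tcp://discovery.example.local:4840"  # placeholder address

client = Client(DISCOVERY_URL)
servers = client.connect_and_find_servers()  # issues a FindServers request
for srv in servers:
    # Each result describes one registered server: its application URI and reachable endpoints
    print(srv.ApplicationUri, list(srv.DiscoveryUrls))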

KUKA and TTTech are jointly investing in real-time technology platforms and a startup that will focus on real-time cloud computing. The companies are teaming up “in order to seamlessly link cloud computing with our real-time critical control infrastructure,” says Till Reuter, KUKA CEO.

Though the cloud offers many benefits, it’s far from a universal solution. Some information must be handled in a more timely fashion than remote cloud servers can offer. Given the many highly publicized thefts of photos and other information stored on cloud servers, industrial managers are quite concerned about security.

Some large enterprises are gaining some of the benefits of remote computing by setting up their own cloud services. Even so, it’s often wise to limit cloud computing’s role. In many sites, a stratified strategy based on access times could be a viable way to meet the varied needs of industrial data centers.

“Data that needs to be addressed in seconds or less needs to be on the local computers,” Bellin says. “Data for something like predictive maintenance can be pushed onto a local cloud, a private cloud that’s often within the four walls. This data may also be sent to a secure public cloud.”
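One way to picture that stratification is as a routing rule keyed on how quickly each value must be acted on. The thresholds and tier names in the Python sketch below are hypothetical, chosen only to illustrate the idea Bellin describes.

# Illustrative sketch of a latency-tiered routing rule; thresholds and tier
# names are hypothetical. Values that must be acted on in under a second stay
# on local compute, slower analytics go to a private cloud, and long-horizon
# data may also be pushed to a secure public cloud.
def choose_tier(max_response_seconds: float) -> str:
    if max_response_seconds < 1.0:
        return "local"          # e.g. control loops and interlocks
    if max_response_seconds < 60.0:
        return "private-cloud"  # e.g. predictive-maintenance models
    return "public-cloud"       # e.g. fleet-wide historical analytics

if __name__ == "__main__":
    for name, deadline in [("valve position", 0.1),
                           ("vibration trend", 30.0),
                           ("monthly energy report", 3600.0)]:
        print(f"{name:>22} -> {choose_tier(deadline)}")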

Though public clouds offer benefits, their role may be limited for a while. Facilities in remote, low-cost areas may not have well-developed infrastructure support. In these locations, it can be far more efficient to set up a dedicated data center.

“Off-premises clouds are not for everyone,” Bellin says. “In rural areas, costs are cheap, but Internet connectivity can be spotty. You can do things like data analytics off site, but the hardware that collects the data should probably be on the premises.”

Work together

Whatever the plan for updating the data center segment of a technology portfolio, cooperation is key. Training, security, data availability and links between industrial and IT systems are just a few of the issues that must be well defined. Many personnel must be involved to create strategies that work efficiently and securely day after day while also defining steps that let teams respond quickly when problems inevitably arise.

“When connecting to a corporate data center, operations, engineering and IT personnel need to work together to understand corporate standards, policies and service level agreements,” Peterson says. “This will make changes on the manufacturing floor quicker and easier.”
