Virtualization Reduces Costs, Risk for Power Company

Sept. 2, 2016
Tacoma Power applied virtualization, a best practice long established in the IT field, to the HMI/SCADA systems at its hydroelectric plants and fish hatcheries, and realized benefits that made the effort well worth it.

At many organizations, the automation team is responsible for the automation of industrial control systems (ICSs), including IT-related challenges such as computer systems and networks. Software and hardware system performance, maintenance and upgrades at industrial control facilities are typically a part of automation’s domain. While IT leans on virtualization to ease many of these challenges, this technology is not yet common practice in automation—but awareness of the benefits is increasing.

After researching, testing and piloting, Tacoma Power, which owns and operates hydroelectric plants and fish hatcheries in Tacoma, Wash., was able to virtualize some of its HMI/SCADA systems, ultimately realizing a slew of benefits that improve reliability and reduce long-term costs.

What virtualization provides
Virtual machines use an operating system (OS) or application environment installed on software that imitates dedicated hardware. Virtualization is common in the IT world, with about 75 percent of x86 architecture workloads virtualized on servers, according to Gartner Group. Although fewer than 30 percent of systems are virtualized in automation and controls, the benefits are the same: Server productivity can increase as much as tenfold by consolidating workloads from underutilized servers onto a single server. Other key benefits include:

  • Isolation: Fault and security isolation is at the hardware level. Advanced resource controls preserve performance.
  • Encapsulation: The entire state of the virtual machine can be saved to files. Users can move and copy virtual machines as easily as moving and copying files.
  • Hardware independence: Users can provision or migrate any virtual machine to any similar or different physical server.
  • Partitioning: Users can run multiple operating systems on one physical machine and divide system resources between virtual machines (see the sketch after this list).
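
To make the partitioning and isolation ideas concrete, the short Python sketch below queries a hypervisor host and lists the guest operating systems running on it, along with the vCPUs and memory assigned to each. It uses the open-source libvirt bindings and assumes a libvirt-managed host such as KVM; the article does not say which hypervisor Tacoma Power uses, so treat this strictly as a generic illustration.

# List the virtual machines sharing one physical host and the resources
# allotted to each -- a generic libvirt example, not Tacoma Power's setup.
import libvirt  # pip install libvirt-python

# Read-only connection to the local hypervisor (KVM/QEMU assumed here).
conn = libvirt.openReadOnly("qemu:///system")

for dom in conn.listAllDomains():
    # info() returns [state, max memory (KiB), memory (KiB), vCPUs, CPU time (ns)]
    _state, _max_mem_kib, mem_kib, vcpus, _cpu_time = dom.info()
    running = "running" if dom.isActive() else "shut off"
    print(f"{dom.name():20s} {running:9s} {vcpus} vCPU(s), {mem_kib // 1024} MiB RAM")

conn.close()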

Tacoma Power’s virtualization decision
Tacoma Power serves 160,000 electric utility customers with nine hydroelectric facilities on four rivers. The Power Generation Automation team, with three engineers and one engineering technician, is responsible for integrating, modernizing and maintaining ICSs for 23 hydroelectric generators, four fish hatcheries, three fish collection facilities, and nine unmanned hydro facilities.

For more than 10 years, the team has used GE iFix with about 50,000 data points to monitor and control lake levels and process control variables at the generation facilities, as well as flow, temperature, pump status, fish counting, running gates, and other areas at the fish hatcheries.

In addition to visualization and control, Tacoma Power uses iFix for trending, troubleshooting and continuous optimization. Data collection and management is critical for regulatory reporting as well as day-to-day operations. Operators can also access the HMI screens from remote devices to speed response to hatchery-related alarms.

The automation team considered virtualization of the iFix HMI/SCADA system to improve operations and reduce costs, but was concerned about a lack of familiarity with a thin client environment. “Our team is most familiar with a traditional thick client environment, and the industry is still primarily using thick clients,” says Ozan Ferrin, generation automation engineering supervisor for Tacoma Power. “However, we knew that thick clients also had disadvantages.”

Traditional thick client environments are more prone to failure than thin clients, unless users deploy industrial computers, which can double hardware costs. Restoring or upgrading an entire thick client system can take anywhere from several hours to several weeks. Patches and other software updates can make thick client environments unreliable, and reinstalling or updating the operating system and HMI software can lead to unpredictable results. A large development space is necessary because each node is a separate piece of hardware with its own software installation. Last but not least, thick client environments are more exposed to cybersecurity threats because each node runs a standard Windows operating system.

With a virtual environment, organizations are less reliant on physical hardware. Virtual environments can be duplicated and loaded onto any virtual host system, independent of the underlying hardware. Long-term costs and risks go down, with fewer hardware purchases and fewer hardware failures.

Since thin clients replace the thick clients, no re-imaging is necessary in the event of a failure. “Power companies must be reliable, and our automation systems are key to that reliability,” Ferrin says. “Virtualization makes restoring a system for disaster recovery as simple as loading the entire system image to a virtual host machine. During our decision-making process, the decrease in risk was clear.”

Furthermore, with virtualization, development testing can be handled using snapshots or checkpoints, which allows the team to test patches or other software updates. If any issues occur, the changes can easily be rolled back to a previous state. Upon a restart of the host server, all the virtual environments return to their previous state. In general, security updates, malware protection and antivirus update controls are easier to implement.
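
As an illustration of that snapshot workflow, the hedged Python sketch below takes a snapshot of a virtual machine before a patch is applied and rolls back to it if something goes wrong. It again uses the libvirt bindings, and the VM name "ifix-scada-dev" is hypothetical; the actual hypervisor and procedures at Tacoma Power are not described in the article.

# Snapshot-before-patch workflow -- a generic libvirt sketch, not the
# utility's actual procedure. The VM name below is hypothetical.
import libvirt

SNAPSHOT_XML = """
<domainsnapshot>
  <name>pre-patch</name>
  <description>State captured before applying software updates</description>
</domainsnapshot>
"""

conn = libvirt.open("qemu:///system")
dom = conn.lookupByName("ifix-scada-dev")   # hypothetical development VM

# Capture the VM's state before testing a patch or OS update.
snap = dom.snapshotCreateXML(SNAPSHOT_XML, 0)

patch_ok = False  # result of whatever validation the team runs after patching

if not patch_ok:
    # Roll the VM back to exactly the state it was in before the update.
    dom.revertToSnapshot(snap, 0)
else:
    # Keep the new state; the snapshot can be removed once no longer needed.
    snap.delete(0)

conn.close()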

“Now, we have one central location to manage all software and operating systems,” Ferrin explains. “For all our facilities, our development lab is reduced to a single server, which hosts all OSs. Only the necessary thin clients need to be installed during development for remote access to the virtual machines.”

Duplicating similar systems is as easy as copying and pasting the virtual disk and mounting it to a new virtual system. Additional virtual instances can be created easily if there is a need, such as a dedicated system for fish biologists to remotely access data.
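
That copy-and-mount step could look something like the sketch below: duplicate the virtual disk file, then register a new virtual machine that boots from the copy. The disk paths, VM name and libvirt-style configuration are all illustrative assumptions, not details taken from Tacoma Power's environment.

# Duplicate an existing virtual disk and register a new VM that uses it.
# Paths, names and sizing below are illustrative only.
import shutil
import libvirt

SRC_DISK = "/var/lib/libvirt/images/hatchery-hmi.qcow2"       # existing system
NEW_DISK = "/var/lib/libvirt/images/fish-biology-view.qcow2"  # the copy

# 1. Copying the virtual disk copies the entire installed system.
shutil.copy2(SRC_DISK, NEW_DISK)

# 2. Define a new virtual machine that mounts the copied disk.
DOMAIN_XML = f"""
<domain type='kvm'>
  <name>fish-biology-view</name>
  <memory unit='GiB'>4</memory>
  <vcpu>2</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <driver name='qemu' type='qcow2'/>
      <source file='{NEW_DISK}'/>
      <target dev='vda' bus='virtio'/>
    </disk>
  </devices>
</domain>
"""

conn = libvirt.open("qemu:///system")
new_dom = conn.defineXML(DOMAIN_XML)  # registers the VM without starting it
new_dom.create()                      # boots the duplicated system
conn.close()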

Enhanced support for legacy systems, such as a Windows XP virtual environment, can be loaded onto any virtual host. Virtualization also allows for licensing flexibility because software keys can be used to activate software and can be moved between systems as needed.

Implementation does not come without some upfront cost, however. Virtualization is new to many industrial automation and controls engineers, which means a learning curve for implementation and maintenance. New hires either need to bring knowledge beyond traditional systems or need to be trained.

“Virtualization can be intimidating,” Ferrin concedes, “but it isn’t as hard as automation engineers initially perceive. They can think of it as a learning opportunity. As virtualization continues to grow in our industry, their knowledge and experience increase their professional value.”

The cost of initial setup can be higher than traditional systems as well if a virtual environment requires a more robust server setup. However, according to Ferrin, the long-term decrease in costs and risks far outweighs the initial setup investment.

Moving forward with virtualization
To make the business case for virtualization, Ferrin and his team reviewed the costs and benefits with Tacoma Power management. The justification was clear: a small upfront investment would pay for itself in the long run through speed, uptime and significant risk reduction.

With approval and support to proceed, Tacoma Power added virtualization to its regular budget for system lifecycle replacement, tapped an IT consultant for some expert advice, and planned its deployment process.

The team implemented the virtualization environment in a phased approach. Starting in a development environment, Tacoma Power created a new virtual machine, which could be viewed on two screens with visibility on a terminal. In phase two, the team installed GE iFix with soft licensing in the virtual environment. Phase three involved pilot deployment at the first generating plant. The team incorporated the virtual deployment into the plant’s regular hardware/software upgrade schedule, which eliminated any possibility of extra disruption, downtime or cost.

“With the success of the pilot, we refined our best practices and developed standard operating procedures for deployment,” Ferrin says. “Today, we’re continuing to deploy virtualization across all of our plants as part of the regular hardware/software upgrade schedules.”

Leading the power industry with virtualization, Ferrin expects to complete migration of all power generation automation systems within five years. With the current implementations under its belt, the team has already seen greater reliability; for example, it has had no fan or power supply failures. If a thin client were to fail for some reason, Tacoma Power would not lose any data. Additionally, the team has been able to consolidate other applications into the virtual system for use by other peer groups.

“Virtualization is achievable and worth the effort,” Ferrin contends. “Automation and controls engineers should learn, explore and network with experienced technology professionals for firsthand information. With the right expertise, you can overcome any challenges with virtualization, decrease long-term costs, and reduce risks. At this point, I can’t imagine not having our HMI/SCADA in a virtual environment.”
