Historians were developed to perform operational trending, Middleton says. On the plant floor, data may come from programmable logic controllers or supervisory-control-and-data-acquisition (SCADA) systems. “Then those data were used locally by SCADA to make deterministic decisions, or sent to a higher-level system for analytics,” Wilkins adds.
What kinds of data do historians manage? “You name it,” Wilkins replies. “Lots of different data types. That is generally true for all historian manufacturers.”
Traditionally, those data are deposited into historians through the pull model. “The historian has to know where to get data. Then it extracts and historizes (chronicles) it,” Middleton explains. But through the newer push model, there is tight integration between SCADA systems and the historian, he states. “The primary system knows the historian needs the data, configures that historian and then pushes the data into it.” Wilkins notes that transfer time from data collectors, such as data-acquisition devices, can be milliseconds.
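The difference between the two collection models can be sketched in a few lines of Python. The class and method names here are illustrative only, not any vendor’s API:

```python
import time

class Historian:
    """Toy historian: stores (tag, timestamp, value) samples in memory."""
    def __init__(self):
        self.archive = []

    # Push model: the source system knows about the historian and
    # delivers samples to it as soon as they are produced.
    def ingest(self, tag, value, timestamp=None):
        self.archive.append((tag, timestamp or time.time(), value))

    # Pull model: the historian knows where the data live and polls
    # the source on its own schedule, then historizes what it finds.
    def poll(self, source):
        for tag, value in source.read_all():
            self.ingest(tag, value)

class ScadaSource:
    """Stand-in for a SCADA/PLC data source."""
    def __init__(self, tags):
        self.tags = tags
    def read_all(self):
        return list(self.tags.items())

scada = ScadaSource({"pump1.flow": 42.0, "pump1.pressure": 3.1})
hist = Historian()
hist.poll(scada)                    # pull: historian extracts the data
hist.ingest("pump1.flow", 42.5)     # push: source delivers a sample directly
print(len(hist.archive))            # -> 3
```

In the pull model the historian carries the configuration; in the push model the primary system does, which is what lets it pre-configure the historian before sending data.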
Data retrieval may depend on end-user preference or the technology vendor. For example, while he says that multiple data-retrieval technologies may be used, Wilkins believes that Microsoft Corp.’s Excel is popular. “We find that manufacturing engineers have comfort with that.” He notes, though, that he sees more migration to Excel Services, which is part of Microsoft Office SharePoint technology and, “actually, more to Web-based analyses.”
But Wonderware “significantly leverages” the Microsoft structured-query-language (SQL) server, which is a relational database management system, Middleton explains. “We expose our data as if it’s in an SQL database—but the data are stored in proprietary files, as most vendors do—for performance and scalability.”
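The idea of exposing proprietary files through a relational interface can be illustrated with a minimal sketch. Here Python’s built-in sqlite3 stands in for the vendor’s SQL-facing layer; the table and column names are hypothetical:

```python
import sqlite3

# The raw samples may live in proprietary archive files, but the
# historian presents them to analysts as ordinary relational rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE history (tag TEXT, ts REAL, value REAL)")
samples = [("boiler.temp", 1.0, 180.2),
           ("boiler.temp", 2.0, 180.9),
           ("boiler.temp", 3.0, 181.5)]
conn.executemany("INSERT INTO history VALUES (?, ?, ?)", samples)

# Any SQL-literate tool can then query historized data directly:
avg_temp, = conn.execute(
    "SELECT AVG(value) FROM history WHERE tag = ?", ("boiler.temp",)
).fetchone()
print(round(avg_temp, 2))  # -> 180.87
```

The design choice Middleton describes splits the work: proprietary files handle high-rate storage, while the SQL facade gives analysts a familiar query surface.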
However, there’s a “lot of overhead in storing data in an SQL server,” Middleton remarks. That’s where filtering and compression of data find use. Filtering removes random noise, not the original signal; compression simply reduces the storage space required. The compression functionality of data historians is important, Wilkins stresses, because higher consumption of repository space “will ultimately lead to performance issues.” For most historians, he believes that the compression ratio “would be 100-to-1 or better. But that depends on historian configuration.”
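One common historian technique that achieves ratios of this order is deadband (exception) compression: a sample is archived only when it moves outside a tolerance band around the last stored value. This sketch is an assumption about the general technique, not any particular product’s algorithm:

```python
def deadband_compress(samples, deadband):
    """Keep a (timestamp, value) sample only if it differs from the
    last *stored* value by more than the deadband. Small jitter
    within the band is discarded, so noisy-but-stable signals
    compress heavily while real excursions are preserved."""
    if not samples:
        return []
    kept = [samples[0]]
    for ts, value in samples[1:]:
        if abs(value - kept[-1][1]) > deadband:
            kept.append((ts, value))
    return kept

# A slowly drifting signal with small jitter compresses well:
raw = [(t, 100.0 + 0.01 * t + (0.02 if t % 2 else -0.02))
       for t in range(1000)]
kept = deadband_compress(raw, deadband=0.5)
print(len(raw), len(kept))
```

Widening the deadband raises the compression ratio at the cost of fidelity, which is exactly the configuration trade-off Wilkins alludes to.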
One functionality historians may leverage is graphical visualization of data, something Wilkins says historians may provide as a basic administrative tool. “But for more in-depth analyses, users may have to use vendor-specific portals or Microsoft Excel.” Middleton, who believes it’s important to balance responsiveness with fidelity, adds that “what may be important is being able to overlay time periods.”
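Overlaying time periods amounts to re-basing each period’s timestamps to a common origin, so that, for example, today’s batch can be compared point-for-point with yesterday’s. A minimal sketch, with illustrative data:

```python
def rebase(samples):
    """Shift a (timestamp, value) series so it starts at t = 0."""
    t0 = samples[0][0]
    return [(t - t0, v) for t, v in samples]

# Two runs of the same process recorded at different wall-clock times:
yesterday = [(1000, 20.0), (1060, 35.0), (1120, 50.0)]
today     = [(87400, 20.0), (87460, 33.0), (87520, 51.0)]

# After rebasing, both runs share one elapsed-time axis and can be
# overlaid on the same chart:
for (t, v1), (_, v2) in zip(rebase(yesterday), rebase(today)):
    print(t, v1, v2)  # -> 0 20.0 20.0 / 60 35.0 33.0 / 120 50.0 51.0
```

The fidelity-versus-responsiveness balance Middleton mentions enters here too: plotting tools often down-sample long periods before overlaying them, trading detail for interactive speed.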
And yet another functionality that will ensure historians won’t go by the wayside? What Wilkins calls a next-level capability, consisting of mathematical operations that manipulate data and convert them into actionable information. Adding that “there is no singular use for historians,” he stresses that whatever their use, the key is “to deliver high-fidelity, easily accessible data at high rates of speed.”
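A simple example of such a mathematical operation is a rolling mean with a limit check, which turns a stream of raw samples into a short list of alerts an operator can act on. This is a hypothetical illustration; real historians offer far richer calculation engines:

```python
from collections import deque

def rolling_mean_alerts(samples, window, limit):
    """Return (timestamp, mean) alerts whenever the rolling mean of
    the last `window` values exceeds `limit`."""
    buf = deque(maxlen=window)
    alerts = []
    for ts, value in samples:
        buf.append(value)
        if len(buf) == window:
            mean = sum(buf) / window
            if mean > limit:
                alerts.append((ts, round(mean, 2)))
    return alerts

# A steadily rising temperature crosses the limit partway through:
samples = [(t, 70 + t) for t in range(10)]
print(rolling_mean_alerts(samples, window=3, limit=75))
# -> [(7, 76.0), (8, 77.0), (9, 78.0)]
```

The raw series has ten points; the actionable output is three alerts, which is the data-to-information conversion Wilkins describes.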
C. Kenna Amos, firstname.lastname@example.org, is an Automation World Contributing Editor.