A data historian is a software program that records time-series data from processes running in a computer system.
Data historians are commonly used where reliability and uptime are critical: they gather information about the operation of programs so that failures can be diagnosed. Data historians are most common in datacenters and industrial control systems (ICS).
Data historians are often part of the software in systems used for processes such as:
- Chemical plants
- Quality control
- Boiler controls and power plants
- Nuclear power plants
- Environmental control
- Water management
- Food processing
- Automobile manufacturing
- Pharmaceutical manufacturing
- Sugar refining plants
Data historians collect data from numerous sensors, intelligent electronic devices (IEDs), distributed control systems, programmable logic controllers, lab instruments and manual data entry.
Data historian records might include:
- Analog data such as CPU temperatures, fan speeds and other equipment RPMs, flow rates, fluid levels and pressure levels.
- Digital readings such as valve positions, limit switches, discrete level sensors and whether motors are on or off.
- Quality assurance data like process, product and custom limits.
- Alerts such as out-of-limit and return-to-normal signals.
- Aggregate data such as averages, standard deviations, process capability and moving averages.
This data is time-stamped and cataloged in an organized, machine-readable format that can be searched quickly. The gathered data is analyzed to compare performance across day and night shifts, different work crews, production runs, material lots and seasons, and to answer many performance- and efficiency-related questions. Additional insight is gained by presenting the data visually, a practice called data visualization.
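As one illustration of the aggregate values mentioned above, a moving average and a standard deviation can be computed over a run of time-stamped samples. This is a minimal sketch using Python's standard library, not any particular historian product's API; the readings are invented.

```python
from statistics import mean, stdev

def moving_average(samples, window):
    """Simple moving average over a fixed-size trailing window."""
    return [mean(samples[i - window + 1:i + 1])
            for i in range(window - 1, len(samples))]

readings = [10.0, 12.0, 11.0, 13.0, 12.0, 14.0]  # e.g. hourly flow rates

print(moving_average(readings, 3))   # → [11.0, 12.0, 12.0, 13.0]
print(round(stdev(readings), 2))     # → 1.41
```

A historian typically precomputes such aggregates per shift, crew or production run, which is what makes the cross-shift comparisons described above fast to answer.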