data historian

A data historian is a software program that records time-stamped data from the processes running in a computer system or industrial operation.

Data historians are commonly used where reliability and uptime are critical. They gather information about the operation of equipment and software in order to diagnose failures. Data historians are most common in datacenters and industrial control systems (ICSes).

Data historians are often part of the control software in industries and processes such as:

  • Chemical plants
  • Datacenters
  • Quality control
  • Boiler controls and power plants
  • Nuclear power plants
  • Environmental control
  • Water management
  • Agriculture
  • Food and food processing
  • Automobile manufacturing
  • Pharmaceutical manufacturing
  • Sugar refining plants

Data historians collect data from numerous sensors, intelligent electronic devices (IEDs), distributed control systems, programmable logic controllers, lab instruments and manual entry.

Data historian records might include:

  • Analog data, such as CPU temperatures, fan and motor speeds (RPM), flow rates, fluid levels and pressure levels.
  • Digital readings, such as valve positions, limit switches, discrete level sensors and whether motors are on or off.
  • Quality assurance data, such as process, product and custom limits.
  • Alerts, such as out-of-limit and return-to-normal signals.
  • Aggregate data, such as averages, standard deviations, process capability and moving averages.
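
As an illustration, a minimal historian could store these kinds of records as time-stamped, tagged entries. The sketch below is hypothetical (the class and tag names are invented for this example, not any vendor's schema); real products define their own storage formats and query interfaces.

```python
import time
from dataclasses import dataclass, field

@dataclass
class HistorianRecord:
    tag: str            # sensor or device identifier, e.g. "boiler.pressure"
    value: float        # analog reading, or 0/1 for a digital state
    quality: str        # e.g. "good", "out-of-limit", "return-to-normal"
    timestamp: float = field(default_factory=time.time)

class DataHistorian:
    """Append-only, time-stamped store of process readings (toy sketch)."""

    def __init__(self):
        self.records = []

    def record(self, tag, value, quality="good"):
        self.records.append(HistorianRecord(tag, value, quality))

    def query(self, tag):
        """Return (timestamp, value) pairs for one tag, in arrival order."""
        return [(r.timestamp, r.value) for r in self.records if r.tag == tag]

historian = DataHistorian()
historian.record("boiler.pressure", 14.7)                     # analog reading
historian.record("valve.A.open", 1.0)                         # digital reading
historian.record("boiler.pressure", 19.2, quality="out-of-limit")  # alert-worthy
print(len(historian.query("boiler.pressure")))  # two pressure samples stored
```

In a production historian, the append-only design matters: records are never overwritten, so the full history stays available for later analysis.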

This data is time-stamped and cataloged in an organized, machine-readable format. The collected data can then be analyzed to compare performance across day and night shifts, work crews, production runs, material lots and seasons, answering many performance- and efficiency-related questions. Further insight comes from visual presentation of the data, known as data visualization.
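
The aggregate statistics mentioned above (averages, standard deviations, moving averages) are straightforward to compute once readings are stored as time-stamped values. A small sketch, using made-up pressure readings and Python's standard statistics module:

```python
import statistics

# Hypothetical (timestamp, value) pairs, e.g. boiler pressure sampled each minute
readings = [(0, 14.5), (60, 14.8), (120, 15.1), (180, 14.9), (240, 15.3)]
values = [v for _, v in readings]

avg = statistics.mean(values)    # overall average
std = statistics.stdev(values)   # sample standard deviation

def moving_average(vals, window):
    """Simple moving average over a fixed-size sliding window."""
    return [sum(vals[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(vals))]

print(avg)
print(moving_average(values, 3))  # smooths out short-term fluctuation
```

Comparing such aggregates between shifts or production runs is simply a matter of filtering the records by time range before computing the statistics.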

This was last updated in September 2017
