Data and data management

Terms related to data, including definitions about data warehousing and data management.
  • 3 V's (volume, velocity and variety) - The 3 V's (volume, velocity and variety) are three defining properties or dimensions of big data.
  • 3-tier application architecture - A 3-tier application architecture is a modular client-server architecture that consists of a presentation tier, an application tier and a data tier.
  • 5 V's of big data - The 5 V's of big data -- velocity, volume, value, variety and veracity -- are the five main and innate characteristics of big data.
  • 99.999 (Five nines or Five 9s) - In computers, 99.999 (five nines or five 9s) refers to a desired level of availability in which a system is operational 99.999% of the time, allowing roughly five minutes of downtime per year.
  • ACID (atomicity, consistency, isolation, and durability) - In transaction processing, ACID (atomicity, consistency, isolation, and durability) is an acronym and mnemonic device for the four essential properties a transaction should possess to ensure the integrity and reliability of the data involved (see the transaction sketch after this list).
  • actionable intelligence - Actionable intelligence is information that can be immediately used or acted upon -- either tactically in direct response to an evolving situation, or strategically as the result of an analysis or assessment.
  • address space - Address space is the amount of memory allocated for all possible addresses for a computational entity -- for example, a device, a file, a server or a networked computer.
  • Allscripts - Allscripts is a vendor of electronic health record systems for physician practices, hospitals and healthcare systems.
  • alternate data stream (ADS) - An alternate data stream (ADS) is a feature of Windows New Technology File System (NTFS) that contains metadata for locating a specific file by author or title.
  • Amazon Simple Storage Service (Amazon S3) - Amazon Simple Storage Service (Amazon S3) is a scalable, high-speed, web-based cloud storage service.
  • Anaplan - Anaplan is a web-based enterprise platform for business planning.
  • anomaly detection - Anomaly detection is the process of identifying data points, entities or events that fall outside the normal range (see the z-score sketch after this list).
  • Apache Solr - Apache Solr is an open source search platform built upon a Java library called Lucene.
  • API endpoint - An API endpoint is a point at which an API -- the code that allows two software programs to communicate with each other -- connects with a software program.
  • Apple User Enrollment - Apple User Enrollment (UE) is a form of mobile device management (MDM) for Apple products that supports iOS 13 and macOS Catalina.
  • atomic data - In a data warehouse, atomic data is the lowest level of detail.
  • availability bias - In psychology, the availability bias is the human tendency to rely on information that comes readily to mind when evaluating situations or making decisions.
  • Azure Data Studio (formerly SQL Operations Studio) - Azure Data Studio is a Microsoft tool, originally named SQL Operations Studio, for managing SQL Server databases and cloud-based Azure SQL Database and Azure SQL Data Warehouse systems.
  • big data - Big data is a combination of structured, semistructured and unstructured data collected by organizations that can be mined for information and used in machine learning projects, predictive modeling and other advanced analytics applications.
  • big data analytics - Big data analytics is the often complex process of examining big data to uncover information -- such as hidden patterns, correlations, market trends and customer preferences -- that can help organizations make informed business decisions.
  • big data as a service (BDaaS) - Big data as a service (BDaaS) is the delivery of data platforms and tools by a cloud provider to help organizations process, manage and analyze large data sets so they can generate insights to improve business operations and gain a competitive advantage.
  • big data engineer - A big data engineer is an information technology (IT) professional who is responsible for designing, building, testing and maintaining complex data processing systems that work with large data sets.
  • big data management - Big data management is the organization, administration and governance of large volumes of both structured and unstructured data.
  • big data storage - Big data storage is a compute-and-storage architecture that collects and manages large data sets and enables real-time data analytics.
  • bit rot - Bit rot is the slow deterioration in the performance and integrity of data stored on storage media.
  • block diagram - A block diagram is a visual representation of a system that uses simple, labeled blocks that represent single or multiple items, entities or concepts, connected by lines to show relationships between them.
  • blockchain storage - Blockchain storage is a way of saving data in a decentralized network, which utilizes the unused hard disk space of users across the world to store files.
  • box plot - A box plot is a graphical rendition of statistical data based on the minimum, first quartile, median, third quartile, and maximum (see the five-number summary sketch after this list).
  • brontobyte - A brontobyte is an unofficial measure of memory or data storage equal to 10 to the 27th power bytes.
  • business intelligence dashboard - A business intelligence dashboard, or BI dashboard, is a data visualization and analysis tool that displays on one screen the status of key performance indicators (KPIs) and other important business metrics and data points for an organization, department, team or process.
  • capacity management - Capacity management is the broad term describing a variety of IT monitoring, administration and planning actions that ensure that a computing infrastructure has adequate resources to handle current data processing requirements, as well as the capacity to accommodate future loads.
  • chief data officer (CDO) - A chief data officer (CDO) is a C-level executive whose position has evolved into a range of strategic data management responsibilities, with the goal of deriving maximum value from the data available to the enterprise.
  • CICS (Customer Information Control System) - CICS (Customer Information Control System) is middleware that sits between the z/OS IBM mainframe operating system and business applications.
  • clickstream data (clickstream analytics) - Clickstream data and clickstream analytics are the processes involved in collecting, analyzing and reporting aggregate data about which pages a website visitor visits -- and in what order.
  • clinical decision support system (CDSS) - A clinical decision support system (CDSS) is an application that analyzes data to help healthcare providers make decisions and improve patient care.
  • cloud audit - A cloud audit is an assessment of a cloud computing environment and its services, based on a specific set of controls and best practices.
  • Cloud Data Management Interface (CDMI) - The Cloud Data Management Interface (CDMI) is an international standard that defines a functional interface that applications use to create, retrieve, update and delete data elements from cloud storage.
  • cloud SLA (cloud service-level agreement) - A cloud SLA (cloud service-level agreement) is an agreement between a cloud service provider and a customer that ensures a minimum level of service is maintained.
  • cloud storage - Cloud storage is a service model in which data is transmitted and stored on remote storage systems, where it is maintained, managed, backed up and made available to users over a network (typically the internet).
  • cloud storage API - A cloud storage API is an application programming interface that connects a locally based application to a cloud-based storage system so that a user can send data to it and access and work with data stored in it.
  • cloud storage service - A cloud storage service is a business that maintains and manages its customers' data and makes that data accessible over a network, usually the internet.
  • cluster quorum disk - A cluster quorum disk is the storage medium on which the configuration database is stored for a cluster computing network.
  • cold backup (offline backup) - A cold backup is a backup of an offline database.
  • complex event processing (CEP) - Complex event processing (CEP) is the use of technology to detect and predict high-level events from streams of lower-level event data.
  • compliance as a service (CaaS) - Compliance as a service (CaaS) is a cloud service that specifies how a managed service provider (MSP) helps an organization meet its regulatory compliance mandates.
  • conflict-free replicated data type (CRDT) - A conflict-free replicated data type (CRDT) is a data structure that lets multiple people or applications make changes to the same piece of data on separate replicas and merge those changes automatically, without conflicts (see the G-Counter sketch after this list).
  • conformed dimension - In data warehousing, a conformed dimension is a dimension that has the same meaning to every fact with which it relates.
  • consensus algorithm - A consensus algorithm is a process in computer science used to achieve agreement on a single data value among distributed processes or systems.
  • consumer data - Consumer data is the information that organizations collect from individuals who use internet-connected platforms, including websites, social media networks, mobile apps, text messaging apps or email systems.
  • containers (container-based virtualization or containerization) - Containers are a type of software that can virtually package and isolate applications for deployment.
  • content personalization - Content personalization is a branding and marketing strategy in which webpages, email and other forms of content are tailored to match the characteristics, preferences or behaviors of individual users.
  • Continuity of Care Document (CCD) - A Continuity of Care Document (CCD) is an electronic, patient-specific document detailing a patient's medical history.
  • Continuity of Care Record (CCR) - The Continuity of Care Record, or CCR, provides a standardized way to create electronic snapshots about a patient's health information.
  • core banking system - A core banking system is the software that banks use to manage their most critical processes, such as customer accounts, transactions and risk management.
  • correlation - Correlation is a statistical measure that indicates the extent to which two or more variables fluctuate in relation to each other.
  • correlation coefficient - A correlation coefficient is a statistical measure of the degree to which changes to the value of one variable predict change to the value of another (see the Pearson's r sketch after this list).
  • CRM (customer relationship management) analytics - CRM (customer relationship management) analytics comprises all of the programming that analyzes data about customers and presents it to an organization to help facilitate and streamline better business decisions.
  • CRUD cycle (Create, Read, Update and Delete Cycle) - The CRUD cycle describes the elemental functions -- create, read, update and delete -- of a persistent database (see the SQL sketch after this list).
  • cryptographic nonce - A nonce is a random or semi-random number that is generated for a specific use, typically in authentication protocols or encryption (see the sketch after this list).
  • curation - Curation is a field of endeavor involved with assembling, managing and presenting some type of collection.
  • Current Procedural Terminology (CPT) code - Current Procedural Terminology (CPT) is a medical code set that enables physicians and other healthcare providers to describe and report the medical, surgical, and diagnostic procedures and services they perform to government and private payers, researchers and other interested parties.
  • customer data integration (CDI) - Customer data integration (CDI) is the process of defining, consolidating and managing customer information across an organization's business units and systems to achieve a "single version of the truth" for customer data.
  • customer intelligence (CI) - Customer intelligence (CI) is the process of collecting and analyzing detailed customer data from internal and external sources to gain insights about customer needs, motivations and behaviors.
  • customer segmentation - Customer segmentation is the practice of dividing a customer base into groups of individuals that have similar characteristics relevant to marketing, such as age, gender, interests and spending habits.
  • dark data - Dark data is digital information an organization collects, processes and stores that is not currently being used for business purposes.
  • data - In computing, data is information that has been translated into a form that is efficient for movement or processing.
  • data abstraction - Data abstraction is the reduction of a particular body of data to a simplified representation of the whole.
  • data activation - Data activation is a marketing approach that uses consumer information and data analytics to help companies gain real-time insight into target audience behavior and plan for future marketing initiatives.
  • data aggregation - Data aggregation is any process whereby data is gathered and expressed in a summary form (see the sketch after this list).
  • data analytics (DA) - Data analytics (DA) is the process of examining data sets to find trends and draw conclusions about the information they contain.
  • data architect - A data architect is an IT professional responsible for defining the policies, procedures, models and technologies to be used in collecting, organizing, storing and accessing company information.
  • Data as a Service (DaaS) - Data as a Service (DaaS) is an information provision and distribution model in which data files (including text, images, sounds, and videos) are made available to customers over a network, typically the internet.
  • data availability - Data availability is a term used by computer storage manufacturers and storage service providers to describe how data should be available at a required level of performance in situations ranging from normal through disastrous.
  • data breach - A data breach is a cyber attack in which sensitive, confidential or otherwise protected data has been accessed or disclosed in an unauthorized fashion.
  • data catalog - A data catalog is a software application that creates an inventory of an organization's data assets to help data professionals and business users find relevant data for analytics uses.
  • data center chiller - A data center chiller is a cooling system used in a data center to remove heat from one element and deposit it into another element.
  • data center services - Data center services is a collective term for all the supporting components necessary to the proper operation of a data center.
  • data citizen - A data citizen is an employee who relies on data to make decisions and perform job responsibilities.
  • data classification - Data classification is the process of organizing data into categories that make it easy to retrieve, sort and store for future use.
  • data clean room - A data clean room is a technology service that helps content platforms keep first-party user data private when interacting with advertising providers.
  • data cleansing (data cleaning, data scrubbing) - Data cleansing, also referred to as data cleaning or data scrubbing, is the process of fixing incorrect, incomplete, duplicate or otherwise erroneous data in a data set.
  • data collection - Data collection is the process of gathering data for use in business decision-making, strategic planning, research and other purposes.
  • data curation - Data curation is the process of creating, organizing and maintaining data sets so they can be accessed and used by people looking for information.
  • data democratization - Data democratization is the ability for information in a digital format to be accessible to the average end user.
  • data destruction - Data destruction is the process of destroying data stored on tapes, hard disks and other forms of electronic media so that it is completely unreadable and cannot be accessed or used for unauthorized purposes.
  • data dignity - Data dignity, also known as data as labor, is a theory positing that people should be compensated for the data they have created.
  • data dredging (data fishing) - Data dredging -- sometimes referred to as data fishing -- is a data mining practice in which large data volumes are analyzed to find any possible relationships between them.
  • data engineer - A data engineer is an IT professional whose primary job is to prepare data for analytical or operational uses.
  • data exploration - Data exploration is the first step in data analysis, in which data visualization tools and statistical techniques are used to uncover data set characteristics and initial patterns.
  • data feed - A data feed is an ongoing stream of structured data that provides users with updates of current information from one or more sources.
  • data flow diagram (DFD) - A data flow diagram (DFD) is a graphical or visual representation using a standardized set of symbols and notations to describe a business's operations through data movement.
  • data governance policy - A data governance policy is a documented set of guidelines for ensuring that an organization's data and information assets are managed consistently and used properly.
  • data gravity - Data gravity is an attribute of data that is manifest in the way software and services are drawn to it relative to its mass (the amount of data).
  • data historian - A data historian is a software program that records the data created by processes running in a computer system.
  • data in motion - Data in motion, also referred to as data in transit or data in flight, is digital information that is being transported between locations, either within or between computer systems.
  • data in use - Data in use is data that is currently being updated, processed, accessed and read by a system.
  • data ingestion - Data ingestion is the process of obtaining and importing data for immediate use or storage in a database.
  • data integration - Data integration is the process of combining data from multiple source systems to create unified sets of information for both operational and analytical uses.
  • data integrity - Data integrity is the assurance that digital information is uncorrupted and can only be accessed or modified by those authorized to do so.
  • data labeling - Data labeling is the process of identifying and tagging data samples, most commonly to prepare them for training machine learning (ML) models.
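
Code sketches for selected terms

A few of the definitions above describe mechanisms that are easier to grasp in code. The Python sketches below are illustrative only: table names, field names and data values are invented for the examples, and each shows one minimal, common way to realize the concept rather than any particular product's implementation.

The ACID properties become concrete in a transaction that fails partway through. This sketch uses the standard library's sqlite3 module with a hypothetical accounts table; when the second update violates the balance constraint, the whole transfer rolls back, so no partial update is ever visible.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts "
                 "(name TEXT PRIMARY KEY, balance INTEGER CHECK (balance >= 0))")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                     [("alice", 50), ("bob", 50)])
    conn.commit()

    try:
        with conn:  # one transaction: commit on success, roll back on error
            conn.execute("UPDATE accounts SET balance = balance + 70 "
                         "WHERE name = 'bob'")
            # Overdrawing alice violates the CHECK constraint...
            conn.execute("UPDATE accounts SET balance = balance - 70 "
                         "WHERE name = 'alice'")
    except sqlite3.IntegrityError:
        pass  # ...so the whole transfer rolls back, including bob's credit

    print(dict(conn.execute("SELECT name, balance FROM accounts")))
    # {'alice': 50, 'bob': 50} -- atomicity: no partial update survived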
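
One common, simple form of anomaly detection is z-score thresholding: flag any point that sits too many standard deviations from the mean. The sensor readings and the cutoff of 2.0 are illustrative assumptions; production systems usually rely on more robust methods.

    from statistics import mean, stdev

    def zscore_outliers(values, threshold=2.0):
        """Return points more than `threshold` standard deviations from the mean."""
        mu, sigma = mean(values), stdev(values)
        return [x for x in values if abs(x - mu) / sigma > threshold]

    readings = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 42.0]  # made-up sensor data
    print(zscore_outliers(readings))  # [42.0]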
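
A box plot is drawn from the five-number summary, which the standard library can compute directly (statistics.quantiles requires Python 3.8 or later); the data set below is made up.

    from statistics import quantiles

    def five_number_summary(data):
        """Minimum, Q1, median, Q3 and maximum -- the values a box plot draws."""
        q1, median, q3 = quantiles(data, n=4)  # the three quartile cut points
        return min(data), q1, median, q3, max(data)

    print(five_number_summary([2, 4, 4, 5, 7, 8, 9, 11, 12, 19]))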
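
The grow-only counter (G-Counter) is one of the simplest CRDTs and makes the convergence property visible: each replica increments only its own slot, merge takes the elementwise maximum, and every replica ends at the same total regardless of sync order. This is the generic textbook design, not any specific library's API.

    class GCounter:
        """Grow-only counter CRDT: replicas update independently and converge."""

        def __init__(self, replica_id):
            self.replica_id = replica_id
            self.counts = {}  # replica id -> increments seen from that replica

        def increment(self, amount=1):
            self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + amount

        def merge(self, other):
            for rid, n in other.counts.items():
                self.counts[rid] = max(self.counts.get(rid, 0), n)

        @property
        def value(self):
            return sum(self.counts.values())

    # Two replicas change the same counter independently, then sync.
    a, b = GCounter("a"), GCounter("b")
    a.increment(3)
    b.increment(2)
    a.merge(b)
    b.merge(a)
    assert a.value == b.value == 5  # converged, with no conflict to resolve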
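
The most widely used correlation coefficient, Pearson's r, is the covariance of two variables divided by the product of their standard deviations; it ranges from -1 to +1. The paired observations below are invented.

    from math import sqrt

    def pearson_r(xs, ys):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    hours = [1, 2, 3, 4, 5]
    scores = [52, 58, 61, 70, 74]              # made-up paired observations
    print(round(pearson_r(hours, scores), 3))  # ~0.99: strong positive correlation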
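
The four CRUD functions map one-to-one onto SQL statements. A minimal cycle against an in-memory SQLite database, with a hypothetical users table:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

    db.execute("INSERT INTO users (email) VALUES (?)",
               ("ada@example.com",))                             # Create
    print(db.execute("SELECT id, email FROM users").fetchone())  # Read
    db.execute("UPDATE users SET email = ? WHERE id = 1",
               ("ada@new.example",))                             # Update
    db.execute("DELETE FROM users WHERE id = 1")                 # Delete
    db.commit()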
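
Generating a cryptographic nonce in Python is a one-liner with the secrets module, which draws from a cryptographically strong random source; the 16-byte length here is a typical but arbitrary choice.

    import secrets

    nonce = secrets.token_hex(16)  # 16 random bytes rendered as 32 hex characters
    print(nonce)
    # When only uniqueness (not unpredictability) is required, a message
    # counter can also serve as a nonce, as in some encryption modes.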
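
Data aggregation reduces raw records to a summary form, typically one value per group. Field names and values here are made up.

    from collections import defaultdict

    sales = [
        {"region": "east", "amount": 120},
        {"region": "west", "amount": 80},
        {"region": "east", "amount": 45},
    ]

    totals = defaultdict(int)
    for row in sales:
        totals[row["region"]] += row["amount"]  # sum per group

    print(dict(totals))  # {'east': 165, 'west': 80}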