Browse Definitions by Alphabet

DAT - DEF

  • data privacy (information privacy) - Data privacy, also called information privacy, is an aspect of data protection that addresses the proper storage, access, retention, immutability and security of sensitive data.
  • data profiling - Data profiling refers to the process of examining, analyzing, reviewing and summarizing data sets to gain insight into the quality of data.
  • data protection impact assessment (DPIA) - A data protection impact assessment (DPIA) is a process designed to help organizations determine how data processing systems, procedures or technologies affect individuals’ privacy and eliminate any risks that might violate compliance.
  • data protection management (DPM) - Data protection management (DPM) comprises the administration, monitoring and management of backup processes to ensure backup tasks run on schedule and data is securely backed up and recoverable.
  • data quality - Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it's up to date.
  • data recovery - Data recovery restores data that has been lost, accidentally deleted, corrupted or made inaccessible.
  • data recovery agent (DRA) - A data recovery agent (DRA) is a Microsoft Windows user account with the ability to decrypt data that was encrypted by other users.
  • data reduction - Data reduction is the process of reducing the amount of capacity required to store data.
  • data replication - Data replication copies data from one location to another using a SAN, LAN or local WAN.
  • data residency - Data residency refers to the physical or geographic location of an organization's data or information.
  • data restore - Data restore is the process of copying backup data from secondary storage and restoring it to its original location or a new location.
  • data retention policy - A data retention policy, or records retention policy, is an organization's established protocol for retaining information for operational or regulatory compliance needs.
  • data sampling - Data sampling is a statistical analysis technique used to select, manipulate and analyze a representative subset of data points to identify patterns and trends in the larger data set being examined.
  • data science as a service (DSaaS) - Data science as a service (DSaaS) is a form of outsourcing that involves the delivery of information gleaned from advanced analytics applications run by data scientists at an outside company to corporate clients for their business use.
  • data science platform - A data science platform is software that allows data scientists to uncover actionable insights from data and communicate those insights throughout an enterprise within a single environment.
  • data scientist - A data scientist is an analytics professional who is responsible for collecting, analyzing and interpreting data to help drive decision-making in an organization.
  • data set - A data set is a collection of data that contains individual data units organized (formatted) in a specific way and accessed by one or more specific access methods based on the data set organization and data structure.
  • data silo - A data silo exists when an organization's departments and systems cannot, or do not, communicate freely with one another or share business-relevant data.
  • data source name (DSN) - A data source name (DSN) is a data structure that contains the information about a specific database that an Open Database Connectivity (ODBC) driver needs in order to connect to it.
  • data sovereignty - Data sovereignty is the concept that information which has been converted and stored in binary digital form is subject to the laws of the country in which it is located.
  • data splitting - Data splitting is when data is divided into two or more subsets.
  • data stewardship - Data stewardship is the management and oversight of an organization's data assets to help provide business users with high-quality data that is easily accessible in a consistent manner.
  • data store - A data store is a repository for persistently storing collections of data, such as a database, a file system or a directory.
  • data storytelling - Data storytelling is the process of translating complex data analyses into layman's terms in order to influence a decision or action.
  • data streaming - Data streaming is the continuous transfer of data at a steady, high-speed rate.
  • data structures - A data structure is a specialized format for organizing, processing, retrieving and storing data.
  • Data Transfer Project (DTP) - Data Transfer Project (DTP) is an open source initiative to facilitate customer-controlled data transfers between two online services.
  • data transfer rate (DTR) - Data transfer rate (DTR) is the amount of digital data that is moved from one place to another in a given time.
  • data transformation - Data transformation is the process of converting data from one format, such as a database file, XML document or Excel spreadsheet, into another.
  • data type - A data type, in programming, is a classification that specifies which type of value a variable has and what type of mathematical, relational or logical operations can be applied to it without causing an error.
  • data validation - Data validation is the practice of checking the integrity, accuracy and structure of data before it is used for a business operation.
  • data virtualization - Data virtualization is an umbrella term used to describe any approach to data management that allows an application to retrieve and manipulate data without needing to know any technical details about the data such as how it is formatted or where it is physically located.
  • data visualization - Data visualization is the practice of translating information into a visual context, such as a map or graph, to make data easier for the human brain to understand and pull insights from.
  • data warehouse - A data warehouse is a federated repository for all the data collected by an enterprise's various operational systems, be they physical or logical.
  • data warehouse appliance - A data warehouse appliance is an all-in-one “black box” solution optimized for data warehousing.
  • data warehouse as a service (DWaaS) - Data warehouse as a service (DWaaS) is an outsourcing model in which a cloud service provider configures and manages the hardware and software resources a data warehouse requires, and the customer provides the data and pays for the managed service.
  • data-driven decision management (DDDM) - Data-driven decision management (DDDM) is an approach to business governance that values actions that can be backed up with verifiable data.
  • database (DB) - A database is a collection of information that is organized so that it can be easily accessed, managed and updated.
  • database abstraction layer - A database abstraction layer is a simplified representation of a database in the form of a written description or a diagram.
  • database administrator (DBA) - A database administrator (DBA) is the information technician responsible for directing or performing all activities related to maintaining a successful database environment.
  • database as a service (DBaaS) - Database as a service (DBaaS) is a cloud computing managed service offering that provides access to a database without requiring the setup of physical hardware, the installation of software or the need to configure the database.
  • database automation - Database automation is the use of unattended processes and self-updating procedures for administrative tasks in a database.
  • database availability group (DAG) - A database availability group (DAG) is a high availability (HA) and data recovery feature of Exchange Server 2010.
  • database management system (DBMS) - A database management system (DBMS) is system software for creating and managing databases, allowing end users to create, protect, read, update and delete data in a database.
  • database marketing - Database marketing is a systematic approach to the gathering, consolidation and processing of consumer data.
  • database normalization - Database normalization is the process of organizing data in a relational database to reduce redundancy and improve data integrity, and it is intrinsic to most relational database designs.
  • database replication - Database replication is the frequent electronic copying of data from a database in one computer or server to a database in another -- so that all users share the same level of information.
  • Databricks - Databricks is a data analytics company and big data processing platform founded by the creators of Apache Spark.
  • DataCore - DataCore is a software-defined storage (SDS) company, as well as an early storage virtualization software vendor, in Fort Lauderdale, Fla.
  • Datadog - Datadog is a monitoring and analytics tool for information technology (IT) and DevOps teams that can be used to determine performance metrics as well as event monitoring for infrastructure and cloud services.
  • DataOps (data operations) - DataOps (data operations) is an Agile approach to designing, implementing and maintaining a distributed data architecture that will support a wide range of open source tools and frameworks in production.
  • Datto - Datto Inc. is a backup, recovery and business continuity vendor with headquarters in Norwalk, Conn.
  • daughterboard (or daughter board, daughter card, or daughtercard) - A daughterboard (or daughter board, daughter card, or daughtercard) is a circuit board that plugs into and extends the circuitry of another circuit board.
  • Daylight Saving Time (DST) - Daylight Saving Time (DST) is the practice of turning the clock ahead as warmer weather approaches and back as it becomes colder again.
  • days inventory outstanding (DIO) - Days inventory outstanding (DIO) is the average number of days it takes for inventory to be sold.
  • days sales outstanding (DSO) - Days sales outstanding (DSO) is the measurement of the average number of days it takes a business to collect payments after a sale has been made.
  • Db2 - Db2 is a family of database management system (DBMS) products from IBM that serve a number of different operating system (OS) platforms.
  • DC (direct current) - DC (direct current) is the unidirectional flow or movement of electric charge carriers (which are usually electrons).
  • DCE (Distributed Computing Environment) - In network computing, DCE (Distributed Computing Environment) is an industry-standard software technology for setting up and managing computing and data exchange in a system of distributed computers.
  • DCPromo (Domain Controller Promoter) - DCPromo (Domain Controller Promoter) is a tool in Active Directory that installs and removes Active Directory Domain Services and promotes domain controllers.
  • de-anonymization (deanonymization) - De-anonymization is a method used to detect the original data that was subjected to processes to make it impossible -- or at least harder -- to identify the personally identifiable information (PII).
  • dead zone (Wi-Fi dead zone) - A dead zone (Wi-Fi dead zone) is an area within a wireless LAN location where Wi-Fi does not function, typically due to radio interference or range issues.
  • deadlock - A deadlock is a situation in which two computer programs sharing the same resource are effectively preventing each other from accessing the resource, resulting in both programs ceasing to function.
  • deal registration - Deal registration is a common feature of vendors' channel partner programs in which a channel partner, such as a value-added reseller (VAR), informs the vendor about a sales lead.
  • death by PowerPoint - Death by PowerPoint is a phenomenon caused by the poor use of presentation software.
  • Debian - Debian is a popular and freely available computer operating system (OS) that uses a Unix-like kernel -- typically Linux -- alongside other program components, many of which come from GNU Project.
  • debouncing - Bouncing is the tendency of any two metal contacts in an electronic device to generate multiple signals as the contacts close or open; debouncing is any hardware device or software that ensures only a single signal is acted upon for a single opening or closing of a contact.
  • debugging - Debugging, in computer programming and engineering, is a multistep process that involves identifying a problem, isolating the source of the problem and then either correcting the problem or determining a way to work around it.
  • deception technology - Deception technology is a class of security tools and techniques designed to prevent an attacker who has already entered the network from doing damage.
  • decibel - In electronics and communications, the decibel (abbreviated as dB, and also as db and DB) is a logarithmic expression of the ratio between two signal power, voltage, or current levels.
  • decibels relative to carrier (dBc) - dBc (decibels relative to carrier) is a measure of the strength of an instantaneous signal at radio frequency.
  • decibels relative to isotropic radiator (dBi) - The expression dBi is used to define the gain of an antenna system relative to an isotropic radiator at radio frequencies.
  • decibels relative to one millivolt (dBmV) - dBmV (decibels relative to one millivolt) is a measure of the signal strength in wires and cables at RF and AF frequencies.
  • decibels relative to one milliwatt (dBm) - The expression dBm is used to define signal strength in wires and cables at RF and AF frequencies.
  • decibels relative to reference level (dBr) - The expression dBr is used to define signal strength at RF and AF frequencies.
  • decimal - Decimal is a term that describes the base-10 number system, probably the most commonly used number system.
  • decision support system (DSS) - A decision support system (DSS) is a computer program application used to improve a company's decision-making capabilities.
  • decision tree - A decision tree is a graph that uses a branching method to illustrate every possible output for a specific input.
  • decision-making process - The decision-making process, in a business context, is a set of steps taken by managers in an enterprise to determine the planned path for business initiatives and to set specific actions in motion.
  • declarative programming - Declarative programming is a method to abstract away the control flow for logic required for software to perform an action, and instead involves stating what the task or desired outcome is.
  • decompile - To decompile means to convert executable or ready-to-run program code -- sometimes called object code -- into some form of higher-level programming language that humans can easily understand.
  • decompression bomb (zip bomb, zip of death attack) - A decompression bomb -- also known as a zip bomb or zip of death attack -- is a malicious archive file containing a large amount of compressed data.
  • deconvolutional networks (deconvolutional neural networks) - Deconvolutional networks are convolutional neural networks (CNN) that work in a reversed process.
  • decoupled architecture - In general, a decoupled architecture is a framework for complex work that allows components to remain completely autonomous and unaware of each other.
  • dedicated cloud - A dedicated cloud is a single-tenant cloud infrastructure, which essentially acts as an isolated, single-tenant public cloud.
  • dedicated line - A dedicated line is a telecommunications path between two points that is available 24 hours a day for use by a designated user (individual or company).
  • dedicated short-range communication (DSRC) - Dedicated short-range communication (DSRC) is a wireless communication technology designed to allow automobiles in the intelligent transportation system (ITS) to communicate with other automobiles or infrastructure technology.
  • deductive argument - A deductive argument is a logic construct with two or more premises and a conclusion where if the premises are true then the conclusion must also be true.
  • deductive reasoning - Deductive reasoning is a logical process in which a conclusion is based on the accordance of multiple premises that are generally assumed to be true.
  • deep analytics - Deep analytics is the application of sophisticated data processing techniques to yield information from large and typically multi-source data sets comprised of both unstructured and semi-structured data.
  • deep learning - Deep learning is a type of machine learning and artificial intelligence (AI) that imitates the way humans gain certain types of knowledge.
  • deep link - A deep link is a hypertext link to a page on a Web site other than its home page.
  • deep packet inspection (DPI) - Deep packet inspection (DPI) is an advanced method of examining and managing network traffic.
  • deep web - The deep web is an umbrella term for parts of the internet not fully accessible through standard search engines like Google, Bing and Yahoo.
  • deepfake AI (deep fake) - Deep fake (also spelled deepfake) is a type of artificial intelligence used to create convincing images, audio and video hoaxes.
  • DeepMind - DeepMind is a division of Alphabet, Inc.
  • default - In computer technology, a default (noun, pronounced dee-FAWLT) is a predesigned value or setting that is used by a computer program when a value or setting is not specified by the program user.
  • default password - A default password is a standard preconfigured password for a device or software.
  • Defense Acquisition Regulatory Council (DARC) - The Defense Acquisition Regulatory Council (DARC) is a group composed of representatives from each Military department, the Defense Logistics Agency, and the National Aeronautics and Space Administration.
  • Defense Contract Management Agency (DCMA) - The Defense Contract Management Agency (DCMA) is a component of the United States Department of Defense (DoD) that works with defense contractors to ensure government services and supplies are delivered on time, come at the expected cost and satisfy all performance requirements.
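The data splitting entry above can be sketched in Python. This is a minimal illustration of dividing a data set into two subsets (e.g., training and test sets); the function name, split fraction and seed are illustrative, not from the original entry:

```python
import random

def split_data(records, train_fraction=0.8, seed=42):
    """Shuffle a data set and divide it into two subsets -- a common
    form of data splitting used to build training and test sets."""
    rng = random.Random(seed)            # seeded for reproducibility
    shuffled = list(records)             # copy so the input is untouched
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]
```

Splitting 10 records at an 80/20 fraction yields subsets of 8 and 2 records that together cover the original data exactly once.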
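The data transfer rate (DTR) entry above lends itself to a short worked calculation. The helper below is a sketch that ignores protocol overhead; the function name is illustrative:

```python
def transfer_time_seconds(size_bytes: float, rate_bits_per_sec: float) -> float:
    """Seconds needed to move a payload at a given data transfer rate.
    Payload sizes are usually quoted in bytes, rates in bits per second,
    so the payload is converted at 8 bits per byte."""
    return (size_bytes * 8) / rate_bits_per_sec

# Example: a 1 GB (10^9 byte) file over a 100 Mbps link takes
# (1_000_000_000 * 8) / 100_000_000 = 80 seconds.
```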
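The data type entry above can be demonstrated in a few lines of Python, where each value's type determines which operations are valid:

```python
x = 42        # int: arithmetic operations apply
s = "42"      # str: concatenation applies; arithmetic with ints does not

assert x + 1 == 43        # valid: int + int
assert s + "1" == "421"   # valid: str + str (concatenation)

try:
    x + s                 # invalid: mixing int and str
except TypeError:
    pass                  # the type system rejects it rather than guessing
```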
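The data validation entry above can be sketched as a function that checks a record's integrity and structure before it is used. The schema here (id, email, age) is hypothetical, chosen only to illustrate the practice:

```python
def validate_record(record: dict) -> list:
    """Return a list of validation errors for a hypothetical customer record."""
    errors = []
    if not isinstance(record.get("id"), int):
        errors.append("id must be an integer")
    if "@" not in str(record.get("email", "")):
        errors.append("email must contain '@'")
    if not (0 <= record.get("age", -1) <= 130):
        errors.append("age must be between 0 and 130")
    return errors
```

A record passing all checks returns an empty list; a malformed one returns one message per failed rule, which the caller can log or reject.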
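The debouncing entry above describes suppressing the multiple signals a bouncing contact generates. A software debounce can be sketched as follows; the class name and 50 ms quiet interval are illustrative assumptions:

```python
class Debouncer:
    """Software debounce sketch: accept a contact event only if a quiet
    interval has elapsed since the last accepted event, so one physical
    press that bounces several times yields a single signal."""

    def __init__(self, interval_s: float = 0.05):
        self.interval_s = interval_s
        self._last_accepted = float("-inf")

    def accept(self, timestamp_s: float) -> bool:
        if timestamp_s - self._last_accepted >= self.interval_s:
            self._last_accepted = timestamp_s
            return True
        return False   # bounce within the quiet interval: ignored
```

Events at t = 0.00 s and t = 0.01 s collapse to one accepted signal; an event at t = 0.06 s counts as a new press.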
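The decibel entry above rests on the standard logarithmic formulas, which can be computed directly (function names are illustrative):

```python
import math

def power_ratio_db(p_out: float, p_in: float) -> float:
    """Power ratio in decibels: dB = 10 * log10(P2 / P1)."""
    return 10 * math.log10(p_out / p_in)

def voltage_ratio_db(v_out: float, v_in: float) -> float:
    """Voltage (or current) ratio in decibels: the factor is 20,
    because power is proportional to voltage squared."""
    return 20 * math.log10(v_out / v_in)
```

A 100:1 power ratio and a 10:1 voltage ratio both work out to 20 dB, which is why the two formulas are consistent with each other.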
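The declarative programming entry above contrasts stating a desired outcome with spelling out control flow. A minimal side-by-side sketch in Python:

```python
numbers = [3, 1, 4, 1, 5, 9]

# Imperative style: spell out the control flow step by step.
evens_imperative = []
for n in numbers:
    if n % 2 == 0:
        evens_imperative.append(n)

# Declarative style: state what is wanted; iteration is implicit.
evens_declarative = [n for n in numbers if n % 2 == 0]

assert evens_imperative == evens_declarative == [4]
```

Both produce the same result; the declarative form abstracts away the loop and accumulator that the imperative form manages explicitly.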