Browse Definitions by Alphabet

D - DAT

  • D - D is either of two programming languages: Digital Mars D, an object-oriented metaprogramming language, or Microsoft D, a key component of the upcoming Oslo development environment.
  • D-channel - In the Integrated Services Digital Network (ISDN), the D-channel is the channel that carries control and signaling information.
  • D2D2C (disk-to-disk-to-cloud) - D2D2C (disk-to-disk-to-cloud) is an approach to hybrid cloud backup that involves using local storage for staging data that will eventually be sent to a third-party cloud storage service provider.
  • d3.js (data-driven documents) - D3.js (data-driven documents) is a library of JavaScript code that enables users to input their own data into prebuilt visualizations.
  • daemon - In computing, a daemon (pronounced DEE-muhn) is a program that runs continuously as a background process and wakes up to handle periodic service requests, which often come from remote processes.
  • daily stand-up meeting - A daily stand-up meeting is a short organizational meeting that is held early each day.
  • dark data - Dark data is digital information an organization collects, processes and stores that is not currently being used for business purposes.
  • dark data center - A dark data center is a facility that is almost exclusively administered remotely, through lights-out management (LOM).
  • dark energy (quintessence) - Dark energy, also called quintessence, is a mysterious energy or force that has been postulated by astronomers and cosmologists to explain recent observations suggesting that the universe is expanding at an ever-increasing rate.
  • dark mode - Dark mode is a color scheme change for user interfaces (UI) on webpages, apps and programs that displays light text on a dark background for easier viewing.
  • dark pattern - Dark patterns are manipulative or deceptive design elements found in some web pages, pop-ups and programs, including malware, freeware, shareware, freemium offerings and even fully paid software.
  • dark post - A dark post is an inexpensive sponsored message on a social media website that is not published to the sponsor page timeline and will not display in follower feeds organically.
  • dark social - Dark social is a term used by marketers and search engine optimization (SEO) specialists to describe website referrals that are difficult to track.
  • dark storage - Dark storage is allocated but unused storage capacity.
  • dark web (darknet) - The dark web, also referred to as the darknet, is an encrypted portion of the internet that is not indexed by search engines and requires specific configuration or authorization to access.
  • DAT (Digital Audio Tape) - DAT (Digital Audio Tape) is a standard medium and technology for the digital recording of audio on tape at a professional level of quality.
  • data - In computing, data is information that has been translated into a form that is efficient for movement or processing.
  • data abstraction - Data abstraction is the reduction of a particular body of data to a simplified representation of the whole.
  • Data Access Arrangement (DAA) - A Data Access Arrangement (DAA) is the electronic interface between a computer's modem and a public telephone line.
  • data access rights - A data access right (DAR) is a permission that has been granted that allows a person or computer program to locate and read digital information at rest.
  • data activation - Data activation is a marketing approach that uses consumer information and data analytics to help companies gain real-time insight into target audience behavior and plan for future marketing initiatives.
  • data aggregation - Data aggregation is any process whereby data is gathered and expressed in a summary form (a short example appears after this list).
  • data analytics (DA) - Data analytics (DA) is the process of examining data sets in order to find trends and draw conclusions about the information they contain.
  • data anonymization - Data anonymization is the process of altering or removing identifying details from data so that its source is untraceable.
  • data architect - A data architect is an IT professional responsible for defining the policies, procedures, models and technologies to be used in collecting, organizing, storing and accessing company information.
  • data archiving - Data archiving migrates infrequently used data to low-cost, high-capacity archive storage for long-term retention.
  • data artist - A data artist is a business analytics (BA) specialist who creates graphs, charts, infographics and other visual tools that help people understand complex data.
  • Data as a Service (DaaS) - Data as a Service (DaaS) is an information provision and distribution model in which data files (including text, images, sounds, and videos) are made available to customers over a network, typically the Internet.
  • data at rest - Data at rest is a term that is sometimes used to refer to all data in computer storage while excluding data that is traversing a network or temporarily residing in computer memory to be read or updated.
  • data availability - Data availability is a term used by computer storage manufacturers and storage service providers to describe how data should be available at a required level of performance in situations ranging from normal through disastrous.
  • data backup software - Backup software makes a duplicate copy of data to protect it and enable recovery if the data is lost or corrupted due to equipment failure or some other catastrophic event.
  • data binding - Data binding is the process that couples two data sources together and synchronizes them.
  • data breach - A data breach is a cyber attack in which sensitive, confidential or otherwise protected data has been accessed or disclosed in an unauthorized fashion.
  • data broker (information broker) - A data broker, also called an information broker or information reseller, is a business that collects personal information about consumers and sells that information to other organizations.
  • data cap (broadband cap) - A data cap is a specific amount of mobile data that a user account can access for a given amount of money, usually specified per month.
  • data catalog - A data catalog is a software application that creates an inventory of an organization's data assets to help data professionals and business users find relevant data for analytics uses.
  • data center - A data center -- also known as a datacenter or data centre -- is a facility composed of networked computers, storage systems and computing infrastructure that organizations use to organize, process, store and disseminate large amounts of data.
  • data center administrator (DCA) - A data center administrator monitors systems, installs equipment and cabling, and participates in change processes and everyday procedures that support information technology.
  • data center as a service (DCaaS) - A data center as a service (DCaaS) provider supplies turnkey physical data center facilities and computing infrastructure to customers.
  • data center bridging (DCB) - DCB is a suite of IEEE standards designed to enable lossless transport over Ethernet and a converged network for all data center applications.
  • data center capacity planning - Data center capacity planning ensures that an IT organization has enough facility space, power and computing resources to support average and peak workloads.
  • Data center career path: Fast Guide - Data centers offer competitive salaries, enjoyable work and diverse opportunities for workers in the tech sector, whether you want to become an entry-level data center technician or have the necessary skills to become a data center architect.
  • data center chiller - A data center chiller is a cooling system used in a data center to remove heat from one element and deposit it into another element.
  • data center evaporative cooling (swamp cooling) - Evaporative cooling, also known as swamp cooling, is a strategy for cooling air that takes advantage of the drop in temperature that occurs when water that's exposed to moving air begins to change to gas.
  • data center in a box - A data center in a box, also called a containerized or modular data center, is a self-contained computing facility that is manufactured in a factory and shipped to a location.
  • Data Center Infrastructure Efficiency (DCiE) - Data Center Infrastructure Efficiency (DCiE) is a metric used to determine the energy efficiency of a data center.
  • data center infrastructure management (DCIM) - Data center infrastructure management (DCIM) is the convergence of IT and building facilities functions within an organization.
  • data center interconnect (DCI) - Data center interconnect (DCI) is a segment of the networking market that focuses on the technology used to link two or more data centers so the facilities can share resources.
  • data center management - Data center management refers to the set of tasks and activities handled by an organization for the day-to-day requirements of operating a data center.
  • data center outsourcing (DCO) - DCO (data center outsourcing) is the practice of outsourcing the day-to-day provisioning and management of computing and storage resources and environments to a third-party provider.
  • data center resiliency - Resiliency is the ability of a server, network, storage system or an entire data center to recover quickly and continue operating even when there has been an equipment failure, power outage or other disruption.
  • data center services - Data center services is a collective term for all the supporting components necessary to the proper operation of a data center.
  • data citizen - A data citizen is an employee who relies on data to make decisions and perform job responsibilities.
  • data classification - Data classification is the process of organizing data into categories that make it easy to retrieve, sort and store for future use.
  • data clean room - A data clean room is a technology service that helps content platforms keep first-party user data private when interacting with advertising providers.
  • data cleansing (data cleaning, data scrubbing) - Data cleansing, also referred to as data cleaning or data scrubbing, is the process of fixing incorrect, incomplete, duplicate or otherwise erroneous data in a data set (a short example appears after this list).
  • data co-op - A data co-op is a group organized for sharing pooled data from online consumers between two or more companies.
  • data collection - Data collection is the process of gathering data for use in business decision-making, strategic planning, research and other purposes.
  • data compliance - Data compliance is a process that identifies the applicable governance requirements for data protection, security, storage and other activities, and establishes policies, procedures and protocols that ensure data is fully protected from unauthorized access and use, malware and other cybersecurity threats.
  • data compression - Data compression is a reduction in the number of bits needed to represent data (a short example appears after this list).
  • data context - Data context is the network of connections among data points.
  • data curation - Data curation is the process of creating, organizing and maintaining data sets so they can be accessed and used by people looking for information.
  • data currency (data as currency) - Data currency is the monetary value assigned to data so that it can be used as the unit of exchange in a transaction, either as the sole payment or in combination with money.
  • data deduplication - Data deduplication -- often called intelligent compression or single-instance storage -- is a process that eliminates redundant copies of data and reduces storage overhead.
  • data deduplication hardware - Data deduplication hardware is disk storage that eliminates redundant copies of data and retains one instance to be stored.
  • data deduplication ratio - To calculate the deduplication ratio, divide the capacity of the backed-up data before duplicates are removed by the actual capacity used once the backup is complete (a worked example appears after this list).
  • Data Definition Language (DDL) - Data Definition Language (DDL) is used to create and modify the structure of objects in a database using predefined commands and a specific syntax (a short example appears after this list).
  • data democratization - Data democratization is the process of making information in a digital format accessible to the average end user.
  • data destruction - Data destruction is the process of destroying data stored on tapes, hard disks and other forms of electronic media so that it is completely unreadable and cannot be accessed or used for unauthorized purposes.
  • data dictionary - A data dictionary is a collection of descriptions of the data objects or items in a data model for the benefit of programmers and others who need to refer to them.
  • data discovery platform - A data discovery platform is a complete set of tools for detecting patterns in data, as well as outlier results that fall outside those patterns.
  • data discrimination (data censorship) - Data discrimination, also called discrimination by algorithm, is bias that occurs when predefined data types or data sources are intentionally or unintentionally treated differently than others.
  • data dredging (data fishing) - Data dredging -- sometimes referred to as data fishing -- is a data mining practice in which large data volumes are analyzed to find any possible relationships between them.
  • Data Dynamics StorageX - Data Dynamics StorageX is a software suite that specializes in data migration and Microsoft Distributed File System management.
  • Data Encryption Standard (DES) - Data Encryption Standard (DES) is an outdated symmetric key method of data encryption.
  • data engineer - A data engineer is an IT worker whose primary job is to prepare data for analytical or operational uses.
  • data exfiltration (data extrusion) - Data exfiltration, also called data extrusion, is the unauthorized transfer of data from a computer.
  • data exhaust - Data exhaust is a byproduct of user actions online and consists of the various files generated by web browsers and their plug-ins, such as cookies, log files and temporary internet files.
  • data exploration - Data exploration is the first step in data analysis involving the use of data visualization tools and statistical techniques to uncover data set characteristics and initial patterns.
  • data fabric - A data fabric is an architecture and software offering a unified collection of data assets, databases and database architectures within an enterprise.
  • data federation software - Data federation software is programming that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis.
  • data feed - A data feed is an ongoing stream of structured data that provides users with updates of current information from one or more sources.
  • data flow diagram (DFD) - A data flow diagram (DFD) is a graphical or visual representation using a standardized set of symbols and notations to describe a business's operations through data movement.
  • data glove - A data glove is an interactive device, resembling a glove worn on the hand, which facilitates tactile sensing and fine-motion control in robotics and virtual reality.
  • data governance policy - A data governance policy is a documented set of guidelines for ensuring that an organization's data and information assets are managed consistently and used properly.
  • data gravity - Data gravity is an attribute of data that is manifest in the way software and services are drawn to it relative to its mass (the amount of data).
  • data historian - A data historian is a software program that records the data created by processes running in a computer system.
  • data hygiene - Data hygiene is the collective processes conducted to ensure the cleanliness of data.
  • data in motion - Data in motion, also referred to as data in transit or data in flight, is digital information that is being transported between locations, either within or between computer systems.
  • data in use - Data in use is data that is currently being updated, processed, accessed and read by a system.
  • data ingestion - Data ingestion is the process of obtaining and importing data for immediate use or storage in a database.
  • data integration - Data integration is the process of combining data from multiple source systems to create unified sets of information for both operational and analytical uses.
  • data integrity - Data integrity is the assurance that digital information is uncorrupted and can only be accessed or modified by those authorized to do so.
  • data journalism - Data journalism is an approach to writing for the public in which the journalist analyzes large data sets to identify potential news stories.
  • data labeling - Data labeling, in the context of machine learning, is the process of detecting and tagging data samples.
  • data lake - A data lake is a storage repository that holds a vast amount of raw data in its native format until it is needed for analytics applications.
  • data lakehouse - A data lakehouse is a data management architecture that combines the benefits of a traditional data warehouse and a data lake.
  • data latency - Data latency is the time it takes for data packets to be stored or retrieved.
  • data life cycle - The data life cycle is the sequence of stages that a particular unit of data goes through from its initial generation or capture to its eventual archival and/or deletion at the end of its useful life.
  • data lifecycle management (DLM) - Data lifecycle management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its lifecycle: from creation and initial storage to when it becomes obsolete and is deleted.
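
A short example of data aggregation, as promised in the "data aggregation" entry above; this is a minimal sketch in Python, and the sales records and region names are invented for illustration.

    from collections import defaultdict

    # Gather individual records and express them in summary form:
    # here, per-region sales totals. The records are hypothetical.
    sales = [("east", 100), ("west", 250), ("east", 75), ("west", 50)]
    totals = defaultdict(int)
    for region, amount in sales:
        totals[region] += amount
    print(dict(totals))  # {'east': 175, 'west': 300}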
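
A minimal data cleansing sketch, as promised in the "data cleansing" entry above: it drops empty values, normalizes casing and whitespace, and removes duplicates. The sample rows are invented.

    # Fix incomplete, inconsistent and duplicate values in a small data set.
    rows = [" Alice ", "BOB", "alice", "", "Bob", None]
    cleaned = sorted({r.strip().lower() for r in rows if r and r.strip()})
    print(cleaned)  # ['alice', 'bob']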
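
To make the "data compression" entry above concrete, this sketch uses Python's standard zlib module to shrink a repetitive byte string and verify that it decompresses losslessly.

    import zlib

    # Highly repetitive input compresses well; the printed byte counts
    # show the reduction in the number of bits needed to represent it.
    original = b"abcabcabcabcabc" * 20
    compressed = zlib.compress(original)
    assert zlib.decompress(compressed) == original  # lossless round trip
    print(len(original), "bytes ->", len(compressed), "bytes")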
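
The worked example promised in the "data deduplication ratio" entry above; the capacities are hypothetical.

    def dedupe_ratio(logical_capacity: float, stored_capacity: float) -> float:
        """Capacity before duplicates are removed, divided by capacity used."""
        return logical_capacity / stored_capacity

    # Hypothetical backup: 10 TB of protected data consumes 2 TB on disk.
    print(f"{dedupe_ratio(10.0, 2.0):.0f}:1")  # prints 5:1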
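
A short illustration of the "Data Definition Language (DDL)" entry above, using Python's built-in sqlite3 module; the table and column names are made up, and SQLite's dialect stands in for any database's DDL syntax.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    # CREATE, ALTER and DROP are DDL commands: they define and modify the
    # structure of database objects rather than the data stored in them.
    conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("ALTER TABLE employees ADD COLUMN hired_on TEXT")
    conn.execute("DROP TABLE employees")
    conn.close()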