Database management

Terms related to databases, including definitions about relational databases and words and phrases about database management.

3-T - EXP

  • 3-tier application architecture - A 3-tier application architecture is a modular client-server architecture that consists of a presentation tier, an application tier and a data tier.
  • 99.999 (Five nines or Five 9s) - In computers, 99.999 (five nines or five 9s) refers to a system or service that is available 99.999 percent of the time, which allows for roughly 5.26 minutes of downtime per year.
  • access method - In computing, an access method is a program or a hardware mechanism that moves data between the computer and an outlying device such as a hard disk (or other form of storage) or a display terminal.
  • ACID (atomicity, consistency, isolation, and durability) - ACID (atomicity, consistency, isolation, and durability) is an acronym and mnemonic device for learning and remembering the four primary attributes ensured to any transaction by a transaction manager (which is also called a transaction monitor).
  • Active Directory - Active Directory (AD) is Microsoft's proprietary directory service.
  • ActiveX Data Objects (ADO) - ActiveX Data Objects (ADO) is an application program interface from Microsoft that lets a programmer writing Windows applications get access to a relational or non-relational database from both Microsoft and other database providers.
  • AdventureWorks Database - AdventureWorks Database is a sample OLTP database that Microsoft ships with all of its SQL Server database products.
  • Amazon DynamoDB - Amazon DynamoDB is a fully managed NoSQL database service offered by AWS, designed to provide low latency and high performance for applications.
  • Amazon RDS (Relational Database Service) - Amazon Relational Database Service (RDS) is a managed SQL database service provided by Amazon Web Services (AWS).
  • Amazon Simple Database Service (SimpleDB) - Amazon Simple Database Service (SimpleDB), also known as a key value data store, is a highly available and flexible non-relational database that allows developers to request and store data, with minimal database management and administrative responsibility.
  • Apache Giraph - Apache Giraph is real-time graph processing software that is mostly used to analyze social media data.
  • Apache HBase - Apache HBase is a column-oriented key/value data store built to run on top of the Hadoop Distributed File System (HDFS).
  • Apache Hive - Apache Hive is an open source data warehouse system for querying and analyzing large data sets that are principally stored in Hadoop files.
  • application server - An application server is a server program in a computer in a distributed network that provides the business logic for an application program.
  • archive - An archive is a collection of data moved to a repository for long-term retention, to keep separate for compliance reasons or for moving off primary storage media.
  • artifact (software development) - An artifact is a byproduct of software development that helps describe the architecture, design and function of software.
  • AS1 (Applicability Statement 1) - AS1 (Applicability Statement 1) is a specification for Electronic Data Interchange (EDI) communications between businesses using e-mail protocols.
  • Automated License Plate Recognition (ALPR) - Automated License Plate Recognition (ALPR) is a technology that uses optical character recognition (OCR) to automatically read license plate characters.
  • autonomous transaction - In Oracle's database products, an autonomous transaction is an independent transaction that is initiated by another transaction.
  • Azure Data Studio (formerly SQL Operations Studio) - Azure Data Studio is a Microsoft tool, originally named SQL Operations Studio, for managing SQL Server databases and cloud-based Azure SQL Database and Azure SQL Data Warehouse systems.
  • Basic Assembler Language (BAL) - BAL (Basic Assembler Language) is a version of IBM's assembler language (sometimes called assembly language) for its System/360 and System/370 mainframe computers.
  • bioinformatics - Bioinformatics is the science of developing computer databases and algorithms for the purpose of speeding up and enhancing biological research.
  • block - A block is a contiguous set of bits or bytes that forms an identifiable unit of data.
  • blockchain - Blockchain is a record-keeping technology designed to make it extremely difficult to tamper with the system or forge the data stored on it, making that data secure and effectively immutable.
  • business rule - A business rule is a statement that describes a business policy or procedure.
  • C++ - C++ is an object-oriented programming (OOP) language that is viewed by many as the best language for creating large-scale applications.
  • Cassandra (Apache Cassandra) - Apache Cassandra is an open source distributed database system that is designed for storing and managing large amounts of data across commodity servers.
  • CICS (Customer Information Control System) - CICS (Customer Information Control System) is middleware that sits between the z/OS IBM mainframe operating system and business applications.
  • cloud database - A cloud database is a collection of informational content, either structured or unstructured, that resides on a private, public or hybrid cloud computing infrastructure platform.
  • cold backup (offline backup) - A cold backup, also called an offline backup, is a database backup during which the database is offline and not accessible to update.
  • column database management system (CDBMS) - A column database management system (CDBMS) is a DBMS that stores data by column (or column families) instead of by row; offerings differ in other respects, but column-wise storage is the common defining feature.
  • column-level encryption - Column-level encryption is a method of database encryption in which the information in every cell (or data field) in a particular column has the same password for access, reading, and writing purposes.
  • columnar database - A columnar database is a database management system (DBMS) that stores data in columns instead of rows.
  • concurrent processing - Concurrent processing is a computing model in which multiple processors execute instructions simultaneously for better performance.
  • conformed dimension - In data warehousing, a conformed dimension is a dimension that has the same meaning to every fact with which it relates.
  • cooked data - Cooked data is raw data after it has been processed - that is, extracted, organized, and perhaps analyzed and presented - for further use.
  • correlated subquery - A correlated subquery is a SQL query nested within an outer query that references columns from the outer query, so it must be re-evaluated for each row the outer query processes.
  • CouchDB - CouchDB is an open source document-oriented database based on common web standards.
  • CRM analytics - CRM (customer relationship management) analytics comprises all programming that analyzes data about customers and presents it to help facilitate and streamline better business decisions.
  • cryptographic nonce - A nonce is a random or semi-random number that is generated for a specific, one-time use, such as authenticating a single session or transaction.
  • customer data integration (CDI) - Customer data integration (CDI) is the process of defining, consolidating and managing customer information across an organization's business units and systems to achieve a "single version of the truth" for customer data.
  • customer segmentation - Customer segmentation is the practice of dividing a customer base into groups of individuals that are similar in specific ways relevant to marketing, such as age, gender, interests and spending habits.
  • data - In computing, data is information that has been translated into a form that is efficient for movement or processing.
  • data abstraction - Data abstraction is the reduction of a particular body of data to a simplified representation of the whole.
  • data aggregation - Data aggregation is any process whereby data is gathered and expressed in a summary form.
  • data analytics (DA) - Data analytics (DA) is the process of examining data sets in order to find trends and draw conclusions about the information they contain.
  • data availability - Data availability is a term used by computer storage manufacturers and storage service providers to describe how data should be available at a required level of performance in situations ranging from normal through disastrous.
  • data center infrastructure management (DCIM) - Data center infrastructure management (DCIM) is the convergence of IT and building facilities functions within an organization.
  • data cleansing (data cleaning, data scrubbing) - Data cleansing, also referred to as data cleaning or data scrubbing, is the process of fixing incorrect, incomplete, duplicate or otherwise erroneous data in a data set.
  • Data Definition Language (DDL) - Data Definition Language (DDL) is a standard for commands that define the different structures in a database.
  • data dictionary - A data dictionary is a collection of descriptions of the data objects or items in a data model for the benefit of programmers and others who need to refer to them.
  • data fabric - A data fabric is an architecture and software offering a unified collection of data assets, databases and database architectures within an enterprise.
  • data integrity - Data integrity is the assurance that digital information is uncorrupted and can only be accessed or modified by those authorized to do so.
  • data management as a service (DMaaS) - Data management as a service (DMaaS) is a type of cloud service that provides enterprises with centralized storage for disparate data sources.
  • data mart (datamart) - A data mart is a repository of data that is designed to serve a particular community of knowledge workers.
  • data mining - Data mining is the process of sorting through large data sets to identify patterns and relationships that can help solve business problems through data analysis.
  • data modeling - Data modeling is the process of creating a simplified diagram of a software system and the data elements it contains, using text and symbols to represent the data and how it flows.
  • data preprocessing - Data preprocessing, a component of data preparation, describes any type of processing performed on raw data to prepare it for another data processing procedure.
  • data profiling - Data profiling refers to the process of examining, analyzing, reviewing and summarizing data sets to gain insight into the quality of data.
  • data quality - Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it's up to date.
  • data set - A data set is a collection of data that contains individual data units organized (formatted) in a specific way and accessed by one or more specific access methods based on the data set organization and data structure.
  • data source name (DSN) - A data source name (DSN) is a data structure that contains the information about a specific database that an Open Database Connectivity (ODBC) driver needs in order to connect to it.
  • data splitting - Data splitting is when data is divided into two or more subsets.
  • data store - A data store is a repository for persistently storing collections of data, such as a database, a file system or a directory.
  • data structures - A data structure is a specialized format for organizing, processing, retrieving and storing data.
  • data warehouse - A data warehouse is a federated repository for all the data collected by an enterprise's various operational systems, be they physical or logical.
  • database (DB) - A database is a collection of information that is organized so that it can be easily accessed, managed and updated.
  • database abstraction layer - A database abstraction layer is a programming interface that sits between an application and its database, allowing the application to work with different database management systems through a single, uniform API.
  • database activity monitoring (DAM) - Database activity monitoring (DAM) systems monitor and record activity in a database and then generate alerts for anything unusual.
  • database automation - Database automation is the use of unattended processes and self-updating procedures for administrative tasks in a database.
  • database management system (DBMS) - A database management system (DBMS) is system software for creating and managing databases.
  • database marketing - Database marketing is a systematic approach to the gathering, consolidation and processing of consumer data.
  • database normalization - Database normalization is the process of organizing data into tables so as to reduce redundancy and improve data integrity; it is intrinsic to most relational database designs.
  • database replication - Database replication is the frequent electronic copying of data from a database in one computer or server to a database in another -- so that all users share the same level of information.
  • database-agnostic - Database-agnostic is a term describing the capacity of software to function with any vendor’s database management system (DBMS).
  • Db2 - Db2 is a family of database management system (DBMS) products from IBM that serve a number of different operating system (OS) platforms.
  • deep analytics - Deep analytics is the application of sophisticated data processing techniques to yield information from large and typically multi-source data sets composed of both unstructured and semi-structured data.
  • delimiter - In computer programming, a delimiter is a character that identifies the beginning or the end of a character string (a contiguous sequence of characters).
  • denormalization - Denormalization is the process of adding precomputed redundant data to an otherwise normalized relational database to improve read performance of the database.
  • digital photo album - A digital photo album is an application that allows the user to import graphic image files from a digital camera, memory card, scanner, or computer hard drive, to a central database.
  • dimension - In data warehousing, a dimension is a collection of reference information about a measurable event (fact).
  • dirty data - In a data warehouse, dirty data is a database record that contains errors.
  • distributed database - A distributed database is a database that consists of two or more files located in different sites either on the same network or on entirely different networks.
  • distributed ledger technology (DLT) - Distributed ledger technology (DLT) is a digital system for recording the transaction of assets in which the transactions and their details are recorded in multiple places at the same time.
  • distribution - In marketing, distribution is the process of moving a product from its manufacturing source to its customers.
  • document-oriented database - A document-oriented database is a type of NoSQL database in which data is stored as semi-structured documents, such as JSON, BSON or XML, rather than in fixed rows and columns.
  • DSTP (Data Space Transfer Protocol) - DSTP (Data Space Transfer Protocol) is a protocol that is used to index and retrieve data from a number of databases, files, and other data structures using a key that can find all the related data about a particular object across all of the data.
  • Dublin Core - Dublin Core is an initiative to create a digital "library card catalog" for the Web.
  • ebXML (Electronic Business XML) - EbXML (Electronic Business XML or e-business XML) is a project to use the Extensible Markup Language (XML) to standardize the secure exchange of business data.
  • Eclipse (Eclipse Foundation) - Eclipse is a free, Java-based development platform known for its plugins that allow developers to develop and test code written in other programming languages.
  • employee self-service (ESS) - Employee self-service (ESS) is a widely used human resources technology that enables employees to perform many job-related functions, such as applying for reimbursement, updating personal information and accessing company benefits information -- which was once largely paper-based, or otherwise would have been maintained by management or administrative staff.
  • encoding and decoding - Encoding and decoding are used in many forms of communications, including computing, data communications, programming, digital electronics and human communications.
  • encryption key management - Encryption key management is the administration of tasks involved with protecting, storing, backing up and organizing encryption keys.
  • enhancement - In an information technology product, an enhancement is a noteworthy improvement to the product as part of a new version of it.
  • Entity Relationship Diagram (ERD) - An entity relationship diagram (ERD), also known as an entity relationship model, is a graphical representation that depicts relationships among people, objects, places, concepts or events within an information technology (IT) system.
  • Excel - Excel is a spreadsheet program from Microsoft and a component of its Office product group for business applications.
  • executable - An executable is a file that contains a program in a form the computer can run directly; to execute a program is to run it.
  • export - In a personal computer application, to export is to convert a file into a format other than the one it is currently in.
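The correlated subquery entry above can be illustrated with a short sketch using Python's built-in sqlite3 module; the employees table and its data are hypothetical, chosen only for illustration:

```python
import sqlite3

# Hypothetical table: employees with a department and a salary.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Ann", "eng", 120), ("Bob", "eng", 90), ("Cam", "ops", 70), ("Dee", "ops", 80)],
)

# Find employees earning more than the average salary of their own department.
# The inner SELECT references e.dept from the outer query, so it is
# re-evaluated for each outer row -- that reference is what makes it correlated.
rows = conn.execute("""
    SELECT name FROM employees AS e
    WHERE salary > (SELECT AVG(salary)
                    FROM employees
                    WHERE dept = e.dept)
    ORDER BY name
""").fetchall()
print([r[0] for r in rows])  # ['Ann', 'Dee'] (eng avg is 105, ops avg is 75)
```

An uncorrelated subquery, by contrast, could be computed once and reused for every outer row.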
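The atomicity in ACID can likewise be sketched with sqlite3, whose connection object doubles as a transaction context manager; the accounts table and the simulated failure are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [("alice", 100), ("bob", 50)])
conn.commit()

# Atomicity: a money transfer is two updates that must succeed or fail together.
# "with conn" commits on normal exit and rolls back if an exception escapes.
try:
    with conn:
        conn.execute("UPDATE accounts SET balance = balance - 70 WHERE name = 'alice'")
        # Simulate a crash between the debit and the matching credit.
        raise RuntimeError("simulated failure mid-transfer")
except RuntimeError:
    pass

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # the partial debit was rolled back; balances are unchanged
```

Without the transaction, alice would have been debited 70 with no corresponding credit to bob, leaving the data inconsistent.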
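The data splitting entry can be made concrete with a minimal pure-Python sketch; the function name and the 75/25 split are assumptions, not a standard API:

```python
import random

def train_test_split(data, test_fraction=0.25, seed=0):
    """Shuffle the data deterministically, then carve off a hold-out subset."""
    items = list(data)
    random.Random(seed).shuffle(items)           # seeded for repeatable splits
    cut = int(len(items) * (1 - test_fraction))  # index separating the subsets
    return items[:cut], items[cut:]

train, test = train_test_split(range(20), test_fraction=0.25)
print(len(train), len(test))  # 15 5
```

Libraries such as scikit-learn provide a richer version of this operation (stratification, multiple arrays), but the core idea is the same partitioning shown here.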