dimensionality reduction

What is dimensionality reduction?

Dimensionality reduction is the process of reducing the number of dimensions -- or features -- in a data set. The goal is to decrease the data set's complexity while keeping the most important properties of the original data.

Dimensionality reduction benefits AI developers and data professionals who work with massive data sets, visualize data and analyze complex data. It also aids data compression, letting the data take up less storage space and reducing computation times. The technique is commonly used in machine learning (ML).

Dimensionality reduction is carried out through techniques such as feature selection and feature extraction. Each technique, in turn, uses several methods that simplify the modeling of complex problems, eliminate redundancy and reduce the possibility of the model overfitting.

Why is dimensionality reduction important for machine learning?

Machine learning requires large data sets to train and operate properly. Dimensionality reduction is a particularly useful way to prevent overfitting and to simplify classification and regression problems.

This process also preserves the most relevant information while reducing the number of features in a data set. Dimensionality reduction removes irrelevant features from the data, as irrelevant data can decrease the accuracy of machine learning algorithms.

What are different techniques for dimensionality reduction?

There are two common dimensionality reduction techniques: feature selection and feature extraction.

  • In feature selection, a small subset of the most relevant features is chosen from the original, higher-dimensional data set using filtering, wrapper or embedded methods. The goal is to reduce the data set's dimensionality while keeping its most important features.
  • Feature extraction combines and transforms the data set's original features to create new features. The goal is to create a lower-dimensional data set that still preserves the key properties of the original data.

Feature selection uses different methods, such as the following; a brief code sketch of each appears after the list:

  • The filter method. Scores each feature with a statistical measure and keeps only the most relevant ones, independent of any particular ML model.
  • The wrapper method. Trains an ML model on candidate subsets of features and uses the model's performance to decide whether a feature should be added or removed.
  • The embedded method. Performs feature selection during model training itself, for example through regularization that shrinks the weights of unimportant features toward zero.
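
As a rough illustration only, the sketch below shows one common way each of these approaches might be implemented with scikit-learn; the synthetic data, parameter values and choice of estimators are assumptions made for the example, not part of the definition.

    from sklearn.datasets import make_classification
    from sklearn.feature_selection import RFE, SelectFromModel, SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression

    # Synthetic data: 30 features, only a handful of which are informative.
    X, y = make_classification(n_samples=200, n_features=30, n_informative=5, random_state=0)

    # Filter method: score each feature with a statistic (here, the ANOVA F-value)
    # and keep the top k, independent of any downstream model.
    X_filter = SelectKBest(score_func=f_classif, k=5).fit_transform(X, y)

    # Wrapper method: recursively fit a model and drop the weakest features
    # until the requested number remain.
    X_wrapper = RFE(LogisticRegression(max_iter=1000), n_features_to_select=5).fit_transform(X, y)

    # Embedded method: L1 regularization during training drives unimportant
    # coefficients toward zero; only features with nonzero weights are kept.
    sparse_model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    X_embedded = SelectFromModel(sparse_model).fit_transform(X, y)

    print(X.shape, X_filter.shape, X_wrapper.shape, X_embedded.shape)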

Feature extraction uses methods such as the following; a short sketch follows the list:

  • Principal component analysis (PCA). A statistical method that transforms the original features into a smaller set of uncorrelated variables, called principal components, that capture most of the variance in the data.
  • Linear discriminant analysis (LDA). A supervised method that finds the combinations of features that best separate different classes of data.
  • T-distributed stochastic neighbor embedding (t-SNE). An unsupervised, nonlinear method that models the similarities between pairs of points as a probability distribution and then finds a low-dimensional map whose pairwise similarities match that distribution.
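
To make the contrast concrete, the sketch below applies all three methods to the same data set with scikit-learn; the data set and parameter values are assumptions chosen for illustration.

    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.manifold import TSNE

    X, y = load_digits(return_X_y=True)  # 1,797 samples with 64 features each

    # PCA: project onto the directions of greatest variance (unsupervised).
    X_pca = PCA(n_components=2).fit_transform(X)

    # LDA: find the feature combinations that best separate the class labels (supervised).
    X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

    # t-SNE: nonlinear embedding that preserves local neighborhood structure.
    X_tsne = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

    print(X.shape, X_pca.shape, X_lda.shape, X_tsne.shape)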

Other methods used in dimensionality reduction include the following; a short sketch of the high correlation filter appears below:

  • Factor analysis.
  • High correlation filter.
  • Uniform manifold approximation and projection (UMAP).
  • Random forest.
Dimensionality reduction can be enacted using a variety of techniques and methods.
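
As one example from the list above, a high correlation filter simply drops one feature from every highly correlated pair. A minimal sketch with pandas follows; the toy columns and the 0.9 threshold are illustrative assumptions.

    import numpy as np
    import pandas as pd

    # Toy data set in which column "b" is nearly a copy of column "a".
    rng = np.random.default_rng(0)
    a = rng.normal(size=200)
    df = pd.DataFrame({
        "a": a,
        "b": a + rng.normal(scale=0.01, size=200),
        "c": rng.normal(size=200),
    })

    # High correlation filter: compute absolute pairwise correlations, look at the
    # upper triangle only and drop one column from every pair above the threshold.
    corr = df.corr().abs()
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [col for col in upper.columns if (upper[col] > 0.9).any()]
    reduced = df.drop(columns=to_drop)

    print(to_drop)                # ['b']
    print(list(reduced.columns))  # ['a', 'c']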

Benefits and challenges of dimensionality reduction

Dimensionality reduction has benefits, such as the following:

  • Improved performance. Reducing the number of features removes irrelevant data and lowers the data's complexity, which speeds up training and can improve model accuracy.
  • Easier visualization. High-dimensional data is more difficult to visualize than lower-dimensional data; reducing a data set to two or three dimensions makes it possible to plot.
  • Prevents overfitting. High-dimensional data can lead to overfitting in ML models, which dimensionality reduction helps prevent.
  • Reduced storage space. The process eliminates irrelevant data, so the reduced data set requires less storage space.

The process does come with downsides, however, such as the following:

  • Data loss. Ideally, dimensionality reduction preserves enough information that the original data can be approximately reconstructed. In practice, the process often discards some information, which can affect how training algorithms perform.
  • Interpretability. It might be difficult to understand the relationships between original features and the reduced dimensions.
  • Computational complexity. Some reduction methods might be more computationally intensive than others.
  • Outliers. If not detected and handled, outliers can skew the results of dimensionality reduction.

To improve the performance of an ML model, dimensionality reduction can also be used as a data preparation step, as the sketch below illustrates.
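
A common pattern is to chain the reduction step and the model in a single pipeline so the reduction is fitted only on the training data. The sketch below uses scikit-learn; the data set, estimator and component count are illustrative assumptions rather than part of the definition.

    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Dimensionality reduction as a preparation step: PCA shrinks the 64 pixel
    # features to 20 components before the classifier sees the data.
    model = Pipeline([
        ("reduce", PCA(n_components=20)),
        ("classify", LogisticRegression(max_iter=1000)),
    ])
    model.fit(X_train, y_train)
    print(model.score(X_test, y_test))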

