dimensionality reduction

What is dimensionality reduction?

Dimensionality reduction is a technique for reducing the number of dimensions -- or features -- in a data set. The goal of dimensionality reduction is to decrease the data set's complexity by reducing the number of features while keeping the most important properties of the original data.

Dimensionality reduction benefits AI developers and data professionals who work with massive data sets, visualize data or analyze complex data. It aids data compression, allowing the data to take up less storage space, and it reduces computation times. The technique is commonly used in machine learning (ML).

Dimensionality reduction is carried out through different techniques, chiefly feature selection and feature extraction. Each technique, in turn, uses several methods that simplify the modeling of complex problems, eliminate redundancy and reduce the chance of the model overfitting.

Why is dimensionality reduction important for machine learning?

Machine learning requires large data sets to properly train and operate. Dimensionality reduction is a particularly useful way to prevent overfitting and to solve classification and regression problems.

The process also preserves the most relevant information while reducing the number of features in a data set. In particular, it removes irrelevant features from the data, which can otherwise decrease the accuracy of machine learning algorithms.

What are different techniques for dimensionality reduction?

There are two common dimensionality reduction techniques: feature selection and feature extraction.

  • In feature selection, a small subset of the most relevant features is chosen from the full set of features using filter, wrapper or embedded methods. The goal is to reduce the data set's dimensionality while keeping its most important features.
  • Feature extraction combines and transforms the data set's original features to create new features. The goal is to create a lower-dimensional data set that still preserves the original data's key properties.

Feature selection uses different methods, such as the following:

  • The filter method. Filters a data set into a subset that only has the most relevant features of the original data set.
  • The wrapper method. Feeds candidate feature subsets into an ML model and evaluates the model's performance to decide whether a feature should be added or removed.
  • The embedded method. Performs feature selection as part of model training itself -- for example, regularization techniques such as lasso shrink the weights of irrelevant features toward zero.
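As a rough sketch, the three selection methods above map onto common scikit-learn utilities. The synthetic data set, the choice of five features to keep and the estimators below are illustrative assumptions, not the only way to apply each method:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE, SelectFromModel, SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression

# Synthetic data: 200 samples, 20 features, only 5 of which are informative.
X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=5, random_state=0)

# Filter method: score each feature independently (ANOVA F-test), keep the top 5.
X_filter = SelectKBest(f_classif, k=5).fit_transform(X, y)

# Wrapper method: recursive feature elimination repeatedly fits a model
# and drops the weakest feature until 5 remain.
X_wrapper = RFE(LogisticRegression(max_iter=1000),
                n_features_to_select=5).fit_transform(X, y)

# Embedded method: an L1-regularized model drives irrelevant coefficients
# to zero during training; only features with nonzero weights are kept.
selector = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1))
X_embedded = selector.fit_transform(X, y)

print(X_filter.shape, X_wrapper.shape, X_embedded.shape)
```

Note how the filter method never consults a model, the wrapper method trains one repeatedly, and the embedded method gets selection for free from a single training run.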

Feature extraction uses methods such as the following:

  • Principal component analysis (PCA). A statistical process that transforms the original features into a smaller set of uncorrelated variables, called principal components, that capture as much of the data's variance as possible.
  • Linear discriminant analysis (LDA). A method that finds features that separate different classes of data the best.
  • T-distributed stochastic neighbor embedding (t-SNE). An unsupervised, nonlinear dimensionality reduction method that creates a probability distribution over pairs of objects and then creates a probability distribution over the points in a low-dimensional map.
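As an illustration of feature extraction, PCA might be applied as follows. This sketch assumes scikit-learn is available; the digits data set and the choice of two components are arbitrary examples:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

# 1,797 handwritten-digit images, each a 64-dimensional pixel vector.
X, _ = load_digits(return_X_y=True)

# Extract 2 new features (principal components) from the 64 originals.
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)

print(X.shape, "->", X_2d.shape)
# Fraction of the original variance the 2 components retain.
print("variance explained:", pca.explained_variance_ratio_.sum())
```

The two extracted components are combinations of all 64 original pixels rather than a subset of them, which is what distinguishes extraction from selection.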

Other methods used in dimensionality reduction include the following:

  • Factor analysis.
  • High correlation filter.
  • Uniform manifold approximation and projection (UMAP).
  • Random forest.
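A high correlation filter, for instance, drops features that are nearly duplicates of others. The following sketch uses pandas; the 0.9 threshold and the toy columns are illustrative assumptions:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
a = rng.normal(size=100)
df = pd.DataFrame({
    "a": a,
    "b": a * 2 + rng.normal(scale=0.01, size=100),  # near-duplicate of "a"
    "c": rng.normal(size=100),                      # independent feature
})

# Absolute pairwise correlations between all features.
corr = df.corr().abs()
# Keep only the upper triangle so each feature pair is considered once.
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
# Drop any feature highly correlated (> 0.9) with an earlier one.
to_drop = [col for col in upper.columns if (upper[col] > 0.9).any()]
reduced = df.drop(columns=to_drop)

print(to_drop, reduced.columns.tolist())
```

Here "b" carries almost no information beyond "a", so discarding it loses little while removing a dimension.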
Dimensionality reduction can be enacted using a variety of techniques and methods.

Benefits and challenges of dimensionality reduction

Dimensionality reduction has benefits, such as the following:

  • Improved performance. Reducing the number of features lowers the complexity of the data and the model, removing noise and speeding up training and inference.
  • Improved visualization. High-dimensional data is difficult to plot directly; reducing it to two or three dimensions makes it far easier to visualize.
  • Prevents overfitting. Models trained on high-dimensional data are prone to overfitting, which dimensionality reduction helps prevent.
  • Reduced storage space. Eliminating irrelevant features means the data requires less storage space.

The process does come with downsides, however, such as the following:

  • Data loss. Ideally, dimensionality reduction preserves all the important information in the data, but in practice the process can discard some of it, which can affect how training algorithms perform.
  • Interpretability. It might be difficult to understand the relationships between original features and the reduced dimensions.
  • Computational complexity. Some reduction methods might be more computationally intensive than others.
  • Outliers. Undetected outliers can skew the results of dimensionality reduction, especially for variance-based methods such as PCA.

To improve the performance of an ML model, dimensionality reduction can also be used as a data preparation step. Learn more about data preparation steps for ML.
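As one possible sketch of that preparation step, dimensionality reduction can slot directly into a preprocessing pipeline. This assumes scikit-learn; the scaler, the 20-component count and the classifier are illustrative choices:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_digits(return_X_y=True)

# Scale, reduce 64 features to 20 principal components, then classify.
pipe = make_pipeline(StandardScaler(),
                     PCA(n_components=20),
                     LogisticRegression(max_iter=2000))

# Cross-validation fits the whole pipeline per fold, so the reduction
# is learned only from each fold's training data (no leakage).
score = cross_val_score(pipe, X, y, cv=3).mean()
print(round(score, 3))
```

Placing the reduction inside the pipeline, rather than applying it to the full data set up front, keeps the evaluation honest.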

This was last updated in September 2023
