The CPU has long been the standard chip for most computing tasks, including analytics. However, servers equipped with specialized chips can now offload compute-intensive tasks to a dedicated GPU, which improves overall performance.
Microprocessor CPU limits gave rise to specialized chips such as the GPU, the DPU and the FPU -- the last sometimes called a math coprocessor because it handles floating-point mathematics. Such units free up the CPU to focus on more generalized processing tasks.
GPUs for data analytics
GPUs stuck strictly to graphical tasks for a long time, according to Greg Schulz, an independent IT analyst; the current interest in using the GPU for other kinds of processing is relatively new. Graphics processing is math-intensive, and the GPU's ability to churn through heavy mathematical workloads makes it useful for a variety of other purposes. For example, rendering a 3D image requires matrix multiplication -- a kind of math that also proves useful for deep learning and analysis. However, those advanced GPU capabilities aren't a good fit for simply querying data in a database or data warehouse, according to Mike Gualtieri, principal analyst at Forrester Research, a research and advisory company based in Cambridge, Mass.
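To make the matrix multiplication connection concrete, here is a minimal NumPy sketch -- with purely illustrative shapes and values -- showing that rotating points for rendering and computing a neural network layer's output are, at bottom, the same operation:

```python
import numpy as np

# Rendering: rotate 2D points 90 degrees counterclockwise with a rotation matrix.
rotation = np.array([[0.0, -1.0],
                     [1.0,  0.0]])
points = np.array([[1.0, 0.0],
                   [0.0, 1.0]])       # one point per row
rotated = points @ rotation.T         # the kind of matrix multiply GPUs batch in hardware

# Deep learning: a dense layer's forward pass is also a matrix multiply.
batch = np.random.rand(32, 64)        # 32 samples, 64 features each
weights = np.random.rand(64, 16)      # 64 inputs -> 16 neurons
activations = batch @ weights         # shape (32, 16)
```

Because both workloads reduce to the same dense linear algebra, hardware built to do it fast for graphics does it fast for deep learning too.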
Vendors such as Nvidia aim to use GPUs to dramatically accelerate the training of deep learning algorithms in particular. Beyond deep learning, GPUs also speed up tasks that involve visual inspection, searching through image databases and natural language processing. As GPUs become more common, they also become a more cost-effective way to handle such tasks.
"GPUs enable data scientists to spend more time focused on value-added tasks and experiences and [deal with] fewer frustrations stemming from slow-performing systems and tools," said Mathias Golombek, CTO at Exasol, a high-speed database company based in Nuremberg, Germany.
On the other hand, not all tasks are a good fit for GPUs. Much of the GPU's popularity comes from its ability to offload certain intensive tasks from the CPU, but CPUs still suit some data analytics tasks better. For example, running SQL analytics queries against a big data set relies on the CPU's in-memory processing. The best bet for data analytics is to use CPUs and GPUs together.
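As an illustration of the kind of CPU-friendly SQL analytics the analysts describe, here is a minimal sketch using Python's built-in sqlite3 module; the table and data are invented for the example:

```python
import sqlite3

# An in-memory database with a tiny, made-up sales table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 120.0), ("north", 80.0), ("south", 200.0)],
)

# Aggregation queries like this scan, filter and group rows in memory --
# branching, pointer-chasing work that suits the CPU rather than the
# GPU's parallel math units.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
conn.close()
```

Here `rows` comes back as `[('north', 200.0), ('south', 200.0)]` -- a result produced entirely by the CPU, with no dense linear algebra for a GPU to accelerate.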
GPU vs. CPU: How they stack up
When it comes to data analytics, GPUs can handle many tasks at once thanks to their massive parallelism. CPUs, however, are more versatile in the tasks they can perform; GPUs have comparatively limited applicability when it comes to crunching data.
Instead of deciding between CPUs and GPUs for data analytics, organizations should consider whether they can use GPUs as an accelerator to achieve higher performance across the board. For instance, GPUs can quicken the development, training and refining of data science models, because model training parallelizes easily and is therefore a natural fit for the GPU. Offloading that work also keeps CPUs free of heavy and complex model training tasks.
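A rough sketch of why model training parallelizes so well: a single gradient descent step for a simple linear model touches every sample in the batch through one matrix multiply -- exactly the pattern GPUs accelerate. The model, data and learning rate below are illustrative assumptions, not taken from the article:

```python
import numpy as np

# Synthetic linear-regression problem: 1,024 samples, 8 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(1024, 8))
true_w = rng.normal(size=8)
y = X @ true_w

# Batch gradient descent: each step processes all 1,024 samples at once.
w = np.zeros(8)
lr = 0.1
for _ in range(100):
    preds = X @ w                      # all predictions in one matrix multiply
    grad = X.T @ (preds - y) / len(y)  # gradient over the whole batch at once
    w -= lr * grad
```

The per-step work is dominated by those two matrix products, which is why swapping the array backend from a CPU to a GPU speeds up training without changing the algorithm.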
Organizations can also test and experiment with GPUs more easily because the big cloud providers increasingly offer GPU services. AWS, Microsoft Azure and Google Cloud Platform all offer GPU instances, usually aimed at AI workloads. GPUs' specialized purpose and increasing popularity with major vendors could spawn another generation of chips that perform even more specialized analytical learning tasks; Google, for example, has its own proprietary Tensor Processing Unit for such work.