AI trends in 2020 marked by expectation shift and GPT-3

In the past year, AI hyperscalers got serious about their machine learning platforms, expectations were reset and transformer networks empowered the GPT-3 language model.

2020 was marked by tragedy, political turmoil, the rise of remote work, economic uncertainty and, of course, COVID-19. For artificial intelligence, however, 2020 was a year of unassuming but encouraging maturity and development.

Growth in transformer networks, natural language processing and machine learning platforms has sustained research and fervor in AI technologies despite widespread enterprise financial concerns. Here are the lessons and trends that defined a tumultuous year.

Evaluated failures and limitations

According to Kevin See, vice president of research at Lux Research, enterprises have learned a lot from failures in 2020 -- namely a failure to see widespread impact.

"Besides conventional tech companies, AI hasn't made the promised impact on most industries (transforming operations, products, business models)," See said. "Going into 2021, I expect these organizations to reflect on why that is."

Organizations have seen what machine learning, RPA and chatbots are capable of and better understand their limitations. This marks the beginning of a change across industries -- where targeted deployment and realistic expectations seem to be the aim of enterprises for the coming years.

[AI timeline graphic: AI continued its progression in 2020 with a notable advancement in transformer networks.]

Expectations and conversations shifted

Kjell Carlsson, an analyst at Forrester Research, found that this year shifted conversations from theoretical to functional deployment of AI in enterprises.

"People are talking about the nuts and bolts and brass tacks of getting [AI] done," Carlsson said. "Versus how they're going to go in and change the future workforce of the enterprise."

He noticed that 2020 marked a change from high-level, vague assertions about AI to more realistic and practical conversations. The shift is not yet widespread, but the bedrock of practicality over flashy promises was laid in 2020.

Carlsson said that though the wider discussion isn't focused on specific strategies, many companies will move away from large AI goals to small implementations in 2021 or 2022.

"Instead of talking about [overall] AI company strategy, [companies] are talking about their NLP strategy within the AI umbrella," Carlsson said.

This is an important marker of artificial intelligence maturity, because it displays a better understanding of strategy and realistic applications. Separating AI into its different technologies and applications allows for a better understanding of expectations -- what ROI means, what boundaries of success look like, and the future of the workforce.

"You need to get to that level within the organization to make sense and talk about an overarching AI strategy," Carlsson said. "If you're talking about it at the level of AI, then it's, it's too high-level, it's too much of an agglomeration of too many different things to actually do anything with."

NLP advanced through transformer networks

See pointed to natural language processing (NLP) as one of the technologies that gained a surprising amount of momentum this year.

"There are really varied use cases for NLP to leverage voice and text data, which can impact the daily life of consumers but also physical industries and healthcare," See said.

The transformer architecture behind modern NLP is designed to handle sequential data and can process that data out of order. This permits parallelization during training and decreases the overall time it takes to train a model. Previously, projects on a tight timetable were difficult to tackle because the amount of training data required meant a significant delay. The development of deep learning transformer networks has changed that landscape -- modeling and discovering relationships and sequences between words with ease and speed.
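
To make that parallelism concrete, here is a minimal sketch of scaled dot-product self-attention, the core operation inside a transformer network, written in plain NumPy. The function name, shapes and random weights are illustrative assumptions, not drawn from any particular model.

```python
# Minimal sketch of scaled dot-product self-attention. Every token position is
# processed at once as matrix operations, which is what lets transformer
# training be parallelized. Shapes and values here are illustrative only.
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model) token embeddings; w_*: (d_model, d_k) projections."""
    q = x @ w_q                                # queries for all positions, computed together
    k = x @ w_k                                # keys
    v = x @ w_v                                # values
    scores = q @ k.T / np.sqrt(k.shape[-1])    # pairwise relevance between every position
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ v                         # each output mixes information from the whole sequence

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 16, 8               # e.g. a five-token sentence
x = rng.standard_normal((seq_len, d_model))
out = self_attention(x,
                     rng.standard_normal((d_model, d_k)),
                     rng.standard_normal((d_model, d_k)),
                     rng.standard_normal((d_model, d_k)))
print(out.shape)                               # (5, 8): one context-aware vector per token
```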

"You know as long as you throw enough compute horsepower at them, you can solve these problems which have a time dimension to them," Carlsson said.

These transformer networks became more relevant in 2020 through NLP development and shone with the release of one of the most remarkable technologies to come out of the year: GPT-3.

Generative Pre-trained Transformer 3

One of the major AI developments of 2020 was the release of Generative Pre-trained Transformer 3 (GPT-3). This autoregressive language model produces humanlike text through deep learning and is one of the most prominent examples of transformer networks modeling relationships between words.

"It's really GPT-3 that is going to burst into the public mind," Carlsson said. "It is now available for startups and the tech titans to use in a way that it wasn't before."

This model can perform sentiment analysis, topic mining and text summarization, and it continues to improve as more data becomes available. On top of this, the model can handle tasks it was never explicitly trained for -- a very real example of machine learning at work.
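
As a rough illustration of the kinds of tasks described above -- not of GPT-3 itself, which is available only through OpenAI's API -- the sketch below runs sentiment analysis and summarization with default open-source transformer models via the Hugging Face transformers library. The example text is made up.

```python
# Sketch of transformer-based NLP tasks using the open-source Hugging Face
# transformers library. These stand in for the GPT-3 capabilities discussed
# above; GPT-3 itself is accessed through OpenAI's hosted API.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")   # downloads a default pretrained model
summarizer = pipeline("summarization")

review = "The rollout was slow, but the final product exceeded expectations."
print(sentiment(review))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

article = ("Transformer networks process whole sequences in parallel, which has "
           "cut training times and enabled very large language models such as "
           "GPT-3 to be trained on enormous text corpora.")
print(summarizer(article, max_length=30, min_length=10, do_sample=False))
```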

"You end up with weird things like [GPT-3] models now suddenly learning how to count without anybody ever teaching it how to," Carlsson said. "Because it's been trained on so many other similar tasks that it can kind of figure it out."

Hyperscalers got serious about machine learning platforms

In recent years, tech giants such as Amazon, Google and Microsoft took a loose approach to machine learning services. According to Carlsson, there was no single place to take an organization through every step of machine learning -- instead, a strategy had to stitch together many different hardware, software and chip offerings.

"You had to cobble together everything from all of these different disparate machine learning services," Carlsson said.

If you wanted a scalable way to do hyperparameter optimization, you were well served. However, if you wanted to carry out all the steps to train and deploy a machine learning model, finding the right resources was an exercise in frustration.
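
As a small, local illustration of the hyperparameter optimization step mentioned above, the sketch below uses scikit-learn's GridSearchCV on a toy dataset. The model, parameter grid and dataset are stand-in assumptions; cloud ML platforms wrap this kind of search, plus the surrounding training and deployment steps, in managed services.

```python
# Minimal local sketch of hyperparameter optimization with scikit-learn.
# Cloud ML platforms offer hosted, scaled-out versions of this same search.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [3, 5, None],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=5,          # 5-fold cross-validation for each parameter combination
    n_jobs=-1,     # evaluate combinations in parallel on local cores
)
search.fit(X, y)

print(search.best_params_)          # the combination with the best cross-validated score
print(round(search.best_score_, 3))
```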

Carlsson sees 2020 as an about-face from two years ago, when these large providers focused much more on partnering with vendors such as Domino and SAS to provide the architecture enterprises need. Now, they are building out complete sets of AI tools that serve a variety of different users and guide them through the entire machine learning process.

"There's a battle to become the AI cloud partner of choice for enterprises," Carlsson said.

IT departments have gotten back into directing enterprise strategy and are looking for the right partner to ensure the successful application of AI. They're trying to standardize on one cloud, and one of their criteria is whether a particular cloud provider can offer a more holistic AI approach.
