
Generative AI as a copilot for finance and other sectors

While many fear that the popularity of large language models could lead to job loss and replacement, some industries such as finance and education are using AI to augment workers.

NEW YORK -- A year after the introduction of AI chatbot ChatGPT, many organizations have realized that a key value of generative AI systems is their ability to augment humans' work rather than replace workers.

As generative AI has quickly grown in popularity and vendors have released their own chatbots and platforms, it has seemed at times that the technology would replace workers such as artists, lawyers, white-collar office employees and even writers.

Similarly, there were concerns that the new technology was not secure enough to use in an enterprise environment, especially in industries working with personal and sensitive data.

However, amid the barrage of new generative AI tools, the landscape appears to be settling down for enterprises looking to benefit from the technology as a way to supplement their existing workforce.

Edward Jones

For financial advisory and wealth management firm Edward Jones, the value of generative AI has come from using the technology to help advisers support clients while removing some of the drudgery from work, said Shannon Favazza, principal and head of analytics at the firm.

"We think AI can be a great copilot to the adviser [and] augment that adviser's intelligence to make the adviser more productive," Favazza said during a presentation at the AI Summit New York 2023 conference on Dec. 7. "AI brings a great opportunity for advisers to become more productive and be able to serve clients and serve investors who desperately need financial advice and otherwise wouldn't have access to it."

We want people to say, 'Hey, this is the copilot. This is not a job replacement.'
Shannon Favazza, principal and head of analytics, Edward Jones

Edward Jones took steps internally to help financial advisers and C-suite executives see the value from this application of generative AI in finance, Favazza said.

For example, using OpenAI's GPT-3.5, Favazza and her team created generative AI-powered chatbots for Edward Jones employees to experiment with.
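Favazza did not describe how those internal chatbots were built. As a rough illustration only, the sketch below shows one common way to wire up a GPT-3.5-backed chatbot with OpenAI's Python SDK; the system prompt, function name and adviser-copilot framing are hypothetical, not Edward Jones' implementation.

```python
# Minimal sketch of an internal GPT-3.5-powered chatbot (hypothetical; not
# Edward Jones' actual code). Assumes the openai v1.x package and an
# OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_copilot(question, history=None):
    """Send one user question, plus any prior turns, to gpt-3.5-turbo."""
    messages = [
        # Hypothetical system prompt framing the bot as an adviser copilot.
        {"role": "system",
         "content": "You are an internal copilot that helps financial "
                    "advisers draft client communications and summarize notes."},
    ]
    messages += history or []
    messages.append({"role": "user", "content": question})

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_copilot("Summarize the key points of a 401(k) rollover."))
```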

"The point [we made is that] the future is here, it's now, and we're able to do these things -- and everybody else is able to do these things," Favazza said.

The goal was to boost trust in generative AI, she said.

"There's a lot of natural distrust by what people see in the news," she continued. "We want people to say, 'Hey, this is the copilot. This is not a job replacement.'"

Shannon Favazza, principal and head of analytics at Edward Jones, discusses the firm's approach to generative AI at AI Summit New York.

Moody's Analytics

Like Edward Jones, risk analytics firm Moody's Analytics -- a subsidiary of finance giant Moody's -- took an internal approach to generative AI before introducing a product to customers.

While advanced technologies such as quantum computing and blockchain have long been part of Moody's Analytics' IT arsenal, generative AI has introduced a wave of complex models. That can be challenging for a company with large data sets and strict data privacy and security requirements, said Caroline Casey, general manager for customer experience and innovation at the company, in an interview.

Before releasing its Research Assistant product on Dec. 4, Moody's created an internal copilot product -- not to be confused with Microsoft Copilot. Research Assistant is a search and analytical tool built on Azure OpenAI that uses OpenAI's GPT-4.

"We know that the purpose of this is not to replace a human," Casey said. "It's to take out the kind of mundane work -- the trying to find information, the retrieval, the searching -- and actually help them to focus on where they've got the best expertise."

Moody's began its journey in the summer after the CEO encouraged all employees of Moody's Corp. to be innovators.

Building on that idea, the finance company created a cross-organizational team, called the generative intelligence group, that included members from Moody's Analytics and Moody's Investors Service. The group helped build the copilot platform that let all parts of the organization experiment with generative AI technology.

"It was important for us as well to make sure everyone was on the journey," Casey said.

The first stage of building the platform was making sure it was secure. To do this, Moody's used Microsoft Azure OpenAI Service.

"The key for us and a lot of organizations is we don't want our proprietary data to essentially train someone else's language model," Casey said.

Therefore, although OpenAI's GPT-3.5 powered Moody's internal copilot, the organization felt confident that it was not compromising its confidential data.

Azure OpenAI is built for enterprise use, with additional security controls embedded in the service. Consumer-facing ChatGPT, by contrast, carries more risk for enterprises that do not want their proprietary data used to train the model.

The second stage was educating all employees through mandatory training on how to use generative AI. The training covered generative AI ethics, governance and prompt writing.

Moody's also set up an idea board enabling everyone in the organization to submit their ideas about generative AI. To date, employees have submitted 500 ideas to the board, with 20 of those ideas now being implemented in one way or another externally, Casey said.

"That's enabled us to work in a really distributed way and get things done much faster," Casey said.

It also kept different units within the parent company connected through shared underlying technologies, such as Meta's Llama.

Avatar tutor

Finance is among a number of industries experimenting with using generative AI as a copilot.

David Grewell, dean of the College of Engineering and Engineering Technology at Northern Illinois University, came to the conference seeking a partner for a generative AI product he wants to build for students: an avatar tutor, he said in an interview.

"Often, when I see students struggle with fundamentals, I want to do what we can to make sure they're successful," Grewell said.

The avatar would augment students' learning and walk them through the process of reaching a solution to certain engineering problems; Grewell also wants to ensure the generative AI tool is trustworthy.

"The other thing is to make sure we keep the AI within the rails so the student can't trick it," Grewell said.

At the conference, Grewell looked at conversational AI systems such as Amelia and IBM Watsonx.

"I want an interface that's easy for the student," he said. "I don't want them struggling learning how to use it. I want the interface to be seamless and totally intuitive."

Esther Ajao is a TechTarget Editorial news writer covering artificial intelligence software and systems.

Next Steps

Businesses confront reality of generative AI in finance
