Generative AI has already begun its disruption, eliminating jobs in some vertical industries, according to OpenAI CEO Sam Altman, whose company created ChatGPT. But he believes its potential will eventually grow businesses and reshape enterprise AI, helping workers achieve more than they do today.
Altman and Box CEO Aaron Levie, longtime Silicon Valley friends, addressed the hopes -- and fears -- of CIOs attending the online meeting. Levie asked Altman whether he had any idea that ChatGPT, which now claims 200 million daily users, would reach mass adoption so quickly.
"I was pretty sure, but I also believe that no amount of testing ever quite replaces reality," said Altman, who added that internal tests showed the GPT-3 large language model underpinning ChatGPT was less frustrating to use than its predecessors. "You can feel good about it, but you never really know until you put it out into the world."
Generative AI's future in enterprise IT
Altman acknowledged that, on one hand, people have lost jobs to AI as the business world learns how to use tools such as ChatGPT to improve efficiency. On the other, he said he believes AI will eventually take mundane tasks off workers' plates, freeing them to pursue more creative ideas that lead to products and services they can't imagine now.
"I think there's going to be more jobs, better jobs in the future -- way more quality of life, way more wealth than we could possibly understand," Altman said. "But it is going to be really different, and that is not great consolation to anyone who's losing their job of today."
He added that governments in different countries will have to settle on how they regulate AI in certain applications, such as healthcare, and how they protect consumers from overly persuasive AI tools. Levie told Altman that the cost of AI has to come down, too, for businesses to use it more effectively.
Altman predicted that AI costs will fall, but those reductions will be tempered by added functionality that requires more processing power. It's analogous to why smartphones will never run for two days on a single battery charge, despite the efficiency and technology improvements built into each new model, Altman said: We demand new features, and they eat up more power.
Melody Brue, vice president and principal analyst at Moor Insights & Strategy, agreed that overall enterprise AI costs should eventually come down from present levels. AI isn't likely to become a runaway energy drain like non-fungible tokens or cryptomining, though, because developers are figuring out how to limit queries to smaller data sets as they better understand the context behind the questions they're asked.
"That's kind of where I think it's headed," Brue said. "There are a lot of really smart scientists and engineers who are working on the designs of these models -- and the chips that power them -- and the big focus is keeping power usage down. I think that that will inevitably make AI less expensive."
ChatGPT's immediate disruption
A year ago, Levie said, people were going about their lives, business as usual. Then, on Nov. 30, 2022, Altman's company dropped ChatGPT.
"The world explodes, everybody's product roadmap is completely altered, everybody's entire vision for the future of work is altered," Levie said.
To that end, Box has developed a bevy of tools over the past year that let its users surface content with natural language queries via OpenAI's technology. Users can deploy the AI to automate the completion of common documents such as requests for proposal, which Box said contain an average of 77 questions and typically take an employee a day and a half to fill out manually. The company also released Box Hubs, which enables users to publish that content to employee-facing libraries.
Looking ahead, Altman said OpenAI is working on new models that offer the capabilities users are asking for most: personalized AI that understands the individual in context, integration with whatever apps they're using, and smarter AI that offers more accurate feedback.
"They want a more reliable model [that] does what they want more often on the first try, [that] doesn't hallucinate and make up s--- unless they're asking for it," Altman said. "Those are the three areas that we're feeling the most pull from the market to try to excel in and [are] very aligned with our roadmap. We'll get better on all of those."
Don Fluckinger covers digital experience management, end-user computing and assorted other topics for TechTarget Editorial.