OpenAI, Stability AI, Alphabet and other vendors of generative AI systems are dealing with the consequences of the technology's explosive growth.
Months after the initial excitement over the release of generative AI tools such as Dall-E, ChatGPT and Bard, the euphoria is slowly fading as a new dread emerges among the creatives affected by these systems, which generate text and images on their own.
Authors Guild open letter
Most recently, on July 17, the Authors Guild released an open letter signed by noted authors including Margaret Atwood, Eloisa James and Rebecca Wells asking leaders of the generative AI movement to change course in how they use art creators' work to train generative AI systems.
The writers group called on the leaders of generative AI vendors to "mitigate the damage to our profession" by obtaining permission before using copyrighted material in generative AI programs, compensating writers for past and ongoing use of their works to train those programs, and compensating writers fairly for the use of their works in AI output.
"We don't expect AI companies to completely trash the existing large language models and start over," Authors Guild CEO Mary Rasenberger said. "We know they're not going to do that. And we know that the courts aren't going to make them do that."
Instead, the guild proposes a collective extended license enabling generative AI companies to pay for authors' works that were previously used to train generative AI systems. The organization is also asking Congress to enact legislation enabling authors to opt out of their work being used to train AI systems.
The extended license would require authors to be represented by an organization that would offer blanket licenses to AI vendors, negotiate fees on authors' behalf and distribute the payments to them.
The letter comes as AI companies such as OpenAI, Meta and Google face lawsuits accusing them of infringing on copyrighted materials. It also comes amid other calls for a pause in the development of AI systems and warnings that AI poses an extinction risk.
However, instead of a lawsuit approach, the Authors Guild's first step is to try to get legislation in place while negotiating with generative AI vendors.
"If that fails, then yes, we will have to bring a lawsuit also," Rasenberger said, adding that she couldn't disclose which AI vendors the organization is talking to.
A need for legislation
While the creators of generative AI systems have a role to play, lawmakers must also take a stance on the use of copyrighted materials, Gartner analyst Avivah Litan said.
"There's not a very clear line now -- if you synthesize all these copyrighted materials to create something entirely new, is that fair use or not?" she said.
U.S. copyright laws currently allow generative AI work to be copyrighted as long as it looks significantly different from the original work.
However, many vendors need to be held accountable for keeping track of what work is copyrighted, Litan said.
"All these people spent their lives creating valuable original content, and if the new content coming out has a critical mass of the original content used to generate the new content, they should be compensated," Litan continued. "But there's no measurement. The technology got ahead of the rules."
Focusing on compensation
However, the focus on compensation is unsettling to Ethan Rutherford, author and associate professor of English at Trinity College.
"Authors in their work need to be protected under copyright law," Rutherford said. He added that while it's necessary to call for stronger copyright protection as it relates to writers and AI, the focus should not be solely on compensation.
"My worry is that what they're focusing on is how much money they're going to get as AI is using their work to build itself up," he said. "It feels like a shortsighted concern."
While it might seem like AI could create new types of literature that remove humans from the process, Rutherford said writers need to find a way to be indispensable regardless of AI.
However, for the Authors Guild and Rasenberger, the problem is more profound than compensation.
"It's about protecting our literature and arts," she said. The organization wants AI content to be labeled as AI generated and for vendors to disclose what materials they use to train their models.
Compensation would also allow authors to continue to earn a living and create new content. Without new human content, AI technology will not have new materials to train on and will have to train on its own output.
"It becomes reductive," Rasenberger said. "It doesn't function well anymore. Certainly, AI companies recognize they're going to need new content."
Esther Ajao is a news writer covering artificial intelligence software and systems.