Much has changed in the AI world in the last week. But amid the turmoil at OpenAI, another major moment went comparatively overlooked, because it occurred just before the world’s leading AI company fell to pieces.
On November 15, two days before the OpenAI debacle began, Ed Newton-Rex, vice president of audio at Stability AI for just over a year, resigned from his position leading the company’s audio team over a disagreement about copyright.
“It’s not so much about Stability per se as it is about every large tech company at the moment,” Newton-Rex tells Fast Company. “Every large company takes basically the same position.” That position is that the fair use doctrine lets them do what they want when training models, including training on data scraped from the internet or other databases without consent.
That jars with Newton-Rex. “I’ve always fought against this in my career,” he says. He points out that even the terminology allows for some distancing between what’s actually happening and what AI companies claim is happening. Referring to songs, artworks, and writing as “data” disconnects them from the act of creativity. “Most normal people would think of [it] as creative output,” he says.
Newton-Rex decided to quit his position at Stability AI, overseeing the company’s generative AI audio team, in late October, when the company—alongside others—submitted its response to the US Copyright Office’s inquiry into artificial intelligence. “When everyone submitted their pieces, it became very clear that many companies are still relying on the fair use argument, and trying to justify that to the US Copyright Office,” he says. “It’s just what I really disagree with. So I feel like I did resign from Stability, but I also feel like I resigned from a group of large AI companies who will take the same approach.”
The departure was amicable, Newton-Rex says, declining to go into details. (Emad Mostaque, the founder and CEO of Stability AI, responded to Newton-Rex’s tweet announcing his departure on X just over an hour after it was posted, linking to Stability AI’s explanation of why it believes fair use covers its approach.)
Newton-Rex respects the AI companies’ decisions, even if he doesn’t agree with them. “I think they’re being genuine,” he says. “I think you should never assume malice.” He believes companies genuinely, rather than cynically, think they’re in the right to invoke the fair use exemption in this way. “The fair use exemption is open to interpretation,” he explains. “There’s no clear hard and fast line in the sand.”
One of the defenses that AI companies use when claiming fair use is that their tools are transformative. “In their eyes, what they’re doing is [providing] an assistive technology to creators,” says Newton-Rex. “So obviously, they talk a lot about the fact that they’re helping creators by building these assistive creation tools. And they’re not wrong.” However, that’s not the whole picture, the former executive says. “The problem in my mind is that, inevitably, everyone focuses on the bits of interpretation that work well for them. And the same models that can be used to assist creators are getting so good now, but they clearly can also be used to replace the market for that original work.”
Newton-Rex says it’s this element, the effect on the market for the original work, one of the four factors the US Copyright Office considers when assessing whether companies can claim fair use of copyrighted material, that undercuts AI companies’ claims in his mind.
“Ultimately, I don’t think people are being evil,” he says. “I just think it’s a different interpretation. And it’s not an interpretation that I think will stand over time. It’s not an interpretation I’m happy to get on board with.”
The former vice president of audio remains proud of the work he and his team did at Stability AI, in particular Stable Audio, which allowed users to create new music using artificial intelligence. “We built a very good product that’s gone down very well,” he says. “It was named one of TIME’s best inventions of the year, and we’re really proud of the work we did.” But despite that, he felt uneasy about remaining with the company given its stated position on the use of copyrighted materials. The letter to the US Copyright Office was the straw that broke the camel’s back. “That’s a real opportunity for you to sit down and ask yourself: Is this something I support?” he says. “And, you know, for me, the answer was no.”
While not wanting to betray former colleagues’ trust by discussing internal conversations at Stability AI, Newton-Rex says he was never asked to contribute to drafting the company’s response to the Copyright Office. Nor did he expect to be asked, in large part because his position on fair use was known within the company. He feels he was likely an outlier for his position, both at Stability AI and in the wider sector. “Frankly, clearly my opinion here doesn’t align with the generative AI industry’s current way of doing things,” he says. “Across the industry, I’m pretty confident that I’m in the minority.”
Because he holds that minority view, and because the technology has advanced so quickly, Newton-Rex is concerned about the future. “It’s going to be very, very bad for creators,” he says. “Ultimately, I don’t think that you necessarily should or can slow down the march of technology. And I am certainly not a Luddite. I don’t think we should just go and destroy all these systems because they’re going to take jobs,” he says. “But I do think that you need to go about it in the right way. I do think you need to take a considered approach and make sure you’re doing things in a thoughtful way—and certainly a legal way.”
Newton-Rex hopes that courts and lawmakers will help establish and enshrine the rights of creators in the post-generative AI world. “If the rights of creators are not upheld, I think it’s going to be a very rapid and detrimental thing for the creative industries,” he says. “And for individual creators who’ve ultimately been brought up believing and being told that their copyright might be worth something.” He would rather see a deal broached between Silicon Valley firms and creators, perhaps with the nudge of regulation, that settles “how these technologies can be win-win, at least in the near term.”
It’s part of the reason he quit and spoke out, he says. “Many people will say: ‘Oh, Pandora’s box has been opened, and there’s no going back,’” he explains. “There’s no reason that we can’t reframe how we train these models. I don’t think it’s too late. I do think that we need to have a rapid and society-wide conversation about the contract between creators and these companies.”
But it needs to be done quickly. “I’m very worried about it,” he admits. “If I wasn’t worried about it, I wouldn’t have quit and I wouldn’t have publicly said something.” He confesses to not feeling great about saying these things publicly, “but sometimes something’s got to be said.”