Yet again, US courts side with AI, letting big tech get away with stealing.
Carter spent years building up Diversity Photos, which he founded because minority communities have been historically overlooked in stock photography. Carter says the company invested heavily in recruiting real, everyday people and obtained proper consent for the photos.
After launching in 2016, Diversity Photos saw strong growth and partnered with Adobe in 2018 via a Stock Contributor Agreement. “The deal was a revenue-share arrangement,” Carter explains. “We’d supply the content, Adobe would distribute it through Adobe Stock, and we’d both benefit. We granted Adobe a license to distribute and promote our work to their end customers for mutual benefit — it literally says this in the contract.”
“So when I found out that Adobe had fed our entire library into their AI training pipeline without asking, without a separate license, without any compensation — it was devastating,” Carter says. “They didn’t just use a few images. They ingested our content and used it to build products that now directly compete with us. Adobe’s AI can now generate the same kind of diverse imagery that we spent years and significant resources creating. They took our competitive advantage and turned it into their feature.”
Carter points out that the entire value proposition of Diversity Photos is its scarcity, along with the quality of the content. If a tech company wants to use the archive to train an AI model, Carter says, he would charge a premium for a perpetual license.
After Adobe informed Carter that it had a right to train its AI with Diversity Photos, the company offered $1,173.93 as a bonus fee. “Not compensation, not a licensing fee — a bonus,” says Carter. “As if they were doing me a favor. And they made it clear they didn’t even believe they were required to pay that under the agreement. … It showed a fundamental disrespect for the value of what we’d created and the communities we represent,” he says.
Carter and his attorney filed for arbitration in June 2024. Adobe then moved for summary judgment, which the arbitrator granted in full except for one of Diversity Photos’ claims: negligence. The negligence claim alleged that Adobe put Diversity Photos’ archive online without any watermarks or copyright management information. As a result, other AI companies such as Midjourney, Stability AI, and Google were also able to train on the photo library without compensation. But even on that claim, Carter didn’t win.
“We believe the arbitration process was fundamentally flawed,” Carter says. “The arbitrator refused to address the cost issue that was contractually required before ruling on Adobe’s motion to dismiss everything.”
Adobe has since moved to have the arbitration award confirmed in court as precedent for future cases. “This has nothing to do with any of our claims so he ruled on something that doesn’t pertain to our case but will be used and referenced against creators for decades,” Carter adds. “We’re asking the court to vacate the award on multiple grounds under California’s arbitration statute.”
Back when Carter signed his agreement with Adobe in 2018, generative AI products didn’t exist. “The contract I signed gave Adobe a license to use my images for ‘developing new features and services [to promote my work],’” Carter says. But while he assumed that wording was there for Adobe to improve its platform, the company assumed otherwise.
“Adobe’s argument? The word ‘new’ means they can do anything new. Anything. Including something that didn’t exist when I signed the contract. Including AI training. Including building a tool that directly competes with the very content I licensed to them… That’s why we’re fighting to have this ruling vacated. Not just for Diversity Photos, but because if this interpretation stands, it sets a precedent that guts creator rights across the board. No contract from the pre-AI era should be interpreted as a blank check for AI training. The creators who signed those agreements never could have imagined this use, and the platforms know it.”
“The irony is that our content is valuable precisely because it’s hard to create — real people, real diversity, real consent… Generative AI threatens to commoditize all of that. And the companies building these AI models are profiting from our work without compensating us.”