Adobe says it will not use customer projects to train AI

After a terms of service update that infuriated artists, and an initial statement that added fuel to the fire, Adobe has made a clear statement about the new terms of use.


The past 48 hours have been tumultuous for Adobe. Early in the week of June 3, Adobe Creative Cloud users pointed out that the new terms of service allowed Adobe to do whatever it wanted with user projects.

We saw the furore and contacted Adobe about it. They then issued an ambiguous statement on the matter, saying that the terms had always been this way.

“Adobe has access to user content for a number of reasons, including the ability to deliver some of our most innovative cloud-based features, such as Photoshop Neural Filters and Background Removal in Adobe Express, and to take action on prohibited content,” a company spokesperson said at the time. “Adobe cannot access, view, or listen to content stored locally on a user’s device.”

This didn’t help, and it said nothing at all about training AI. It also failed to address cloud storage, which is striking considering that “Cloud” is the third word in the product name.

So we wrote about it on Thursday, after giving Adobe three days to explain and getting nothing in return. On Thursday evening the company finally said something concrete, listing the reasons it may access user content:

  • Access is required so that Adobe applications and services can perform the functions for which they are designed and used (such as opening and editing files for the user or creating thumbnails or a preview for sharing).
  • Access is necessary to deliver some of our most innovative cloud-based features, such as Photoshop Neural Filters, Liquid Mode, or Remove Background.
  • For content processed or stored on Adobe servers, Adobe may use technologies and other processes, including manual (human) review escalation, to screen for certain types of illegal content (such as child sexual abuse material) or other offensive content or behavior.

Thursday’s post specifically states that “Adobe does not train Firefly Gen AI models on customer content” and that “Adobe will never take ownership of a customer’s work.”

This is all well and good. The latter was not actually up for discussion.

The former, however, is strangely specific. It’s good that Adobe stated it doesn’t train Firefly on these materials.

What would have been better is a general statement saying that Adobe will not use customer content for Firefly and will not sell or license it to others to train their models. Other generative AI providers complain that properly licensing content to train models is too difficult, which makes a trove like this a tempting asset.

Adobe needs to be clear that they will not allow this.

The statement also doesn’t address the fact that the terms still appear to conflict with confidentiality agreements that artists may have signed, should they use any of Adobe’s cloud-based services. We’ll see if that is addressed.

Based on our brief conversations this morning with attorneys who specialize in these types of cases, the terms as written do not appear to match Thursday’s statement.

The company says it will clarify the terms-of-use language to reflect the details of Thursday’s message. It is not clear when that will happen.

Adobe has an incredibly large platoon of lawyers, so it would have been better if they had thought about this sooner.

The court of public opinion will have the final say

We’re already seeing comments on social media that Adobe got caught with their hands in the cookie jar, hence the statement. We’re not sure, but either way, the damage has been done.

For now, we can say that we received a lot of emails overnight about our Thursday piece. Most of them outlined multiple pain points with Adobe software that go beyond the terms of service update.

Many say they have already switched to alternatives and won’t look back. So we’ll see whether this statement changes user sentiment.

Interestingly, Adobe did not bother to email us about this statement. And they have yet to respond to the emails we sent on Monday, Tuesday, and Wednesday.