
Photo-sharing community EyeEm will license users’ photos to train AI if they don’t delete them | TechCrunch

EyeEm, the Berlin-based photo-sharing community that was sold last year to Spanish company Freepik after going bankrupt, is now licensing its users' photos to train AI models. Earlier this month, the company informed users via email that it was adding a new clause to its terms and conditions granting it the right to use uploaded content to "train, develop, and improve software, algorithms, and machine-learning models." Users were given 30 days to opt out by deleting all of their content from EyeEm's platform. Otherwise, they were consenting to this use case for their work.

At the time of its 2023 acquisition, EyeEm's photo library included 160 million images and approximately 150,000 users. The company said it would merge its community with Freepik's over time. Despite its decline, about 30,000 people are still downloading the app each month, according to data from Appfigures.

Once thought of as a potential challenger to Instagram, or at least "Europe's Instagram," EyeEm had dwindled to a staff of just three before the sale to Freepik, TechCrunch's Ingrid Lunden previously reported. Freepik CEO Joaquin Cuenca Abela hinted at the company's potential plans for EyeEm at the time, saying it would explore how to bring more AI into the equation for creators on the platform.

As it turns out, this meant selling one’s work to train AI models.

Now, EyeEm's updated terms and conditions read as follows:

8.1 Grant of Rights – EyeEm Community

By uploading content to the EyeEm Community, you grant us regarding your content the non-exclusive, worldwide, transferable and sublicensable right to reproduce, distribute, publicly display, transform, adapt, make derivative works of, communicate to the public and/or promote such content.

This includes a sub-licensable and transferable right to use your content specifically for the training, development and improvement of software, algorithms and machine learning models. If you do not agree with this, you must not add your content to the EyeEm Community.

The rights granted in this Section 8.1 with respect to your content remain valid until it is completely removed from the EyeEm Community and partner platforms in accordance with Section 13. You may request removal of your content at any time. The conditions for this can be found in Section 13.

Section 13 details a convoluted deletion process that starts with deleting photos directly, which will not affect content previously shared to EyeEm Magazine or on social media, the company notes. To remove content from EyeEm Market (where photographers sell their photos) or other content platforms, users must submit a request to [email protected], provide the Content ID numbers for the photos they want removed, and specify whether the photos should be removed from their account as well or only from EyeEm Market.

Of note, the notice states that these removals from EyeEm Market and partner platforms may take up to 180 days. Yes, that's right: requested deletions take up to 180 days, but users only have 30 days to opt out. That means the only real option is to manually delete photos one by one.

What’s worse, the company also says:

You hereby acknowledge and agree that your authorization for EyeEm to market and license your content in accordance with Sections 8 and 10 remains valid until the content is removed from EyeEm and all partner platforms within the time frame stated above. All license agreements entered into before complete deletion and the rights of use granted thereby remain unaffected by the deletion or the deletion request.

Section 8 is where the licensing rights to train AI are detailed. In Section 10, EyeEm informs users that if they delete their account, they forgo any right to payouts for their work, something users might think to do to avoid having their data fed into AI models. Gotcha!

EyeEm's move is an example of how AI models are being trained on users' content, sometimes without their explicit consent. Though EyeEm did offer an opt-out process of sorts, any photographer who missed the announcement would have lost the ability to dictate how their photos could be used going forward. Given how much EyeEm's status as a popular Instagram alternative has declined over the years, many photographers may have forgotten they ever used it in the first place. They may well have ignored the email, if it didn't land in a spam folder somewhere.

Those who did notice the changes were upset that they were given only 30 days' notice and no option to bulk delete their contributions, which made opting out all the more painful.

Requests for comment sent to EyeEm were not immediately answered, but given the countdown's 30-day deadline, we have opted to publish before hearing back.

This sort of dishonest behavior is why users today are considering a move to the open social web. Pixelfed, a federated platform that runs on the same ActivityPub protocol that powers Mastodon, is capitalizing on the EyeEm situation to attract users.

In a post on its official account, Pixelfed announced, "We will never use your images to help train AI models. Privacy first, pixels forever."
