AI Privacy

EyeEm Will License Users' Photos To Train AI If They Don't Delete Them

Sarah Perez reports via TechCrunch: EyeEm, the Berlin-based photo-sharing community that exited last year to Spanish company Freepik after going bankrupt, is now licensing its users' photos to train AI models. Earlier this month, the company informed users via email that it was adding a new clause to its Terms & Conditions that would grant it the rights to upload users' content to "train, develop, and improve software, algorithms, and machine-learning models." Users were given 30 days to opt out by removing all their content from EyeEm's platform. Otherwise, they were consenting to this use case for their work.

At the time of its 2023 acquisition, EyeEm's photo library included 160 million images and nearly 150,000 users. The company said it would merge its community with Freepik's over time. Despite its decline, almost 30,000 people are still downloading it each month, according to data from Appfigures. Once thought of as a possible challenger to Instagram -- or at least "Europe's Instagram" -- EyeEm had dwindled to a staff of three before selling to Freepik, TechCrunch's Ingrid Lunden previously reported. Joaquin Cuenca Abela, CEO of Freepik, hinted at the company's possible plans for EyeEm, saying it would explore how to bring more AI into the equation for creators on the platform. As it turns out, that meant selling their work to train AI models. [...]

Of note, the notice says that these deletions from EyeEm market and partner platforms could take up to 180 days. Yes, that's right: Requested deletions take up to 180 days, but users only have 30 days to opt out. Since a deletion request can't possibly complete within that window, the only real option is manually deleting photos one by one. Worse still, the company adds that: "You hereby acknowledge and agree that your authorization for EyeEm to market and license your Content according to sections 8 and 10 will remain valid until the Content is deleted from EyeEm and all partner platforms within the time frame indicated above. All license agreements entered into before complete deletion and the rights of use granted thereby remain unaffected by the request for deletion or the deletion." Section 8 is where licensing rights to train AI are detailed. In Section 10, EyeEm informs users they will forgo their right to any payouts for their work if they delete their account -- something users may think to do to avoid having their data fed to AI models. Gotcha!

Comments Filter:
  • GDPR violation? (Score:4, Informative)

    by DrMrLordX ( 559371 ) on Saturday April 27, 2024 @06:14AM (#64429288)

    Um, isn't this nearing a GDPR violation? Aren't you supposed to delete user data on request, instead of forcing users to laboriously delete it themselves?

    • by qbast ( 1265706 ) on Saturday April 27, 2024 @06:27AM (#64429294)
      It is a violation. So is taking 180 days to process it -- a company cannot take more than 30 days to delete data upon request. So I guess this idiocy will get squashed soon.
    • by allo ( 1728082 ) on Saturday April 27, 2024 @06:55AM (#64429308)

      Try asking via a GDPR request and it will be faster. Even US companies can suddenly speed up things that are supposedly slow if you cite the relevant part of the law.

    • by Calydor ( 739835 )

      Not just that, but 'Berlin-based' is going to make it even worse for them. Germany is extremely strict when it comes to photo rights. Last I heard (and it may have changed since, though I doubt it), dashcams are illegal in Germany for that very reason.

      • Perhaps if Germany went ahead and jailed the perps (CEO, board & legal), some of this might stop. Does GDPR have any ability to incarcerate violators?
      • Dashcams are not illegal.
        What's illegal is posting your videos on Facebook and such.

    Um, isn't this nearing a GDPR violation?

      I'm pretty sure that the whole "Default to permission granted unless you take action" thing is a direct violation of GDPR in and of itself.

  • by TheNameOfNick ( 7286618 ) on Saturday April 27, 2024 @06:58AM (#64429314)

    Photos will be used to train AI, even the ones you deleted. (Will be? Have been.) Some lawyers will get rich over suing for the obviously illegal "if you don't do something exceedingly difficult, we'll assume you agree to give us your firstborn" type of bait and switch. It will still be worth it for every company involved because AI. Politicians will still cheer for anyone pretending to be a challenger to established foreign megacorps. Nobody will learn anything, especially not the users. Business as usual.

  • by sonamchauhan ( 587356 ) <`sonamc' `at' `gmail.com'> on Saturday April 27, 2024 @07:07AM (#64429322) Journal

    What if the user dies? Is silence consent to the site's unilaterally changed T&Cs?

    What if a modelling agency does this?

    "Look, so-and-so just died. Quick! Let's change our T&Cs to grab his AI voice and likeness acting rights for-evah!"

    • Time to enact a law stating that terms of service have to be actively accepted by both parties or the old ToS stays in force -- and that the service can't be revoked for not accepting the new ToS.
      The only exception would be for legal reasons.

  • by cascadingstylesheet ( 140919 ) on Saturday April 27, 2024 @07:55AM (#64429358) Journal
    ... you couldn't do stuff like this in euro-paradises like Berlin?
  • by sinij ( 911942 ) on Saturday April 27, 2024 @08:14AM (#64429384)
    Wait until Google, FB and other companies figure out that declaring sham bankruptcy and transferring servers to another numbered company allows them to void all user protections. It won't be just pictures, it will be your tax returns and private emails used for training AI. Which will in turn be used to manipulate your consumer habits, voting patterns, and even personal relationships.
    • by cusco ( 717999 )

      You say "Legislation is needed", but who do you think is going to be responsible for writing and passing that legislation? Some of the people most interested in using that data to manipulate the public for their own personal and party gains.

      Yeah, good luck with that.

  • by bradley13 ( 1118935 ) on Saturday April 27, 2024 @08:20AM (#64429388) Homepage

    the company informed users via email that it was adding a new clause to its Terms & Conditions... Users were given 30 days to opt out.

    Um, no. You don't get to redefine contract terms with an "opt out". They're letting themselves in for a world of hurt (and court cases) if they do this.

  • Even if it is stored locally on your computer, assume malware will copy and train on it if your computer is connected to the internet. Only air-gapped computers running non-proprietary operating systems with encrypted physical media will be safe, and even then the data only needs to be copied somewhere once before it "leaks" into the AI bots scanning the internet.
  • by garryknight ( 1190179 ) <garryknight@gmai l . c om> on Saturday April 27, 2024 @10:35AM (#64429494)

    I just tried to log in, just in case I'd left some photos on there, and got a message telling me they can't log me in and that I should "check my input". I did. It's the same email and password it's been since the start, all those years ago. What *is* going on with this company?!

