Ever app users uploaded billions of photos, unaware they were being used to build a facial recognition system

Ever claims to be a company “dedicated to helping you capture and rediscover your life’s memories.”

When you sign up for an account on your smartphone or desktop, you can grant Ever access to pictures stored in your instant messages, email archive, Dropbox, Instagram, and Facebook account. And why would you want to do that?

Well, Ever’s site gives you a jolly good reason:

Get free, unlimited private backup of all your life’s memories, from anywhere they exist.

Sound too good to be true? Well, you might be right about that.

An NBC News investigation has uncovered that Ever isn’t being completely altruistic. You see, they want to do something with your photos that you might not feel entirely comfortable with:

What isn’t obvious on Ever’s website or app — except for a brief reference that was added to the privacy policy after NBC News reached out to the company in April — is that the photos people share are used to train the company’s facial recognition system, and that Ever then offers to sell that technology to private companies, law enforcement and the military.

In other words, what began in 2013 as another cloud storage app has pivoted toward a far more lucrative business known as Ever AI — without telling the app’s millions of users.

From the sound of things, Ever decided two-and-a-half years ago to switch its business strategy, embracing facial recognition and exploiting the 13 billion images its users had entrusted to it. But what it doesn't seem to have done is clearly communicate that change of direction to its millions of users and give them the choice of whether or not to opt in.

Ever isn't, of course, the only tech company building facial recognition technology. But using customers' private photos to augment its dataset and improve its algorithms, without their explicit, informed consent, is really heinous behaviour.

Whenever you're offered a product for free, ask yourself how the company is planning to make money. Are they hoping to upgrade you to a paid account, planning to bombard you with ads, or looking to exploit your data in some other fashion?

When I can't determine what a company's plan is, or I'm uncomfortable with the answer I dig up, I feel a whole lot happier paying for the service instead.

When you pay for a service you have some power. When you pay nothing, the company couldn’t care less about you.


Graham Cluley is an award-winning keynote speaker who has given presentations around the world about cybersecurity, hackers, and online privacy. A veteran of the computer security industry since the early 1990s, he wrote the first ever version of Dr Solomon's Anti-Virus Toolkit for Windows, makes regular media appearances, and is the co-host of the popular "Smashing Security" podcast. Follow him on Twitter, Mastodon, Threads, Bluesky, or drop him an email.

One comment on “Ever app users uploaded billions of photos, unaware they were being used to build a facial recognition system”

  1. M. Dufrenoy

    "When you pay for a service you have some power. When you pay nothing, the company couldn’t care less about you."

    I take it the "power" to which you refer is to leave for another paid (or free service), albeit just as "heinous" (your words). The difference is the quality of marketing material to cover their crimes.
