FTC Cracks Down on Facial Recognition Software That Doesn’t Announce Itself

On February 24, 2021, the comment period ended for a consent agreement between the Federal Trade Commission and Everalbum, Inc. (and its successor, Paravision), concerning Everalbum’s use of facial recognition technology. As Mary Hildebrand and I discussed in a recent podcast, this decision offers significant guidance – and warnings – to other organizations that incorporate facial recognition software into their operations.

Everalbum released an application called Ever that allowed users to upload photos and videos from their mobile devices, computers, and social media accounts to the company’s cloud-based storage service, where they could be stored and organized. In February 2017, Everalbum introduced a new feature that used facial recognition technology to group users’ photos by the faces of the people who appear in them. Everalbum represented to all users that it would not apply its facial recognition software to a user’s photos and videos unless the user opted in to that functionality. However, it appears that Everalbum gave that choice only to some Ever app users – specifically those located in Illinois, Texas, Washington, and the European Union, jurisdictions where statutes and regulations prohibited the use of facial recognition technology without an individual’s consent. Everywhere else, where no law required consent, the Ever app applied facial recognition to users’ photos and videos without their consent, even though the company had represented otherwise.

This was Everalbum’s big mistake: the FTC complaint alleges that Everalbum’s representation was “misleading for Ever mobile app users outside of Texas, Illinois, Washington, and the European Union.” That misrepresentation effectively tainted all of the data, conclusions, and algorithmic developments Everalbum obtained from its facial recognition technology. And unfortunately for the company, it relied heavily on that data, as it “combined millions of facial images that it extracted from Ever users’ photos with facial images that Everalbum obtained from publicly available datasets” to create new datasets that could be used to further develop its facial recognition AI. Additionally, although the company represented to users that it would delete their photos and videos after they deactivated their Ever accounts, the FTC found that for some time Everalbum did not do so, instead retaining that content indefinitely.

In response to these allegations, the settlement requires Everalbum to delete:

  • the photos and videos of Ever users who deactivated their accounts;
  • all face embeddings—data reflecting facial features that can be used for facial recognition purposes—the company derived from the photos of Ever users who did not consent to their use; and
  • any facial recognition models or algorithms developed with Ever users’ photos or videos.

The last bullet point – deleting all the models and algorithms developed using the facial recognition technology – is the most damaging part of the settlement. Although the FTC did not fine Everalbum (a decision to which one of the commissioners objected), the required deletion stripped the company of the value it had derived from its facial recognition technology.

There are two key takeaways for organizations that adopt facial recognition software. First, disclosure statements must be accurate. If an organization’s representations concerning its AI are inaccurate or dishonest, the organization jeopardizes the results of the AI and all the value gained from them.

Second, organizations should present those accurate disclosure statements to users and obtain their informed consent. If an organization is ever forced to defend its use of facial recognition technology, or the models and algorithms derived from it, informed consent is a key element of that defense.

The FTC found that about 25% of the approximately 300,000 Ever users who received the opt-in option declined to consent to the use of facial recognition technology. Had Everalbum given every user that choice, it might have lost 25% of its dataset – but 75% of the data, and the models and algorithms built from it, are better than none at all. Organizations should model the expected consent rate among their users and plan accordingly; a back-of-the-envelope calculation appears below. It is better to keep some of the data than to contaminate, and ultimately lose, everything the facial recognition system touched.
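To make that tradeoff concrete, here is a minimal sketch in Python. The user count and opt-out rate come from the FTC’s figures above; the two-scenario framing is an illustrative assumption, not anything prescribed by the settlement.

```python
# Back-of-the-envelope model of the consent tradeoff described above.
# Figures from the FTC settlement: roughly 300,000 users were offered
# the opt-in, and about 25% of them declined consent.

TOTAL_USERS = 300_000
OPT_OUT_RATE = 0.25  # share of users who declined when actually asked

# Scenario A: ask every user for consent and keep only consented data.
retained_with_consent = int(TOTAL_USERS * (1 - OPT_OUT_RATE))

# Scenario B: skip consent, as Everalbum did. The settlement's deletion
# requirement wipes out the data and every model derived from it.
retained_without_consent = 0

print(f"Scenario A (consent obtained): {retained_with_consent:,} users' data retained")
print(f"Scenario B (FTC-ordered wipe): {retained_without_consent:,} users' data retained")
```

Under these assumptions, obtaining consent preserves 225,000 users’ worth of data and every model built on it, while skipping consent leaves nothing once deletion is ordered.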

Cameron Shilling

Cameron is the chair of the Cybersecurity and Privacy group at McLane Middleton. In his 20-plus years as a lawyer, Cameron has managed, litigated, and resolved numerous commercial matters involving data security, technology, business, and employment issues in New Hampshire, Massachusetts, New England, and around the country. Data privacy is a focus of Cameron’s practice, including creating and implementing privacy policies, terms of use agreements, and information use and social media policies; advising clients about workplace privacy, social media, and consumer privacy; and handling data privacy claims asserted against companies.
