March 13, 2019

Facial Recognition ‘Dirty Little Secret’: Millions of Photos Scraped Without Consent

People’s faces are being used without their permission to power technology that could eventually be used to surveil them, legal experts say.

Facial recognition can log you into your iPhone, track criminals through crowds and identify loyal customers in stores.

The technology – which is imperfect but improving rapidly – is based on algorithms that learn how to recognize human faces and the hundreds of ways in which each one is unique.

To do this well, the algorithms must be fed hundreds of thousands of images of a diverse array of faces. Increasingly, those photos are coming from the internet, where they’re swept up by the millions without the knowledge of the people who posted them, categorized by age, gender, skin tone and dozens of other metrics, and shared with researchers at universities and companies.
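Concretely, a trained system typically reduces each face in those images to a list of numbers, often called an embedding, and decides that two photos show the same person when the two lists are close. The short sketch below illustrates only that comparison step, with made-up numbers standing in for real embeddings; the 128-number vector size, the 0.6 cutoff and the function names are illustrative assumptions, not details of IBM’s system or any real product.

    # Illustrative sketch of embedding-based face matching.
    # The embeddings here are random stand-ins; a real system would
    # produce them with a trained neural network.
    import numpy as np

    EMBEDDING_DIM = 128    # assumed vector size, typical of published systems
    MATCH_THRESHOLD = 0.6  # assumed cutoff; real systems tune this on data

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity of two face vectors: 1.0 means identical direction."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def is_same_person(face_a: np.ndarray, face_b: np.ndarray) -> bool:
        """Declare a match when the two embeddings are close enough."""
        return cosine_similarity(face_a, face_b) >= MATCH_THRESHOLD

    # Stand-in embeddings: two noisy views of one face, plus a stranger.
    rng = np.random.default_rng(0)
    person_a = rng.normal(size=EMBEDDING_DIM)
    person_a_again = person_a + rng.normal(scale=0.1, size=EMBEDDING_DIM)
    stranger = rng.normal(size=EMBEDDING_DIM)

    print(is_same_person(person_a, person_a_again))  # True: vectors nearly parallel
    print(is_same_person(person_a, stranger))        # False: unrelated vectors

The need for huge, labeled photo collections follows directly from this design: the network that produces the embeddings only learns which facial differences matter by seeing many examples of many different faces.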

As the algorithms get more advanced – meaning they are better able to identify women and people of color, a task they have historically struggled with – legal experts and civil rights advocates are sounding the alarm on researchers’ use of photos of ordinary people.

These people’s faces are being used without their consent to power technology that could eventually be used to surveil them.

“This is the dirty little secret of AI training sets. Researchers often just grab whatever images are available in the wild,” said NYU School of Law professor Jason Schultz.

The latest company to enter this territory was IBM, which in January released a collection of nearly a million photos that were scraped from the photo hosting site Flickr and coded to describe the subjects’ appearance. IBM promoted the collection to researchers as a progressive step toward reducing bias in facial recognition.

But some of the photographers whose images were included in IBM’s dataset were surprised and disconcerted when NBC News told them that their photographs had been annotated with details including facial geometry and skin tone and may be used to develop facial recognition algorithms. (NBC News obtained IBM’s dataset from a source after the company declined to share it, saying it could be used only by academic or corporate research groups.)
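For a sense of what such an annotation might look like, here is a hypothetical record of the kind a training dataset could attach to a single photo. Every field name and value below is invented for illustration; it mirrors only the categories of labels described in this article (facial geometry, skin tone, age, gender), not the contents of IBM’s actual files.

    # Hypothetical annotation record for one scraped photo.
    # All fields and values are illustrative assumptions, not IBM's format.
    annotation = {
        "photo_url": "https://www.flickr.com/photos/example/12345",  # assumed URL
        "face_bounding_box": {"x": 212, "y": 98, "width": 140, "height": 140},
        "facial_geometry": {
            "inter_eye_distance_px": 62,
            "nose_to_chin_px": 88,
            "face_width_to_height_ratio": 0.81,
        },
        "estimated_skin_tone": 4,          # e.g., a position on a numeric scale
        "estimated_age_range": [25, 32],
        "estimated_gender": "female",
    }

A record like this is what turns an ordinary vacation snapshot into machine-readable training data, which is why subjects and photographers object to being included without notice.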

“None of the people I photographed had any idea their images were being used in this way,” said Greg Peverill-Conti, a Boston-based public relations executive who has more than 700 photos in IBM’s collection, known as a “training dataset.”

“It seems a little sketchy that IBM can use these pictures without saying anything to anybody,” he said.

John Smith, who oversees AI research at IBM, said that the company was committed to “protecting the privacy of individuals” and “will work with anyone who requests a URL to be removed from the dataset.”

Despite IBM’s assurances that Flickr users can opt out of the database, NBC News discovered that it’s almost impossible to get photos removed.