Abusive parents searching for children who have fled to refuges. Governments targeting the daughters and sons of political dissidents. Pedophiles stalking the victims of child sexual abuse material.
The online facial recognition search engine PimEyes allows anyone to search for images of children scraped from across the internet, opening up a host of alarming possible uses, an Intercept investigation has found.
Often called the Google of facial recognition, PimEyes returns search results that include images the site labels as “potentially explicit,” which could lead to further exploitation of children at a time when the dark web has catalyzed an explosion of abuse imagery.
“There are privacy issues raised by the use of facial recognition technology writ large,” said Jeramie Scott, director of the Surveillance Oversight Project at the Electronic Privacy Information Center. “But it’s particularly dangerous when we’re talking about children, when someone may use that to identify a child and to track them down.”
Over the years, many child victim advocacy organizations have pushed for police to use surveillance technology to combat trafficking, arguing that facial recognition can help authorities locate victims. One child abuse prevention nonprofit, Ashton Kutcher and Demi Moore’s Thorn, has even developed its own facial recognition tool. But searches on PimEyes for just 30 AI-generated children’s faces yielded dozens of pages of results, showing how easily those same tools can be turned against the people they’re designed to help.
Though The Intercept searched only for fake faces, it found many images of actual children, taken from a wide variety of sources, including charitable groups and educational websites. PimEyes previously came under fire for including photos from social media platforms; those images are no longer included in search results. Instead, the search engine pulls up images from elsewhere on the internet. Some come from personal websites that parents have created anonymously or semi-anonymously to share photos of their children, images that were likely never intended to be seen by strangers. Others appear to be street photographs of kids.
One search for AI-generated children led to images of a real boy in Delaware. A photographer had taken portraits of the family at their home on a sunny spring day. When she posted the portraits in her online portfolio, the photographer omitted the boy’s name and other identifying details, though a determined individual might still be able to locate such information. (The photographer did not respond to requests for comment.)
Another search revealed a girl displaying a craft project at an after-school program in Kyiv, in a photo taken right before the war. A second website showed her at home that spring; by then, Kyiv was under siege, the program had gone remote, and teachers were assigning kids craft projects to do from their kitchen tables.
A third search led to a photo of a 14-year-old British boy featured in a video about the U.K. education system. The commentator gave the boy’s first name and details about the school he attended.
Still another search turned up a photo of a toddler from an American home-schooling blog, where the girl’s mother had revealed her first name and, when the family was traveling, its rough whereabouts.
PimEyes was the brainchild of two Polish developers, who created it on a whim in 2017. It reportedly passed through the hands of an anonymous owner who moved the headquarters to the Seychelles, and then in December 2021 it was purchased by Georgian international relations scholar Giorgi Gobronidze, who had met the site’s creators while lecturing in Poland.
In a wide-ranging video interview that stretched to nearly two hours, Gobronidze offered a vague and sometimes contradictory account of the site’s privacy protections.
Gobronidze said PimEyes was in the process of developing better safeguards to protect children, but his answers varied when asked what those might look like. “It’s a task that was given already to our technical group, and they have to bring me a solution,” he said. “I gave them several options.”
He dismissed the suggestion that parents who post photographs of their children anonymously retain an expectation of privacy. “Parents should be more responsible,” he said. “I have never posted a photo of my child on social media or on a public website.”
“Designed for Stalkers”
On its website, PimEyes maintains that people should only use the tool to search for their own faces, claiming that the service is “not intended for the surveillance of others and is not designed for that purpose.” But the company offers subscriptions that allow people to perform dozens of unique searches a day; the least expensive package, at $29.99 a month, offers 25 daily searches. The premium service allows users to set up alerts for up to 500 images or combinations thereof, so they can be notified when a certain face is discovered on a new site.
Gobronidze claimed that many of PimEyes’s subscribers are women and girls searching for revenge porn images of themselves, and that the site allows multiple searches so that such users can get more robust results. “With one photo, you can get one set of results, and with another photo you can get a totally different set of results, because the index combination is different on every photo,” he said. People who find illicit images of themselves, he added, need additional alerts to track those images. He acknowledged that 500 alerts is a lot but said that, as of Thursday, 97.7 percent of PimEyes subscribers had a lighter account.
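Gobronidze’s point tracks with how facial recognition search generally works: each photo of a face is reduced to a numeric “fingerprint,” called an embedding, and two photos are treated as a match when their embeddings are close. Because different photos of the same person yield somewhat different embeddings, they surface different sets of matches. Below is a minimal sketch of the general technique, using the open-source face_recognition library rather than PimEyes’s undisclosed pipeline; the filenames are hypothetical.

```python
# Sketch: how embedding-based face matching works in general.
# This uses the open-source face_recognition library; PimEyes's actual
# pipeline is proprietary and undisclosed. Filenames are hypothetical,
# and each photo is assumed to contain at least one detectable face.
import face_recognition

# Compute a 128-dimensional embedding for the face in each photo.
image_a = face_recognition.load_image_file("photo_a.jpg")
image_b = face_recognition.load_image_file("photo_b.jpg")
encoding_a = face_recognition.face_encodings(image_a)[0]
encoding_b = face_recognition.face_encodings(image_b)[0]

# Euclidean distance between embeddings; smaller means more similar.
# 0.6 is the library's conventional (tunable) match threshold.
distance = face_recognition.face_distance([encoding_a], encoding_b)[0]
print(f"distance={distance:.3f}", "match" if distance < 0.6 else "no match")
```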
PimEyes’s previous owners marketed it as a way to pry into celebrities’ lives, the German digital rights site Netzpolitik reported in 2020. After criticism, the company reversed course and recast the search engine as a privacy tool. Gobronidze claimed that many of the pitfalls were being fixed under his ownership. “Previously, I can say that PimEyes was tailor-designed for stalkers, [in that] it used to crawl social media,” he said. “Once you dropped a photo, you could find the social media profiles for everyone. Now it is limited only to public searches.”
Many people do not treat PimEyes as an aid to privacy, however. It has been used to identify adults in a number of cases, from so-called sedition hunters working to find perpetrators of the January 6 insurrection to users of the notorious site 4chan seeking to harass women.
Nor do PimEyes’s marketing materials suggest much concern for privacy or ethics. In a version of the “people kill people” argument favored by the U.S. gun lobby, a blog post on the site blithely alludes to its many uses: “PimEyes just provides a tool, and the user is obliged to use the tool with responsibility. Everyone can buy a hammer, and everyone can either craft with this tool, or kill.”
“These things should only be instrumentalized with the clear and knowledgeable consent of users,” said Daly Barnett, a staff technologist at the Electronic Frontier Foundation. “This is just another example of the large overarching problem within technology, surveillance-built or not. There isn’t privacy built from the get-go with it, and users have to opt out of having their privacy compromised.”
“We Do Not Want to Be a Monster Machine”
Alarmingly, search results for AI-generated kids also include images that PimEyes labels as “potentially explicit.” The backgrounds in the labeled images are blurred, and since clicking through to the source URLs could contribute to the exploitation of children, The Intercept could not confirm whether they are, in fact, explicit. Gobronidze said that the labels are assigned in part based on images’ source URLs, and that often the photos are harmless. Representatives of PimEyes will report child sexual abuse images to law enforcement if they come across them, Gobronidze said.
He gave one example to illustrate how the site can be used to uncover illegal or abusive content. A 16-year-old girl had used her parents’ credit card to open an account, Gobronidze said. She soon found revenge porn videos that had been uploaded by an ex-boyfriend — images that likely fit the legal definition of child pornography. He said that PimEyes had issued takedown notices to the websites and advised her to speak with authorities, her parents and psychologists.
Gobronidze wasn’t clear on how he would limit abuse of children through the site, beyond noting that subscribers must have a PayPal account or credit card and that users upload their IDs when they ask PimEyes for takedowns. The search engine, he said, doesn’t register gender, age, race, or ethnicity; it only searches for matching photos. “We do not want to be a monster machine,” he said, dubbing a more heavy-handed approach “Big Brother.” But at another point in the interview, he said he was planning to exclude images of children from search results. He later said that his technical team was working out how to balance these conflicting goals.
PimEyes flags people who “systematically” use the engine to search for children’s faces, he said. Users who search for only one or two children’s faces are usually assumed to be looking for members of their own family. If a PimEyes representative gets suspicious, he said, they might ask a subscriber for a document, such as a birth certificate, to prove that the user is a parent.
When asked how a birth certificate would rule out abuse or stalking by noncustodial parents, Gobronidze said that PimEyes might instead request a signed form, similar to what parents and legal guardians provide in some countries when crossing borders with a child, to show that they have the other parent’s consent. In a later email, he said that PimEyes had twice asked for “documents + verbal explanation” from people who uploaded images of children, and that the site had subsequently banned one of the accounts.
“The fact that PimEyes doesn’t have safeguards in place for children and apparently is not sure how to provide safeguards for children only underlines the risks of this kind of facial recognition service,” said Scott, of EPIC. “Participating in public, whether online or offline, should not mean subjecting yourself to privacy-invasive services like PimEyes.”
The inclusion of children’s faces in PimEyes search results underscores just how fraught the facial recognition landscape has become. Victim advocacy groups have pushed for greater use of the technology by law enforcement for many years. Thorn, the Kutcher-Moore nonprofit, developed Spotlight, a facial recognition tool for identifying victims of sex trafficking; it is available to investigators working on cases as well as to the National Center for Missing & Exploited Children. In a recent report, the center stated that Spotlight had helped identify more than 400 missing children who appeared in online sex trafficking ads in 2021.
Commercial providers of facial recognition have also pitched their products for trafficking prevention. Clearview AI, an infamous facial recognition company, has marketed its tools to police for identifying child victims.
But these same tools can be used to target vulnerable people. After being sued by the American Civil Liberties Union, Clearview AI promoted its child trafficking database as a way to protect domestic violence survivors and undocumented immigrants. And the Prostasia Foundation, a child protection organization that also supports sex workers’ rights and internet freedom, has noted that an earlier Thorn tool flagged images of adult men, which led to the arrests of sex workers.
PimEyes, by contrast, has virtually no safeguards and upends long-standing privacy expectations for children and adults alike.
Gobronidze said that PimEyes had talked to Thorn about using its tool Safer, which detects child sexual abuse material through image hashing technology — a potentially odd relationship given that PimEyes makes images of children searchable to the general public, while Thorn aims to protect children from stalkers and abusers.
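Hashing of this kind generally works by reducing an image to a compact perceptual fingerprint that survives resizing and recompression, so new uploads can be compared against a list of known abuse images without storing or viewing the originals. Below is a minimal sketch of the general technique, using the open-source imagehash library rather than Thorn’s unpublished implementation; the filenames are hypothetical.

```python
# Sketch: perceptual hashing for near-duplicate image detection in general.
# Thorn's Safer implementation is not public; this uses the open-source
# imagehash library, and all filenames are hypothetical.
from PIL import Image
import imagehash

# Perceptual hashes of previously identified images (a hypothetical list).
known_hashes = [imagehash.phash(Image.open("known_image.jpg"))]

def is_near_duplicate(path: str, max_distance: int = 5) -> bool:
    """Return True if the image at `path` nearly matches a known image."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance;
    # small distances survive resizing, recompression, and minor edits.
    return any(candidate - known <= max_distance for known in known_hashes)

print(is_near_duplicate("upload.jpg"))
```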
“There has been one exploratory call between our Safer team and PimEyes to show how Safer helps platforms detect, report and remove CSAM,” a Thorn spokesperson said, using the acronym for child sexual abuse material. “No partnership materialized after that single call and they are not users of Safer or any tools built by Thorn.”
When asked about concerns about its facial recognition tool, Thorn sent a statement through a spokesperson. “Spotlight is a highly targeted tool that was built specifically to identify child victims of sex trafficking and is only available to law enforcement officers who investigate child sex trafficking.”
In the United States, PimEyes could run up against the Children’s Online Privacy Protection Act, a 1998 law that charges the Federal Trade Commission with protecting children’s online privacy. But so far, U.S. regulators have homed in on sites that store images or information, said Emma Llansó, director of the Free Expression Project at the Center for Democracy and Technology, whereas PimEyes crawls images hosted on other websites. “PimEyes is just scraping whatever they can get their hands on on the web and isn’t making promises to users about what it will and won’t do with that data,” Llansó said. “So it’s something of a gray area.”
Gobronidze is well aware of the difference. “We don’t store any photos,” he claimed. “We don’t have any.”
This isn’t true. PimEyes’s Privacy Policy holds that for unregistered users — anyone who uses the site without a paid account — it retains facial images, along with the “fingerprint” of a face, for 48 hours and that data from the photos indexed in results is stored for two years. A sample PimEyes search showed thumbnail images of faces — photos returned in search queries that the site has edited to blur their backgrounds. A network traffic analysis showed that those photos are hosted on a PimEyes subdomain called “collectors.”
In an email, Gobronidze said he had not previously heard or read about that subdomain and was “intrigued” to learn of it. He noted that he had forwarded the results of The Intercept’s analysis to PimEyes’s tech and data security units, adding that he could not “disclose [the] full technological cycle” because it is proprietary.
EPIC’s Scott said that he would rather not wait for regulators or courts to decide on the storage question. “Congress needs to act to not only protect our children, but all of us from the dangers of facial recognition technology,” he said. “Services like this should be banned. That’s how you should regulate it.”