She decided to try something else. Scarlett next uploaded a couple of pictures of herself, curious whether they would lead to pictures of her relatives. They didn’t, but the results stunned her anyway: tucked under some recent images of herself and mistaken matches showing photos of Britney Spears and the pop star’s sister, Jamie Lynn, were pictures of a younger version of Scarlett. They were pictures of a dark time she didn’t totally remember — a time at age 19 when, she said, she traveled to New York and was coerced into engaging in humiliating and, at times, violent sexual acts on camera.
“I’m looking at these pictures, and all I can think is that somebody has photoshopped my face onto porn,” Scarlett told CNN Business in an interview.
What happened to her in New York in 2005 was so traumatic that she tried to take her own life in the weeks that followed, she said, and in 2018 she began going by the last name Scarlett (she officially changed her name in December 2021).
She’s worked hard to overcome past trauma. Based in Kirkland, Washington, she’s spent years working as a software engineer. She’s raising her daughter, and she’s a recovering drug addict. After leaving Apple in late 2021 — she has pending complaints against Apple that are being investigated by the National Labor Relations Board (Apple did not respond to a request for comment) — she started a job as a senior software engineer at video game developer ControlZee in March.
But with a few clicks of a mouse, PimEyes brought back a real-life nightmare that occurred nearly two decades ago. She has since tried and failed to get all of the explicit photos removed from PimEyes’ search results, despite the site saying it would scrub images of Scarlett from results. As of this week, sexually explicit images of Scarlett could still be found via PimEyes.
Giorgi Gobronidze, who identified himself to CNN Business as the current owner and director of PimEyes (he said he bought the company from its previous owners in December), said he wishes nobody would experience what Scarlett went through, which he acknowledged as “very, very painful.”
“However, just simply saying, ‘I don’t want to see images’ or ‘I don’t want to see the problem’ doesn’t make the problem disappear,” he said. “The problem isn’t that there is a search engine that can find these photos; the problem is there are the photos and there are people who actually uploaded and did it on purpose.”
More people will “undoubtedly” have experiences like Scarlett’s, said Woodrow Hartzog, a professor of law and computer science at Northeastern University. “And we know from experience that the people who will suffer first and suffer the hardest are women and people of color and other marginalized communities for whom facial-recognition technology serves as a tool of control.”
As Scarlett put it, “I can’t imagine the horrible pain of having that part of my life exposed not by me — by somebody else.”
“You may find this interesting”
Scarlett’s discovery of the stash of photos on PimEyes was my fault.
Scarlett and I talked, via Twitter’s private messages, about the strangeness of this experience and the impacts of facial-recognition software.
PimEyes’ images come from a range of websites, including company, media and pornography sites — the last of which PimEyes told CNN Business in 2021 it includes so people can search online for any revenge porn in which they may unknowingly appear. The company says it doesn’t scrape images from social media.
“You may find this interesting,” I wrote, introducing my article.
Shortly after that, she sent me a message: “oh no.”
Processing the results
It took Scarlett time to process what she was seeing in the results, which included images related to the forced sex acts that were posted on numerous websites.
At first, she thought it was her face pasted on someone else’s body; then, she wondered, why did she look so young? She saw one image of her face, in which she recalls she was sitting down; she recognized the shirt she was wearing in the photo, and the hair.
She sent me this photo, which appears benign without Scarlett’s context — it shows a younger version of herself, with dark brown hair parted in the center, a silvery necklace around her neck, wearing a turquoise tank top.
She saved a copy of this image and used it to conduct another search, which she said yielded dozens more explicit images, many aggregated on various websites. Some images were posted to websites devoted to torture porn, with words like “abuse,” “choke,” and “torture” in the URLs.
“And it was just like,” Scarlett said, pausing and making a kind of exploding-brain sound as she described what it was like to stare at the images. In an instant, she realized how memories she had of her brief time in New York didn’t all match up with what was in the photos.
“It’s like there’s this part of my brain that’s hiding something, and part of my brain that’s looking at something, and this other part of my brain that knows this thing to be true, and they all just collided into each other,” she said. “Like, this thing is no longer hidden from you.”
Adam Massey, a partner at CA Goldberg Law who specializes in issues such as non-consensual pornography and technology-facilitated abuse, said for many people he’s worked with it can feel like “a whole new violation” every time a victim encounters these sorts of images.
“It’s incredibly painful for people and every time it’s somewhere new it is a new jolt,” he said.
Not only did Scarlett see more clearly what had happened to her; she also knew that anyone who looked her up via PimEyes could find the images. Whereas in past decades such imagery might have been confined to DVDs, photos or VHS tapes, “it’s forever on the internet and now anybody can use facial-recognition software and find it,” she said.
Scarlett quickly upgraded her PimEyes subscription to the $80-per-month service, which helps people “manage” their search results, such as by omitting their image results from PimEyes’ public searches.
Scarlett got help in sending out DMCA takedown requests to websites hosting images she wanted taken down, she said. She isn’t the copyright owner of the images, however, and the requests were ignored.
Scarlett is angry that PimEyes works on an opt-out basis rather than letting people opt in. The website also doesn’t require users to prove who they are before they can run a search, a step that might prevent some forms of misuse of the service (say, an employer looking up prospective employees or a stalker looking up victims).
Gobronidze said PimEyes operates this way because it doesn’t want to amass a large database of user information, such as photographs and personal details. It currently stores the facial geometry associated with photos, but not the photos themselves, he said.
“We do not want to turn into a monster that has this huge number of people’s photography,” he said.
“It’s definitely not very accessible,” Lucie Audibert, a legal officer with London-based human rights group Privacy International, said of the opt-out process.
Scarlett did opt out, saying she asked PimEyes to remove her images from its search results in mid-March.
It was about more than that, though, she said.
“We need to look at facial recognition software and how it’s being used, in terms of [how] we’re losing our anonymity but also the far-reaching consequences of losing that anonymity and letting anybody put in a picture of our face and find everywhere we’ve been on the internet or in videos,” she said.
In early April, Scarlett upgraded to PimEyes’ $300 “advanced” tier of service, which includes the ability to conduct a deeper web search for images of your face. That yielded yet more explicit pictures of herself.
“Your potential results containing your face are removed from our system,” PimEyes said in an email confirming her opt-out request.
Gobronidze told CNN Business that PimEyes generally takes no more than 24 hours to approve a user’s opt-out request.
“The images will resurface”
But as of May 19, plenty of images of Scarlett — including sexually explicit ones — were still searchable via PimEyes. I know because I paid $30 for one month’s access to PimEyes and searched for images of Scarlett with her permission.
Below the results, PimEyes’ website encouraged me to pay more: “If you would like to see what results can be found using a more thorough search called Deep Search, purchase the Advanced plan,” it read, with the last four words underlined and linked to PimEyes’ pricing plans.
Next, I tried an image of Scarlett from 2005 that she instructed me to use: the one of her in a sleeveless turquoise top with a necklace on, which she said was the same image she sent to PimEyes to opt her out of its search results. The results were far more disturbing.
Alongside a handful of recent photos of Scarlett from news articles were numerous sexually explicit images that appeared to be from the same time period as the image I used to conduct the search.
This shows the opt-out process “sets people up to fight a losing battle,” Hartzog, the law professor, said, “because this is essentially like playing whack-a-mole or Sisyphus forever rolling the boulder up the hill.”
“It will never stop,” he said. “The images will resurface.”
Gobronidze acknowledged that PimEyes’ opt-out process doesn’t work how people expect. “They simply imagine that they will upload a photo and this photo will disappear from the search results,” he said.
The reality is more complicated: Even after PimEyes approves an opt-out request and blocks the URLs of similar-seeming photos, it can’t always stamp out all images of a person that have been indexed by the company. And it’s always possible that the same or similar photos of a person will pop up again as the company continuously crawls the internet.
Gobronidze said users can include multiple pictures of themselves in an opt-out request.
Scarlett still has questions, such as what PimEyes plans to do to prevent what happened to her from happening to anyone else. Gobronidze said part of this will come via making it clearer to people how to use PimEyes, and through improving its facial-recognition software so that it can better eliminate images that users don’t want to show up in the site’s search results.
“We want to ensure that these results are removed for once and all,” he said.
Scarlett, meanwhile, remains concerned about the potential for facial-recognition technology in the future.
“We need to take a hard stop and look at technology — especially this kind of technology — and say, ‘What are we doing? Are we regulating this enough?'” she said.