The new online tool ‘Take It Down’ lets teens have nude photos of themselves removed from certain platforms and prevent them from being uploaded in the first place.
The program was developed by the US National Center for Missing & Exploited Children (NCMEC) and is partly funded by Meta, Facebook’s parent company. The tool is specifically aimed at minors.
A spokesman for NCMEC told the British newspaper The Guardian that the program is intended for people who know that a nude photo of them is circulating on the internet, or who fear that such a photo will end up there. Examples include sexually explicit photos sent to a partner, or cases where someone is being blackmailed with nude images.
How does it work?
Anyone can anonymously submit images, from which the site then generates a “hash”, a kind of unique mathematical fingerprint. The nude photo or video itself is never uploaded.
An innovative aspect of the program is that teenagers who suspect nude images of themselves are circulating can now also create a digital fingerprint preventively. Those hashes are placed on a blacklist against which participating social media platforms can check uploaded images. If someone then tries to post the photo on one of those platforms, it can be recognized and blocked.
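The mechanism described above can be sketched in a few lines. This is a deliberate simplification: Take It Down is reported to use hashing of the image rather than the image itself, but the exact hashing scheme is not named in this article, so the sketch below uses an ordinary SHA-256 digest purely to illustrate the submit-hash-then-match flow; the function names and sample byte strings are invented for illustration.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that uniquely identifies these exact image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

# The user's device computes the hash locally and submits only that;
# the image itself never leaves the device.
submitted_image = b"<bytes of the user's nude photo>"
blacklist = {fingerprint(submitted_image)}

def should_block(upload_bytes: bytes) -> bool:
    """A participating platform checks each incoming upload against the blacklist."""
    return fingerprint(upload_bytes) in blacklist

print(should_block(submitted_image))        # the identical file is recognized
print(should_block(b"<some other photo>"))  # a different image has a different hash
```

Because only the digest is shared, the platform can recognize a re-upload of the exact same file without ever possessing the original image.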
Antigone Davis, Meta’s head of safety, told The Guardian that the tool also works with “deepfakes”: artificially manipulated videos of a person that can appear lifelike.
What can’t it do?
A major limitation of Take It Down is that it relies on platforms that have voluntarily joined the initiative. So far, only Facebook, Instagram, OnlyFans, Pornhub and Yubo have indicated that they are participating in the tool. On other platforms, such as TikTok and Telegram, the images cannot be removed. In addition, the blacklist does not work with platforms that send encrypted messages, such as WhatsApp and iMessage.
If the image has been edited, for example because a filter has been applied or an emoji has been added, the hash no longer matches and the photo can no longer be recognized.
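This limitation follows directly from how exact hashing works: change even a single byte and the fingerprint changes completely. The snippet below, a hypothetical illustration using SHA-256 (the article does not specify which hash the tool uses), shows why an edited copy of a blacklisted photo slips past a match on the original hash.

```python
import hashlib

original = b"<bytes of the original photo>"
# Simulate a tiny edit, e.g. an emoji overlay changing some pixels:
edited = original + b"\x01"

h_original = hashlib.sha256(original).hexdigest()
h_edited = hashlib.sha256(edited).hexdigest()

# A one-byte change yields an entirely different digest,
# so the edited image no longer matches the blacklist entry.
print(h_original == h_edited)  # False
```

Systems that need to tolerate minor edits typically use perceptual hashes, which map visually similar images to similar fingerprints, but even those can be defeated by heavier alterations.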
It is also unclear how people are prevented from blacklisting legitimate images, which could then no longer be shared. After all, since the image or video itself is never sent, NCMEC cannot verify which material is being placed on the blacklist.
According to NCMEC, the program is mainly intended to tackle online child abuse and exploitation: “If your own nude images are online, it can be very traumatizing, especially for young people,” a spokesperson for the foundation told The Guardian.
This is not the first time Meta has been involved in such initiatives. In 2017, Facebook, now called Meta, attempted to develop a similar tool for adults. That attempt failed because the company asked volunteers to send in their nude images. In 2021, Meta launched the StopNCII (Stop Non-Consensual Intimate Images) program together with the UK Revenge Porn Helpline, specifically targeting the spread of revenge porn.