Meta funds tool to remove sexual images of minors posted online

The National Center for Missing & Exploited Children (NCMEC) has announced a new platform designed to help remove sexually explicit images of minors from the Internet. Meta revealed in a blog post that it had provided seed funding to create the NCMEC's free "Take It Down" tool, which allows users to anonymously report "nude, partially nude or sexually explicit images or videos" of underage individuals found on participating platforms, have the content removed, and block it from further sharing.

Facebook and Instagram have signed on to integrate the platform, as have OnlyFans, Pornhub and Yubo. Take It Down is designed for minors to self-report images and videos of themselves; however, adults who appeared in such content when they were under the age of 18 can also use the service to report and remove it. Parents or other trusted adults can also make a report on behalf of a child.

The Take It Down FAQ states that users must have the reported image or video on their device to use the service. This content is not submitted as part of the reporting process and therefore remains private. Instead, it is used to generate a hash value: a unique digital fingerprint assigned to each image and video, which can then be provided to participating platforms so they can detect and remove the content from their websites and apps while minimizing the number of people who see the actual content.
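The fingerprint-matching idea described above can be sketched in a few lines of code. This is a minimal illustration, not NCMEC's implementation: the article does not specify Take It Down's actual hashing scheme (industry systems typically use perceptual hashes such as PhotoDNA or PDQ so that resized or re-encoded copies still match), so a plain SHA-256 digest stands in for the fingerprint here.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Hash the raw bytes of an image or video. Only this hex digest is
    # ever shared with platforms -- never the content itself.
    return hashlib.sha256(data).hexdigest()

# A user reports content from their own device; the service keeps only
# the resulting hash. (The byte strings here are stand-ins for real files.)
reported_hashes = {fingerprint(b"example-image-bytes")}

def matches_reported(upload: bytes) -> bool:
    # A participating platform can check an upload against the hash list
    # without ever receiving the reported content.
    return fingerprint(upload) in reported_hashes
```

Note that a cryptographic hash like SHA-256 only matches byte-identical copies; any re-encoding defeats it, which is why real deployments favor perceptual hashing.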

“We created this system because so many children face these desperate situations,” said Michelle DeLaune, NCMEC President and CEO. “Our hope is that kids are aware of this service and are relieved that there are tools out there to help remove the images. NCMEC is here to help.”

The Take It Down service is comparable to StopNCII, a service launched in 2021 that aims to stop the non-consensual sharing of intimate images of people over the age of 18. StopNCII likewise uses hashes to detect and remove explicit content on Facebook, Instagram, TikTok, and Bumble.

In addition to announcing its collaboration with NCMEC in November of last year, Meta launched new privacy features for Instagram and Facebook that aim to protect minors who use the platforms. These include prompting teens to report accounts after blocking suspicious adults, removing the message button from teens' Instagram accounts when viewed by adults with a history of being blocked, and applying stricter privacy settings by default to Facebook users under the age of 16 (or 18 in certain countries).

Other platforms participating in the program have taken steps to prevent and remove explicit content that depicts minors. Yubo, a French social networking app, has implemented a range of AI and human-operated moderation tools that can detect sexual material depicting minors, while Pornhub allows people to directly issue a removal request for illegal or non-consensual content published on its platform.

The five participating platforms have previously been criticized for failing to protect minors from sexual exploitation. A 2021 BBC News report found that kids could easily bypass OnlyFans' age verification systems, while Pornhub was sued by 34 victims of sexual exploitation that same year in a suit alleging the site knowingly profited from videos depicting rape, child sexual exploitation, trafficking, and other non-consensual sexual content. Yubo, described as "Tinder for teens," has been used by predators to contact and rape underage users, and the NCMEC estimated last year that Meta's plan to apply end-to-end encryption to its platforms could effectively conceal 70 percent of the child sexual abuse material currently detected and reported on them.

"When technology companies implement end-to-end encryption, without built-in preventative measures to detect known child sexual abuse material, the impact on child safety is devastating," DeLaune told the Senate Judiciary Committee earlier this month.

A press release for Take It Down notes that participating platforms may use the hashes provided to detect and remove images on "unencrypted or public sites and apps," but it's unclear whether this extends to Meta's use of end-to-end encryption in services like Messenger. We've reached out to Meta to confirm and will update this story if we hear back.


Copyright © 2023 Story Level Media.