“This programme gives people an emergency option to securely and proactively submit a photo to Facebook. We then create a digital fingerprint of that image and stop it from ever being shared on our platform in the first place. After receiving positive feedback from victims and support organisations, we will expand this pilot over the coming months so more people can benefit from this option in an emergency,” said Antigone Davis, Facebook’s Global Head of Safety.
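The “digital fingerprint” Davis describes is, broadly, a perceptual hash: a compact signature that stays stable when an image is re-encoded or resized, so near-duplicate uploads can be matched against a blocklist. Here is a minimal sketch of that idea in Python — the hash function, threshold and names are illustrative assumptions, not Facebook’s actual PhotoDNA-style system:

```python
# Illustrative average-hash fingerprinting; NOT Facebook's real system.
# Each bit records whether a pixel is brighter than the image's mean,
# so small re-encoding changes usually leave the fingerprint intact.

def average_hash(pixels):
    """Compute a simple perceptual fingerprint from grayscale values."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming_distance(h1, h2):
    """Count the bits where two fingerprints differ."""
    return sum(a != b for a, b in zip(h1, h2))

def matches(candidate, blocklist, threshold=2):
    """An upload is blocked if its fingerprint is within `threshold`
    bits of any previously submitted fingerprint."""
    return any(hamming_distance(candidate, h) <= threshold for h in blocklist)

# Example: an 8-value grayscale strip and a slightly altered copy.
original = [10, 200, 30, 220, 15, 240, 25, 210]
altered  = [12, 198, 33, 219, 14, 242, 26, 208]   # re-encoded / resized copy
blocklist = [average_hash(original)]
print(matches(average_hash(altered), blocklist))  # True: same fingerprint
```

Only the fingerprint — not the photo itself — needs to be stored for matching, which is why a hash-based approach suits this kind of proactive blocking.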

At the time, CEO Mark Zuckerberg said in a post, “We’re focused on building a community that keeps people safe. That means building technology and AI tools to prevent harm. Today we’re rolling out new tools to prevent ‘revenge porn’ from being shared on Facebook, Messenger and Instagram.”


He added, “Revenge porn is any intimate photo shared without permission. It’s wrong, it’s hurtful, and if you report it to us, we will now use AI and image recognition to prevent it from being shared across all of our platforms.”

Facebook launches ML, AI-based technology to detect, block revenge porn

Facebook on Friday announced a new tool to detect revenge porn across its platforms, including Instagram. The company also launched a new online resource hub to help users respond to the abuse. Facebook says the tool is driven by machine learning and artificial intelligence, which allow it to detect “near nude images or videos that are shared without permission on Facebook and Instagram.”


The company adds that the tool can detect non-consensual intimate images before anyone reports them. If an image violates the company’s Community Standards, Facebook moderators will remove it. In some cases, Facebook will also disable the account that shared the content without permission. Users can, however, appeal the ban if they think the company has made a mistake.
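The enforcement flow described above — automated detection, human review, removal, possible account action, and an appeal path — can be summarised as a simple decision sequence. This is a hypothetical sketch of that sequence; the function name, parameters and outcome strings are assumptions for illustration, not Facebook’s internal process:

```python
# Hypothetical sketch of the moderation flow described in the article;
# names and outcomes are illustrative, not Facebook's internal API.

def moderate(detected_by_model, violates_standards, shared_without_permission):
    """Return the enforcement outcome for one flagged image."""
    if not detected_by_model:
        return "no action"
    if not violates_standards:
        return "no action"          # human moderator overrules the model
    if shared_without_permission:
        # Removal plus account action; the ban can be appealed.
        return "image removed, account disabled (appealable)"
    return "image removed"

print(moderate(True, True, True))   # image removed, account disabled (appealable)
```

The key property of the flow is that the machine-learning model only flags content: a human review against the Community Standards decides the final outcome.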