In recent times, Facebook and Instagram have been among the fastest channels for the spread of "revenge porn". In many cases, this non-consensual content is taken down, but not before a good number of people have laid their hands on it. Today, the company released new detection technology to curb the dissemination of such intimate media on Facebook and Instagram. Its statement partly reads:
"By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on
Facebook and Instagram...This new detection technology is in addition to our
pilot program jointly run with victim advocate organizations. This program gives people an emergency option to securely and proactively submit a photo to Facebook."
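The statement gives no technical detail, but emergency-submission schemes like the pilot program are commonly built on perceptual hashing: the platform keeps only a compact fingerprint of the submitted photo and compares each new upload against it, so the original image never needs to be stored. The sketch below illustrates that general idea with the open-source Pillow and imagehash packages; the file names, distance threshold, and library choice are assumptions for illustration, not Facebook's actual pipeline.

```python
# Illustrative sketch only: Facebook has not published its matching pipeline.
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash


def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash that tolerates resizing and re-compression."""
    return imagehash.phash(Image.open(path))


def matches(known_hash: imagehash.ImageHash, upload_path: str, threshold: int = 8) -> bool:
    """Flag an upload whose hash is within `threshold` bits of a known hash."""
    distance = known_hash - imagehash.phash(Image.open(upload_path))  # Hamming distance
    return distance <= threshold


# Hypothetical usage: hash the victim-submitted photo once, keep only the hash,
# then screen each later upload against it and route hits to human review.
reference = fingerprint("submitted_photo.jpg")      # hypothetical file name
if matches(reference, "new_upload.jpg"):            # hypothetical file name
    print("Potential non-consensual re-upload; route to human review.")
```

A near-duplicate match like this only catches re-uploads of a known image; the proactive machine-learning detection mentioned in the statement would be a separate classifier trained to spot near-nude imagery in new content.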