In recent times, Facebook and Instagram have become some of the fastest channels for spreading "revenge porn". In many cases, this non-consensual content is eventually taken down, but not before a good number of people have already seen it. Today, the company released a new detection technology to curb the dissemination of such nude media on Facebook and Instagram. Its statement partly reads:
“By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram…This new detection technology is in addition to our pilot program jointly run with victim advocate organizations. This program gives people an emergency option to securely and proactively submit a photo to Facebook.”
In addition, Facebook will launch a support center called "Not Without My Consent" on its Safety Center page. Regarding near-nude content, Google has long said that such photos seriously infringe on personal privacy and cause real harm, especially to women. At the request of the victim, Google will remove such nude or pornographic photos from its search results.