Facebook launches AI to detect and prevent sharing of 'revenge porn'

Mar 15 2019, 11:37 pm

Facebook is pushing even harder to remove non-consensual intimate images — commonly referred to as revenge porn — from its platform.

The social media giant is launching new detection technology and an online resource hub to help users when abuse occurs.

Using machine learning and AI, the company can now “proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram.”

Facebook’s older method, in place for several years, relied on photo-matching technology to prevent previously reported images from being reshared.
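
Facebook hasn’t detailed exactly how its matcher works, but the general idea behind photo-matching is to fingerprint a reported image with a perceptual hash and flag any new upload whose fingerprint is nearly identical. Below is a minimal, illustrative sketch in Python using a simple average hash built with the Pillow library; the file names, threshold, and blocklist are hypothetical and do not represent Facebook’s actual system.

```python
from PIL import Image

def average_hash(path, size=8):
    """Downscale to size x size, convert to grayscale, then set one
    bit per pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(h1, h2):
    """Count the bits on which two hashes differ."""
    return bin(h1 ^ h2).count("1")

# Hashes of previously reported images (hypothetical file).
BLOCKLIST = {average_hash("reported_image.jpg")}

def is_known_image(path, threshold=5):
    """Flag an upload if its hash is within a small Hamming
    distance of any hash in the blocklist."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold
               for known in BLOCKLIST)
```

Production matching systems rely on far more robust perceptual hashes that survive cropping, re-encoding, and watermarking; the average hash above is only meant to illustrate the concept.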

Now, Facebook will be able to find non-consensual images faster and provide better support to victims.

Antigone Davis, Facebook’s Global Head of Safety, explains in a statement that “victims are afraid of retribution so they are reluctant to report the content themselves” or “they’re unaware the content has been shared.”

Davis says a specially trained member of Facebook’s team will review the content flagged by the AI.

If the photo or video violates Facebook’s standards, it will be removed and, in most cases, the account that shared it will be disabled.

The company will offer an appeals process, however, for anyone who believes the platform has made a mistake.

Facebook will also launch a victim-support hub in its Safety Centre called “Not Without My Consent.”

Victims will be able to find organizations and resources to support them, including steps they can take to remove the content and prevent it from being shared further.

Daily Hive Staff
