Facebook rushes in to save Malala from further Echesa photoshop intimidation

Last year, there was a controversy between Kakamega Senator Cleophas Malala and former Sports CS Rashid Echesa over an alleged photoshopped image. It was alleged that Echesa photoshopped an image of Malala naked with a woman and shared it on social media platforms.

Facebook is introducing a new AI tool which will detect and remove intimate pictures and videos posted without the subject’s consent.

A Facebook user shows the Facebook app on his phone in Nairobi on October 25, 2018. Photo/Enos Teche

It claims that the machine learning tool will make sure the posts, commonly referred to as ‘revenge porn’, are taken down – saving the victim from having to report them. 

Facebook users or victims of unauthorised uploads currently have to flag the inappropriate pictures before content moderators will review them. 

The company has also suggested that users send their own intimate images to Facebook so that the service can identify any unauthorised uploads. 

Many users are reluctant to share revealing photos or videos with the social-media giant, particularly given its history of privacy failures.

This is the latest attempt to rid the platform of abusive content, after it came under fire when moderators claimed they were developing post-traumatic stress disorder.

Facebook is using them as ‘human filters’ for the most horrific content on the internet, according to one leading cyber expert.

The company’s new machine learning tool is designed to find and flag the pictures automatically, then send them to humans to review.

Social media sites across the board have struggled to monitor and contain abusive content users upload, from violent threats to inappropriate photos.

The company has faced harsh criticism for allowing offensive posts to stay up too long and sometimes for removing images with artistic or historical value. 

Facebook has said it’s been working on expanding its moderation efforts, and the company hopes its new technology will help catch some inappropriate posts.

The technology, which will be used across Facebook and Instagram, is trained using pictures that Facebook has previously confirmed were revenge porn.

It recognises a 'nearly nude' photo (a lingerie shot, for example) coupled with derogatory text that suggests someone uploaded the photo to embarrass or seek revenge on another person.
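The two-signal check described above can be sketched in code. This is a minimal illustration only, assuming a nudity score from an image classifier and a derogatory score from a text classifier; the function name, scores, and thresholds are hypothetical and do not reflect Facebook's actual system.

```python
# Hypothetical sketch of combining an image signal with a text signal,
# as the article describes. All names and thresholds are illustrative
# assumptions, not Facebook's actual pipeline.

def flag_for_review(nudity_score: float, derogatory_score: float,
                    nudity_threshold: float = 0.8,
                    derogatory_threshold: float = 0.7) -> bool:
    """Flag a post for human review when both signals are high.

    nudity_score: 0..1 output of an image classifier on the photo.
    derogatory_score: 0..1 output of a text classifier on the caption.
    """
    return (nudity_score >= nudity_threshold
            and derogatory_score >= derogatory_threshold)

# A near-nude photo with an abusive caption gets flagged for review...
print(flag_for_review(0.92, 0.85))  # True
# ...while the same photo with a neutral caption does not.
print(flag_for_review(0.92, 0.10))  # False
```

Requiring both signals, rather than either one alone, mirrors the article's point that a lingerie shot by itself is not flagged; it is the combination with derogatory text that suggests revenge intent.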
