Avoid the Zombie Apocalypse with content moderation

  • Jul 19 2022

A date with my seven-year-old nephew is mandatory every vacation he spends with me. This visit too, he had been looking forward to our ‘us time’, but work and the weather had kept me procrastinating. He must have run out of patience, because he sat on my lap and demanded to know what I was working on. I replied, “content moderation”, not expecting him to understand, let alone reply. In a serious tone, he said, “Yes, that is a must.” Now it was my turn to look up, and I decided to interview this Gen Alpha, asking him what content moderation is. Without taking a breath, he replied, “Content moderation is blocking bad stuff from being seen on videos.” It left me amazed at how simple and crisp his reply was.

While the internet is an infinite virtual galaxy of unfathomable possibilities, it holds both the good and the bad. To give you an example, a few months ago a friend who sells her ceramic pottery via an e-commerce platform got blocked because of objectionable photos and lewd comments posted on her page. Like many others, she was unaware of the concept of content moderation. As I explained to her, simply put, content moderation is the process of monitoring user-generated content (UGC) against preset guidelines, scanning for and removing offensive content, thereby ensuring compliance with platform norms.
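In code, that preset-guideline idea can be as simple as checking each submission against a blocklist before it is published. The sketch below is a minimal, illustrative Python example; the blocked terms and sample posts are made up, and real platforms use far more sophisticated rules and models.

```python
import re

# Hypothetical platform guidelines: terms that violate the norms.
BLOCKED_TERMS = {"spamword", "scamlink"}

def moderate(post: str) -> bool:
    """Return True if the post complies with the preset guidelines."""
    # Normalise to lowercase words so simple case tricks don't slip through.
    words = re.findall(r"[a-z0-9']+", post.lower())
    return not any(word in BLOCKED_TERMS for word in words)

# Only compliant user-generated content makes it onto the platform.
posts = ["Lovely handmade ceramic bowl!", "buy now SPAMWORD cheap"]
approved = [p for p in posts if moderate(p)]
print(approved)  # the offending post is filtered out
```

Even this toy version captures the core loop my friend's platform ran on her page: scan incoming UGC, compare it to the rules, and keep non-compliant content from ever being seen.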

In the last few years, there has been a whopping rise in user-generated content, driven by easy internet access and smartphones. Naturally, there has also been a rise in content that is insensitive or violent, or that contains abusive language or hate speech, and such content needs diligent moderation. According to the Data Bridge Market Research report, the global content moderation solutions market is set to grow at a CAGR of 10.7% by 2027. Meanwhile, as found by AdWeek, 85% of users are more influenced by UGC than by professionally generated content (PGC).

While this emphasises the urgent need for content moderation, it also raises a debate on what exactly constitutes offensive content and what the correct parameters for identifying it are.

Any content, be it text, image, video, audio, or any other form, that depicts violence, nudity, or hate speech can be termed sensitive content. The rules that determine which content is sensitive depend on a platform’s specific requirements. If a platform wants to promote debate and discussion, its content rules may permit content that would otherwise be filtered out and considered unsafe. For reference, Brut Media, a French video publisher, sheds light on unreported news through feature-based short and long-form original videos covering politics, entertainment, social justice and culture. The digital media company targets millennials and Gen Z with “socially conscious” news videos, without the use of algorithms; rather, it thrives on sparking conversation on social media with a values-driven approach. On the contrary, Facebook has a stringent, AI-automated content moderation policy deployed at an ex-ante stage, where a video or photograph is screened at submission.

For a platform or business whose users are mostly young and whose content is published almost immediately, automated content moderation is advocated for a better user experience, as it ensures that sensitive content is not exposed to young users. It is no longer a secret that content is a core ingredient of any brand; it portrays the brand’s ethos and integrity. Brands are now vigilant about what goes out there to define their values and objectivity. Convoluted content needs smart moderation. Most brands are striving to create a comprehensive and safe ecosystem that promotes self-expression without violating community guidelines, and creativity without advocating harm and malice. Artificial Intelligence and Machine Learning are significantly helping to sieve large volumes of content with negligible human intervention and, as my nephew stated, “keeping the bad stuff away”.
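The ex-ante screening described above can be pictured as a small decision pipeline: every submission gets a risk score before publication, low-risk items go live instantly, borderline ones are routed to a human reviewer, and clear violations are blocked. The Python sketch below illustrates that flow; the keyword weights and thresholds are made-up stand-ins for a real machine-learning classifier.

```python
# Illustrative risk weights; a real system would use a trained ML model.
RISKY_TERMS = {"hate": 0.9, "violence": 0.8, "spam": 0.5}

def risk_score(text: str) -> float:
    """Crude stand-in for an ML model: max weight of any risky term found."""
    lowered = text.lower()
    return max((w for term, w in RISKY_TERMS.items() if term in lowered),
               default=0.0)

def screen(text: str, publish_below: float = 0.4,
           review_below: float = 0.7) -> str:
    """Ex-ante screening: decide the fate of a submission before it goes live."""
    score = risk_score(text)
    if score < publish_below:
        return "publish"        # negligible human intervention
    if score < review_below:
        return "human_review"   # borderline content gets a second look
    return "block"              # clear violations never reach users

print(screen("cute cat video"))         # published immediately
print(screen("report on spam trends"))  # queued for human review
```

The thresholds are the knob each platform tunes to its own norms: a discussion-friendly site might raise them, while a platform serving young users would lower them, which is exactly the policy difference between the two companies mentioned above.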

My nephew made writing this blog on content moderation an easier task with his simple, innocent words: “bad guys post bad videos, they make everyone hate everyone and start a zombie apocalypse. My mama says don’t watch those videos, else you will turn into a zombie too.” Guess mums are always right, were my last thoughts as I wrapped up the blog and we headed out for our date, excited to have a serious conversation on the zombie apocalypse and alien attacks, with a dose of Peppa Pig of course.