August 20, 2018

Content Moderation & UGC 101: 3 Key Questions and Answers


Artificial intelligence has seemingly limitless potential to change our world. From parking driverless cars to brewing beer, there’s no corner of the world this tech can’t touch. As these capabilities move from far-off ‘what ifs’ to reality, there’s one area where AI is already making a tremendous impact: content moderation. With computer vision-powered content moderation, businesses can now protect their brands, optimize their workflows and, ultimately, connect with current and potential customers and users without really having to lift a finger.

In this blog, I’ll answer three key questions about content moderation and UGC:

  1. What is content moderation?
  2. How can my business leverage computer vision moderation?
  3. How will a moderation solution benefit my business?

1. What is content moderation?

Simply put, content moderation is the practice of screening and approving content based on specific guidelines to guarantee its appropriateness for the end-user. From articles and videos to photos and audio clips, user-generated content (UGC) is increasing at a breakneck pace—Instagram users alone post almost 50,000 times per minute! Moderation is therefore essential to prevent unsuitable content from slipping through the cracks, protecting both your business and your users.

2. How can my business leverage computer vision moderation?

While moderation has traditionally been performed by human teams, relying solely on staff to monitor and moderate UGC is costly, inefficient, and (as we shared in an earlier blog post) potentially a human rights issue. Fortunately, advances in computer vision (CV) now allow machines to take on more of the heavy lifting, with CV moderation solutions that can be easily tailored to your business’s specific needs.

Consider Photobucket. To prevent users from uploading pictures containing nudity, Photobucket needed a way to review the 2 million images users upload every day. Before implementing a CV moderation solution, Photobucket’s moderation team was only able to review 1% of those new images, meaning objectionable photos were much more likely to go live on the site. Now, every uploaded image passes through their NSFW filter in real time, increasing their hit rate for flagging inappropriate images by 700% and giving them much more control over what is uploaded to the site.
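
In practice, a real-time filter like Photobucket’s amounts to a single model call in the upload path. The Python sketch below shows what that call might look like; the endpoint URL, request payload, and nsfw_probability response field are illustrative assumptions, not any particular vendor’s API.

```python
import requests

# Hypothetical moderation endpoint and credentials -- replace with your provider's.
MODERATION_URL = "https://api.example.com/v1/moderate"
API_KEY = "your-api-key"


def nsfw_score(image_url: str) -> float:
    """Send one image to the (assumed) moderation endpoint and return
    the model's probability that the image contains NSFW content."""
    response = requests.post(
        MODERATION_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"image_url": image_url},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()["nsfw_probability"]  # assumed response field
```

Calling a function like nsfw_score() on every new upload is what lets a filter review 100% of incoming images in real time, rather than the 1% a human team could reach.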

3. How will a moderation solution benefit my business?

The ability to moderate more images in less time is just one of the many benefits of implementing an image moderation solution on your platform. Controlling the content your site displays protects your brand’s integrity in a variety of ways. Shielding users from NSFW images protects key business partnerships by ensuring advertisers aren’t affiliated with obscene content. In addition, moderating images in real time benefits online marketplaces by allowing posts to go live instantly, not only improving the customer experience but also driving revenue.
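
One common pattern for getting both the instant-publish experience and the brand protection described above is to route each upload by its moderation score: clearly safe images go live immediately, clearly objectionable ones are blocked, and the uncertain middle band is held for human review. The thresholds in this sketch are placeholders you would tune against your own data.

```python
def route_upload(score: float) -> str:
    """Decide what happens to an upload based on its NSFW probability.

    The 0.20 and 0.80 cut-offs are illustrative; calibrate them against
    your own tolerance for false positives versus false negatives.
    """
    if score < 0.20:
        return "publish"  # confidently safe: the post goes live instantly
    if score < 0.80:
        return "review"   # uncertain: queue for a human moderator
    return "reject"       # confidently NSFW: block and notify the uploader
```

Routing this way keeps human moderators focused on the small slice of genuinely ambiguous content instead of every single upload.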

Using computer vision transforms moderation from an onerous task into an automated one, and as AI and machine learning evolve, so does the ability to categorize content based on a variety of criteria. Businesses that rely heavily on user-generated content should take advantage of computer vision to increase productivity, protect brand integrity, and improve the customer experience. Whether you’re looking to filter out NSFW images or screen for copyright infringement, computer vision is a force multiplier for the amount of data your platform can process.