The Basics of Marketplace Content Moderation


As with everything, an efficient content moderation strategy starts with defining the goals and framework for your operations.

Content moderation needs vary widely from marketplace to marketplace depending on audience, maturity, location, and many other factors.

To make sure you design a content moderation strategy that fits your marketplace, here are 3 core things you should consider.

This is a guest post by Emil Andersson from Besedo.

What to moderate?

This might sound straightforward, but it’s incredibly important to understand what type of user-generated content your online marketplace hosts. For example, the content on eBay and TaskRabbit is very different.

When you know your user-generated content, you need to consider what you want to moderate. Are you moderating user profiles, listings, or both?

Then evaluate which categories you should, and can, moderate.

Ideally, you should moderate all user-generated content on your marketplace, everything from profiles and listings, to 1-to-1 chats, in all categories. Unfortunately, that’s not always possible, as financial resources or technical capabilities can hold some marketplaces back.

If that’s the case for you, you might need to start by moderating only a selection of high-risk categories. A good way to identify these is to look at which of your categories hold the largest share of refused ads on your marketplace.

Some of the common risk categories we see here at Besedo are:

  • electronics (3%)
  • fashion and accessories (3%)
  • other (typically around 5%)

Alternatively, you can evaluate the content risk levels:

[Image: content risk levels]

Here, you'll usually want to focus on moderating the high-risk content first.
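As an illustration, risk-based prioritization can be as simple as ordering the review queue by a category-to-risk mapping. The categories, risk tiers, and field names below are hypothetical, a minimal sketch rather than a production system:

```python
# Hypothetical sketch: review high-risk categories first.
# The category-to-risk assignments here are illustrative only.
RISK_ORDER = {"high": 0, "medium": 1, "low": 2}

CATEGORY_RISK = {
    "tickets": "high",        # example: frequent scam target
    "electronics": "medium",
    "fashion": "low",
}

def prioritize(listings):
    """Order listings so high-risk categories reach moderators first.
    Unknown categories default to high risk, to be safe."""
    return sorted(
        listings,
        key=lambda item: RISK_ORDER[CATEGORY_RISK.get(item["category"], "high")],
    )

queue = prioritize([
    {"id": 1, "category": "fashion"},
    {"id": 2, "category": "tickets"},
    {"id": 3, "category": "electronics"},
])
# The high-risk "tickets" listing is moved to the front of the queue
```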

How to moderate?

Before the mobile revolution, users were generally okay with waiting a couple of hours before their listings went live. In the post-mobile world, however, the average human attention span has fallen to 8 seconds, making near-instant time-to-site the general rule of thumb.

But there are some cases where this may not be true, or where instant time-to-site isn't financially or technically viable. Have a look at your users' needs and investigate what the best time-to-site commitment is for you to acquire, retain, and convert sellers.

The time-to-site you commit to will play a large role in the structure of your moderation setup. There are many ways to moderate content, but let’s look at the benefits of the two most traditional approaches: pre- and post-moderation.


With post-moderation, content goes live instantly. Although this makes for good seller UX, as listings are live within moments, it also means that any type of content will go live on your site – including unwanted content such as scams, hate speech, nudity, etc.

Post-moderation can be a good option for platforms that rely on manual moderation alone and host time-sensitive content that needs to reach the site instantly. A tip is to build a good flagging system into your community, not only to improve the user experience but also to help your manual moderation team.


Pre-moderation is a great option for online marketplaces that are serious about protecting their users from bad content and want to ensure a top user experience every time. In general, it’s best practice to pre-moderate content on your site to make sure it adheres to your policies before it goes live.

The challenge with pre-moderation, however, can be to ensure short time-to-site. In order to accomplish this, you need to achieve a high level of automation.
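One common way to combine pre-moderation with automation is score-based routing: an automated system scores each listing, clear-cut cases are approved or rejected instantly, and only uncertain cases wait for a human. The score scale and thresholds below are illustrative assumptions, not a recommended configuration:

```python
def route(risk_score):
    """Route a listing based on an automated risk score (0 = safe, 1 = risky).
    Thresholds are hypothetical; tune them against your own data and policies."""
    if risk_score < 0.1:
        return "auto-approve"   # goes live instantly, near-zero time-to-site
    if risk_score > 0.9:
        return "auto-reject"    # blocked before it ever reaches the site
    return "manual-review"      # a human moderator makes the call
```

The tighter the thresholds, the more content a human has to review; the looser they are, the more automated mistakes slip through.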


Automated content moderation

Yes, in 2019, automated content moderation has reached a level of maturity that all online marketplaces need to consider. There are two ways to automate your moderation: automated filters or machine-learning AI. But when should you use which?

When to use automated filters?

There are many use cases where automated filters are incredibly efficient. Filters can be created to catch harassment, personal details, profanities, and so on. A rule of thumb is to use filters for content that can’t be misinterpreted, such as obvious scam keywords.

Two examples where automated filters are effective are 1) spotting specific bad words, such as Nazi, or 2) separating content into different manual moderation queues, e.g. high-priced items.
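A minimal sketch of both uses, assuming a hypothetical word list and price threshold (real filter lists are far larger and continuously curated):

```python
import re

# Illustrative values only: real banned-word lists and price thresholds
# are maintained and tuned per marketplace.
BANNED_WORDS = {"nazi"}
HIGH_PRICE_THRESHOLD = 1000

def filter_listing(title, description, price):
    """Apply simple automated filters to a listing.
    Returns a rejection or the name of the moderation queue to route to."""
    text = f"{title} {description}".lower()
    words = set(re.findall(r"[a-z]+", text))
    if words & BANNED_WORDS:
        return "rejected"             # unambiguous bad word: block outright
    if price >= HIGH_PRICE_THRESHOLD:
        return "high-price-queue"     # expensive item: dedicated manual queue
    return "standard-queue"
```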

When to use AI moderation?

The rest! AI moderation is the best option for online marketplaces that want to make sure their users are safe and have a great experience on their site. A tailored AI solution meets your specific moderation needs: it is built on your data to ensure highly accurate automated decisions in line with your moderation policies. Use AI moderation to catch everything from scams to poor image quality.

Automation level vs quality

The more automation you push for, the less accuracy you can expect. Consider what automation quality you need – think accuracy, precision, and recall – and find a good balance between automation level and your willingness to accept bad content. A good first step is to understand the basic concepts of AI moderation.
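To make the trade-off concrete, precision and recall can be computed from a sample of automated decisions that moderators have spot-checked. The data format here is a hypothetical simplification:

```python
def moderation_metrics(decisions):
    """decisions: list of (predicted_bad, actually_bad) pairs for
    automated calls, as verified by human spot checks."""
    tp = sum(p and a for p, a in decisions)          # bad content caught
    fp = sum(p and not a for p, a in decisions)      # good content wrongly rejected
    fn = sum(not p and a for p, a in decisions)      # bad content missed
    precision = tp / (tp + fp) if tp + fp else 1.0
    recall = tp / (tp + fn) if tp + fn else 1.0
    return precision, recall

precision, recall = moderation_metrics([
    (True, True),    # correctly auto-rejected
    (True, False),   # false positive: hurts sellers
    (False, True),   # false negative: bad content goes live
    (False, False),  # correctly passed
])
```

Low precision means frustrated legitimate sellers; low recall means bad content reaching your users. Where you set the balance depends on which failure costs your marketplace more.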


Take these 3 factors with you and consider them carefully when setting up your content moderation, but also make sure to weigh in other relevant parameters that can affect your unique situation.

A solid content moderation setup will help you ensure high content quality, increase user trust, and improve the overall user experience – what we refer to as the three pillars of marketplace success.

Emil Andersson

Emil is the Marketing Manager at Besedo. Besedo empowers online marketplaces to create user trust, better quality content and better user experience in the digital world. This is achieved through a combination of AI moderation, automated filters and human moderation. Find Emil on LinkedIn.