Quality Assurance for Content Moderation

Online content moderation has become an increasingly important and debated topic, and new regulations, such as the EU’s Digital Services Act (DSA), are expected to reinforce this trend. These regulations will create more legally binding obligations for online platforms with respect to content moderation, in order to improve users’ online well-being and the functioning of the online world.

Challenges of Content Moderation

While millions of posts appear every day on social platforms, only a few hundred thousand people work in the content moderation industry. Despite platforms’ plans to recruit more moderators, the workload of each moderator remains very large: they often have to review thousands of posts every day, leaving a very narrow (and stressful) window to decide whether or not a post should be removed. This raises concerns about the accuracy, consistency, and fairness of a company’s content moderation, and about its impact on free speech.

In addition to the very limited time available for moderation decisions, the quality of moderation can also be affected by the AI tools deployed by platforms, the highly contextual nature of many online posts, and the large quantity of content falling in the grey zone between harmful and safe. Moderators’ potential biases further exacerbate the issue. For example, some moderators may be too lenient or too strict relative to company guidelines, and their judgement may vary with how long they have been working that day; some may be accurate on certain categories of content but lack the expertise or training for others; and others may be biased towards specific categories of content (e.g., culturally or politically).

Importance of Quality Assurance

Ensuring the quality of content moderation is a challenge with important implications for the proper functioning of social media and freedom of expression online. Quality assurance (QA) for content moderation is essential to ensure that the right balance between safety and freedom of expression is struck in a fair and effective manner. Poor content moderation can also raise reputational, regulatory, and other business risks for online platforms, including a possible loss of users. QA becomes even more challenging and important as companies outsource content moderation to external providers – whose quality also needs to be continuously monitored. In this context, online platforms are looking for ways to monitor and improve the quality of their moderation processes. Quality can be measured using metrics such as accuracy, consistency, and fairness (e.g., similar cases receive similar decisions). Consistency is critical both over time for each moderator and across moderators.
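To make these metrics concrete, here is a minimal illustrative sketch (not the method from our study) of two of them: accuracy against reference labels, and cross-moderator consistency measured as chance-corrected agreement (Cohen’s kappa) on items both moderators reviewed. The label values and function names are assumptions for illustration.

```python
from collections import Counter

def accuracy(decisions, gold):
    """Fraction of a moderator's decisions that match the reference labels."""
    assert len(decisions) == len(gold)
    return sum(d == g for d, g in zip(decisions, gold)) / len(gold)

def cohens_kappa(a, b):
    """Agreement between two moderators on the same items, corrected for
    the agreement expected by chance given each moderator's label frequencies."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    counts_a, counts_b = Counter(a), Counter(b)
    expected = sum(counts_a[k] * counts_b.get(k, 0) for k in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

For example, two moderators who agree on three out of four "remove"/"keep" decisions, but whose label frequencies differ, get a kappa of 0.5 rather than the raw 0.75 agreement rate.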

The typical quality assurance process for online content moderation is based on regular (for example, weekly) controlled evaluations: after carefully labelling a number of content items (e.g., users’ posts), managers provide them to multiple moderators, which allows them to compute a score for each moderator based on how they perform relative to each other, as well as relative to the labels the company selected for these items.
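This weekly scoring step can be sketched as follows – a simplified illustration under the assumption that each moderator is scored by their fraction of decisions matching the manager-provided labels (the data shapes and names are hypothetical):

```python
def weekly_qa_scores(gold_labels, moderator_decisions):
    """Score each moderator against manager-provided gold labels.

    gold_labels: {item_id: label} chosen by managers for the test items.
    moderator_decisions: {moderator_id: {item_id: label}}.
    Returns {moderator_id: fraction correct on the gold items they reviewed},
    or None for a moderator who reviewed no gold items.
    """
    scores = {}
    for mod, decisions in moderator_decisions.items():
        reviewed = [i for i in decisions if i in gold_labels]
        correct = sum(decisions[i] == gold_labels[i] for i in reviewed)
        scores[mod] = correct / len(reviewed) if reviewed else None
    return scores
```

Scores computed this way are only as reliable as the number of test items each moderator sees, which is exactly the limitation discussed next.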

However, this common QA practice does not leverage all the data available, and because evaluations are performed only once in a while, potential QA issues cannot be detected in real time – for example, when a moderator drifts, even temporarily. An important challenge in quality and consistency evaluation is the ability to use many, if not all, past decisions from all moderators, so as not to be limited by a small number of weekly test instances. Crucially, this could eliminate the need for additional evaluation processes entirely, while improving the reliability of the evaluation and enabling continuous monitoring.

Managing/Improving QA

In our study, we discuss approaches for managing content moderation quality in real time, without the need for regular (and costly!) tests or for multiple moderators to handle the same cases. We develop a new method for comparing content moderators’ performance even when there is no overlap in the content they manage (i.e., each instance is handled by a single moderator), using the data of the moderators’ previous decisions. To this end, we also discuss how to adapt crowd labelling algorithms for performing QA in content moderation – an approach that we believe is promising to explore further.
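To give a flavour of the crowd labelling algorithms we build on (this is the classic setting where items do have overlapping labels, not our single-label adaptation), here is a minimal "one-coin" Dawid–Skene-style EM sketch: it jointly estimates each moderator’s accuracy and each item’s most likely label from binary remove/keep votes, without any gold labels. All names and the binary-label simplification are assumptions for illustration.

```python
import math

def one_coin_em(votes, n_iter=20):
    """Jointly estimate per-moderator accuracy and per-item label posteriors.

    votes: {item_id: {moderator_id: 0 or 1}} (binary keep/remove decisions).
    Returns (posteriors {item: P(label == 1)}, accuracies {moderator: p}).
    """
    # Initialise item posteriors with the soft majority vote.
    post = {i: sum(v.values()) / len(v) for i, v in votes.items()}
    mods = {m for v in votes.values() for m in v}
    for _ in range(n_iter):
        # M step: a moderator's accuracy is their expected agreement
        # with the current soft labels (clipped to keep logs finite).
        num = dict.fromkeys(mods, 0.0)
        den = dict.fromkeys(mods, 0.0)
        for i, v in votes.items():
            for m, lab in v.items():
                num[m] += post[i] if lab == 1 else 1 - post[i]
                den[m] += 1
        acc = {m: min(max(num[m] / den[m], 1e-6), 1 - 1e-6) for m in mods}
        # E step: re-weight each vote by the moderator's estimated accuracy.
        for i, v in votes.items():
            l1 = sum(math.log(acc[m] if lab == 1 else 1 - acc[m]) for m, lab in v.items())
            l0 = sum(math.log(acc[m] if lab == 0 else 1 - acc[m]) for m, lab in v.items())
            post[i] = 1 / (1 + math.exp(l0 - l1))
    return post, acc
```

On data where two moderators mostly agree and a third often dissents, the EM loop assigns the dissenter a lower estimated accuracy and down-weights their votes accordingly – the key idea our single-label setting has to reproduce without overlapping votes.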

In one of our experiments, we study how accurately different QA methods, some of them based on crowd labelling algorithms (see the report for a description of these methods), can identify the ranking of moderators by their accuracy/performance (the y-axis) as moderators label (e.g., remove or not) increasingly more content (the x-axis).
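One simple way to score such ranking recovery – an illustrative choice, not necessarily the exact metric used in the report – is the fraction of moderator pairs that the estimated scores order the same way as the true scores (a pairwise agreement closely related to Kendall’s tau):

```python
from itertools import combinations

def ranking_agreement(true_scores, est_scores):
    """Fraction of moderator pairs ordered identically by the true and
    estimated scores (1.0 = the estimated ranking fully recovers the true one).

    true_scores / est_scores: {moderator_id: score} over the same moderators.
    """
    pairs = list(combinations(true_scores, 2))
    agree = sum(
        (true_scores[a] - true_scores[b]) * (est_scores[a] - est_scores[b]) > 0
        for a, b in pairs
    )
    return agree / len(pairs)
```

Plotted against the number of items labelled so far, this kind of score rises as each QA method accumulates evidence about the moderators.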

To find out more about building an accurate and efficient content moderation system, contact us at info@tremau.com.

To download Improving Quality and Consistency in Single Label Content Moderation, please fill out the form below.

Tremau Policy Research Team
