Over the last few years, the EU has introduced several regulations governing online services, such as the Digital Services Act (DSA) and the Terrorist Content Online (TCO) Regulation. Find out how your business will be affected in our Resource Centre.
Whether and when the Digital Services Act applies to you depends on the type of online service you provide. For most online intermediaries, it will start applying on 17 February 2024. However, for a select group of services designated as Very Large Online Platforms, the DSA could apply as early as June 2023.
Content moderation tools are software applications that help content moderators perform their duties. Different types of tools are available on the market, such as AI-based detection tools and workflow tools that optimise and automate processes and facilitate communication between stakeholders. Many established companies build their own workflow tools, tailored to their own processes, in the absence of suitable commercially available options.
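To make the distinction between detection and workflow tooling concrete, here is a minimal sketch in Python of how a workflow tool might combine an AI-based detection score with routing to human review. The names (ModerationItem, route) and the thresholds are illustrative assumptions, not any particular product's API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class Decision(Enum):
    APPROVE = "approve"
    REMOVE = "remove"


@dataclass
class ModerationItem:
    """A single piece of user content queued for review."""
    content_id: str
    text: str
    ai_score: float  # risk score from an AI detection step, 0.0 (benign) to 1.0 (violating)
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    decision: Optional[Decision] = None


def route(item: ModerationItem,
          auto_remove_threshold: float = 0.95,
          review_threshold: float = 0.5) -> str:
    """Route an item based on its AI risk score.

    High-confidence violations are actioned automatically, borderline items
    are sent to a human review queue, and low-risk items are published.
    """
    if item.ai_score >= auto_remove_threshold:
        item.decision = Decision.REMOVE
        return "auto_removed"
    if item.ai_score >= review_threshold:
        return "human_review_queue"
    item.decision = Decision.APPROVE
    return "published"
```

In practice, the thresholds and routing rules would be tuned per policy area and integrated with case management, appeals and reporting, which is where most of the workflow-tool complexity lies.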
Safety by design is an approach to product development that treats user safety as a foundational requirement. Safety is integrated into the design and development phase to pre-empt abuse of the product later on, reducing the need for reactive remedies.
Quality Assurance (QA) in content moderation is becoming increasingly important as regulation of content moderation, such as the DSA, becomes more prevalent. Beyond being a best practice, QA is needed to demonstrate to authorities that a company is applying its policies, controlling its processes and, where appropriate, executing improvement mechanisms.
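As an illustration of what such a QA mechanism can look like in practice, the sketch below samples a share of moderation decisions for a second review and computes the agreement rate between moderators and QA reviewers. The class and function names and the 5% sample rate are hypothetical choices, not a prescribed method.

```python
import random
from dataclasses import dataclass


@dataclass
class ReviewedDecision:
    content_id: str
    moderator_decision: str  # e.g. "remove" or "approve"
    qa_decision: str         # decision of the QA reviewer on the same item


def sample_for_qa(decisions: list, sample_rate: float = 0.05, seed: int = 42) -> list:
    """Draw a random sample of moderation decisions for second review."""
    if not decisions:
        return []
    rng = random.Random(seed)
    k = max(1, int(len(decisions) * sample_rate))
    return rng.sample(decisions, k)


def agreement_rate(reviewed: list) -> float:
    """Share of sampled decisions where the QA reviewer agreed with the moderator."""
    if not reviewed:
        return 0.0
    agreed = sum(1 for r in reviewed if r.moderator_decision == r.qa_decision)
    return agreed / len(reviewed)
```

Tracking this agreement rate over time, broken down by policy area or moderator team, is one way to evidence that policies are being applied consistently and that improvement mechanisms are triggered when quality drops.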