What Does the DSA Mean for Your Business?

On November 16th 2022, the Digital Services Act (DSA; Regulation 2022/2065) entered into force. This regulation aims to create a safer digital environment where users, businesses, governments, and civil society can flourish. The DSA will apply to VLOPs and VLOSEs by August 2023, and to all online intermediary services operating in Europe by February 2024.

Acknowledging the immense benefits the Internet has brought to humanity, the DSA builds on the central principles that what is illegal offline should be illegal online, and that more should be done to tackle the spread of illegal and harmful content online, such as terrorist content and child sexual abuse material.

Who does the DSA apply to? 

The DSA applies proportional rules for online intermediary services, depending on the size and type of services an organization offers. The DSA defines four categories of services:

  1. Providers of intermediary services – Examples include internet service providers, content distribution networks, DNS services, VOIP services, web-based messaging and e-mail services, etc.
  2. Providers of hosting services – Examples include webhosting or cloud services.
  3. Online platforms – Examples include social networks, online marketplaces, app stores, online travel and accommodation websites, content sharing websites, etc. 
  4. Very large online platforms (VLOPs) and search engines (VLOSEs) – Any online platform or search engine that has more than 45 million average monthly users in the EU. 

Obligations under the DSA

Micro and small enterprises* are exempt from the obligations that apply to online platforms and VLOPs. However, they are still required to communicate the average number of monthly active users of their service to the competent authorities.

Enforcement and Implementation

Under the DSA, each EU Member State will designate a Digital Services Coordinator (DSC) to supervise the providers of online services and enforce the regulation. The DSC will have the power to carry out inspections, penalize infringements, impose fines or periodic penalty payments, and request the temporary restriction of the service in case of a continued or serious offense. Finally, the Commission will be the exclusive enforcer of the obligations that apply specifically to VLOPs and VLOSEs, and has the power to intervene upon a DSC's request.

Failure to comply with these obligations can result in fines of up to 6% of the provider's worldwide annual turnover in the preceding fiscal year. Furthermore, supplying incomplete or incorrect information in the context of an inspection can result in fines of up to 1% of the provider's annual income or worldwide turnover.

Key dates for the DSA

Implications for your business

The DSA imposes a number of new obligations on online service providers and introduces hefty fines to ensure compliance. To avoid these, providers of online services must implement a number of operational changes. Most immediately, providers need to designate a single point of contact and a legal representative in the EU, who can be held liable for offences. Hosting services, online platforms, and VLOPs also need to ensure they have well-designed and easy-to-use notice-and-action and complaint-handling mechanisms in place, and that they implement the appropriate tools to process requests from law enforcement, trusted flaggers, and out-of-court dispute settlement bodies.

Additionally, online marketplaces will be subject to a specific set of rules that will affect the way they design their platforms, display advertisements, and deal with traders and consumers. The protection of minors is also central to the regulation, and providers will have to implement child protection measures such as age verification and carry out related risk assessments.

Concretely, providers of online services will have to adopt a streamlined set of processes that allow for continuous compliance, notably with obligations such as transparency reporting and independent audits.

How can Tremau help you?

Tremau's solution provides a single trust & safety content moderation platform that delivers compliance as a service and integrates workflow automation and other AI tools. The platform ensures that providers of online services can meet all DSA requirements while improving their key trust & safety performance metrics, protecting their brands, increasing handling capacity, and reducing their administrative and reporting burden.

Further resources

To learn more about how the DSA impacts your business, check out a few of our resources on the topic below:

* Micro and small enterprises are those with a staff headcount of fewer than 50 and an annual turnover of less than €10 million. They are exempt from certain obligations falling upon online platforms.

Tremau Policy Research Team

Utopia: DSA scope in focus: I have a comments section, does the DSA apply to me?

Does having a comments section qualify my services as a hosting service?

In short: yes. Having a comments section on your platform qualifies your services, at a minimum, as a hosting service. More precisely, a hosting service within the meaning of the DSA is a service that stores user-generated content (UGC). Because the definition is rather simple, it encompasses a whole host of services that differ widely in type, size, and business model. The category includes web hosting companies like AWS or GoDaddy, file storage and sharing services like Dropbox, marketplaces like Amazon and Alibaba, as well as social media services like Facebook. Because there are no exemptions based on the size or type of the service, hosting services also include some services that may come as a surprise, such as the comments sections of online newspapers and blogs, and user reviews on e-commerce websites.

Does a comments section also qualify my services as an “online platform”?

Short answer: It depends.

Read the full version of the blog on Utopia.

Marketplaces & the DSA

While the Digital Services Act (DSA) never uses the term 'online marketplace', do not let the terminology fool you. The DSA includes significant obligations that apply to all online services that allow consumers to connect with traders. Read on to find out what this entails and how to prepare.

The importance of effective trust & safety policies on marketplaces

Online marketplaces are digital platforms that allow buyers and sellers to transact goods or services with each other, often acting as intermediaries in the transaction. This includes businesses of all shapes and sizes, such as Amazon, Shopify, and eBay Kleinanzeigen.

The EU, with its 450 million citizens, represents a sizeable market: Germany and France alone account for around 68.4 million and 51.5 million e-commerce customers, respectively. Most online shoppers look for previous customers' reviews, and 97% of those who read product reviews also check for responses from the seller. Building trust has thus become an essential part of retaining customers. After all, several studies have established that customer retention should be a top priority for businesses, even more so than customer acquisition. According to one survey, existing customers are 50% more inclined to try new products from the same platform.

The DSA sets out numerous substantive obligations regulating the relationship between an online marketplace and its users. If implemented well, these can go beyond strict compliance with the law and further keep users engaged with your services.

Enter the Digital Services Act: What will you need to do to ensure compliance?

The DSA seeks to promote a fair, transparent, and competitive ecosystem of digital platforms. It therefore retains the important principle laid down more than two decades ago in the e-Commerce Directive: a platform is not liable for the content of its users unless it has actual knowledge of it. This remains the case for marketplaces: they can benefit from this liability exemption as long as they act swiftly to review and remove content and products that are notified and assessed as illegal.

Overall, the DSA requires marketplaces to comply with the same obligations as any other hosting service, online platform, or social media service. This means that marketplaces must have effective notice-and-action mechanisms allowing users to report illegal products and pieces of content. Further, marketplaces, like any other hosting service, need to ensure that whenever a decision is taken to remove a listing, they inform both the user that reported the product and the seller of the product.

The DSA also requires significant tracking of all content moderation-related data points: all user notices, together with any other moderation carried out under the marketplace's terms of service, will need to appear in a transparency report. Each instance should be quantified and categorised by type of illegal content and source of detection.
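To make that bookkeeping concrete, here is a minimal sketch (in Python, with hypothetical field names; the DSA does not prescribe any particular schema) of how a marketplace might tag each moderation event with an illegal-content category and a detection source, then roll the log up into the per-category counts a transparency report calls for:

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class ModerationRecord:
    """One moderation action or notice, as a marketplace might log it (illustrative fields only)."""
    item_id: str
    illegal_content_type: str   # e.g. "counterfeit", "unsafe_product", "hate_speech"
    detection_source: str       # e.g. "user_notice", "trusted_flagger", "automated_tool", "own_initiative"
    action_taken: str           # e.g. "removal", "demotion", "no_action"


def transparency_report_counts(records: list[ModerationRecord]) -> dict:
    """Quantify moderation activity by illegal-content category and source of detection."""
    by_type = Counter(r.illegal_content_type for r in records)
    by_source = Counter(r.detection_source for r in records)
    by_type_and_source = Counter((r.illegal_content_type, r.detection_source) for r in records)
    return {
        "total_actions": len(records),
        "by_illegal_content_type": dict(by_type),
        "by_detection_source": dict(by_source),
        "by_type_and_source": {f"{t}/{s}": n for (t, s), n in by_type_and_source.items()},
    }


if __name__ == "__main__":
    sample = [
        ModerationRecord("listing-1", "unsafe_product", "user_notice", "removal"),
        ModerationRecord("listing-2", "counterfeit", "trusted_flagger", "removal"),
        ModerationRecord("listing-3", "unsafe_product", "automated_tool", "demotion"),
    ]
    print(transparency_report_counts(sample))
```

The point of the sketch is simply that categorisation has to happen at the moment the notice or action is logged; reconstructing it after the fact for a reporting period is far harder.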

In addition to all the general rules applicable to online platforms, marketplaces have a few additional obligations. The so-called “know your business customer” obligation ensures that platforms know the traders that offer products and services by collecting and storing information which can be provided to authorities. Further, online platforms that act as marketplaces must also ensure that their interfaces are designed in a way that allows sellers to comply with product safety laws and they are obliged to conduct random checks against official databases. 

Perhaps most importantly from the perspective of users and their safety, marketplaces that become aware of illegal products or services being sold must inform the consumers about the seller and about what measures the consumer has available to seek redress. Some studies suggest that informing users of potential risks associated with products can positively impact user retention. A 2022 report found that the majority of shoppers (72%) believe that transparency, which involves providing detailed information about a product such as its ingredients and manufacturing process, is important. Additionally, almost two-thirds of shoppers (64%) stated that they would switch to a different brand that offers more comprehensive product information.

Where a marketplace has 45 million or more average monthly active users in the EU, it will be designated by the Commission as a very large online platform (VLOP). This means it will additionally have to comply with the extensive set of risk assessment and mitigation obligations for VLOPs, as well as undergo a yearly audit of its services and of the measures it takes to ensure user safety.

Rules for marketplaces are changing fast 

The ink on the DSA has barely dried, yet new rules are already emerging that specify how certain obligations are to be applied in the context of potentially unsafe products. For example, the EU's General Product Safety Regulation (GPSR) imposes a three-day deadline on online marketplaces for processing user notices related to hazardous products. It further specifies that random checks have to be performed against the Safety Gate.

What can Tremau do for you?

If you are a marketplace, you are likely already working hard to ensure buyers and sellers can interact effectively on your platform. However, the DSA obligations are extensive, and their operational impact and implementation require careful assessment. Don't know where to start, or lacking the internal resources to go it alone? Tremau can help – check out our advisory services to learn more or reach out to our policy experts.

Feel like you already have a grasp of the DSA obligations but are still looking for help to comply with the law? We have software that can help you increase handling capacity and improve your effectiveness on key trust and safety metrics. Reach out here for a demo of our content moderation software, which will allow you to comply with the law while tracking all the metrics relevant to your business.

The DSA is coming: How will it affect your business in the EU?

What is the DSA?

On November 16th 2022, the Digital Services Act (DSA; Regulation 2022/2065) entered into force. The DSA is a landmark law setting out a set of common rules for digital services across the EU. With the new law in force, companies of all shapes and sizes will be obliged to put in place, by 2024, processes and procedures around content moderation to address illegal content and protect user rights.

Who is affected?

The DSA will apply to all intermediary services operating in the European Union, in other words every company hosting or facilitating the transmission of user generated content (UGC) in the EU. This includes, for example, internet service providers, content distribution networks, DNS services, VOIP services, web-based messaging, but also hosting services such as web hosting or cloud storage, as well as online platforms such as social networks, online marketplaces, app stores, online travel and accommodation websites, online gaming and content sharing websites, etc.

The DSA has a global impact as it applies to all companies offering their services in the EU. As such, companies do not need to have any physical presence in the Union – if they have any EU users or they market their services in Europe, they need to comply.

Read the full version of this article on Utopia.

How will the DSA impact online platforms’ policies on minors?

A 2022 Pew Research survey found that 95% of teenagers (aged 13-17) use YouTube and 67% use TikTok, with nearly one in three reporting near-constant use. Screen time has also increased in recent years and now hovers around five and a half hours on average.

With a greater number of underage users and increasing opportunities to create and share content, comes a greater risk of exposure to illegal and harmful content online. The EU’s landmark legislation, the Digital Services Act (DSA), responds to these challenges around child protection and sets out a number of obligations which aim to keep children safe online. 

How will the DSA change platforms’ trust and safety policies related to minors? 

The obligations addressing child protection in the DSA are spread throughout the text. At the most basic level, any service provider whose service is directed at or used by minors has to make its terms of service understandable to them. The most impacted, however, are likely to be online platforms: social media, video-sharing services, and many online gaming platforms, for example, need to take measures to ensure a high level of privacy, safety, and security for minors when designing their platforms.

The broad nature of the new obligation is challenging, as it gives little detail on which measures will achieve compliance and which fall short. Diving into the DSA, there are hints of what compliance could mean: for example, services should ensure that minors can easily access the mechanisms referenced in the DSA, such as notice-and-action and complaint mechanisms. They should also take measures to protect minors from content that may impair their physical, mental or moral development, and provide tools that enable conditional access to such information.

Will there be guidance on compliant content moderation practices?

There is no obligation on the Commission to publish guidance on how platforms should safeguard their younger user base before the overall compliance deadline in February 2024. However, we can expect some co-regulatory measures to be in development as part of the Better Internet for Kids+ strategy. In the meantime, companies must seek out and apply existing best practices and develop their own measures in order to comply.

Future best practices on keeping children safe online will likely be developed in the risk assessment cycles of very large online platforms as well. Platforms with more than 45 million monthly active users will have to assess systemic risks related to minors and children such as risks of them being exposed to content which may harm their physical or mental health, or promote addictive behavior.

How can Tremau help you?

If you are an online platform, you are likely already working hard to ensure children are protected on your platform. However, whether your existing measures are enough to comply with the new obligations in the DSA needs careful assessment and benchmarking against best practices. 

Tech Policy Press: Knocking at the Door of Transparency: The Digital Services Act and Infrastructure Providers

Given the increasing focus on trust and safety and the responsibilities of actors across the Internet ecosystem, regulation has gradually shifted its focus toward transparency requirements. What are the processes that must be in place to deal with illegal content while protecting fundamental rights and due process? The Digital Services Act ("DSA") is quite clear: if a company's services are to be considered safe and trusted, transparency is non-negotiable.

If there is one place in the Internet where transparency can provide some much-needed insight regarding content moderation, that would be its infrastructure. The infrastructure of the Internet is a space consisting of various actors who provide everyday services that allow users to have a seamless, reliable, and secure Internet experience; however, it generally attracts little attention because it is obscure and, predominantly, technical. Actors on this level consist of conduit, caching, and hosting services, seen in companies such as Cloudflare, Amazon Web Services, and Google Play Store, to name a few. Their operations are crucial, yet they often seem distanced from the public discourse; they are often considered inaccessible and, occasionally, unaccountable to everyday users.  

The question, therefore, is whether the DSA could help shed some light on the practices of these otherwise invisible actors. Does the DSA manage to create a consistent and predictable environment for infrastructure providers that could help alleviate some of the opacity of their content moderation practices?

Read the full version of this article on the Tech Policy Press.

By: Konstantinos Komaitis, Louis-Victor de Franssu, Agne Kaarlep

Very Large Online Platforms: So, you have 45 million users – what now?

February 17th marked an important milestone for the EU’s landmark Digital Services Act (DSA) as all online platforms operating in Europe were required to publish their average monthly active users (MAU). Online platforms and search engines whose MAU surpass 45 million users in the EU will now be designated by the Commission as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). 

But what does this mean concretely – and what’s next for these companies? 

Four months until D(SA)-Day

So far, about 20 companies have publicly indicated having more than 45 million MAU in the EU. A number of other platforms have also declared that they do not, at this stage, meet the fateful threshold; this could, however, change for those witnessing significant growth in users over the next year or beyond, pushing them into the VLOP category.

Importantly, regardless of how a platform classifies itself, it is the Commission that has the final say on whether it is to be considered a VLOP. A platform could therefore still be designated as a VLOP in the coming months even if it does not meet the threshold today. For example, the Commission may request more information, and it may also designate a service based on other sources if it has credible data that the service meets the threshold.

Following designation, which will be officially communicated to the platforms by the Commission, the concerned services will have four months to complete their first risk assessment exercise. Given that the Commission is likely to move fast with designations, the first risk assessments will need to be completed as early as July 2023.

July 2023: The summer of risk assessments

As mentioned, the first DSA obligation VLOPs and VLOSEs will need to comply with is undertaking risk assessments (Article 34). Concretely, all services designated by the Commission are required to assess annually the systemic risks stemming from the design, functioning or use of their platforms, including any actual or foreseeable impact on:

  1. the spread of illegal content;
  2. the exercise of fundamental rights;
  3. democratic processes and civic discourse; and
  4. concerns related to gender-based violence, public health, minors, and serious negative consequences to people’s physical and mental well-being.

Additionally, for all of the above-mentioned risks, platforms need to assess if these are influenced by any intentional manipulations of their service. 

This obligation will present a significant challenge given the novelty of the exercise and the substantial range of risks covered. Some of the risks will be easier to assess, for example, the dissemination of illegal content, where definitions are more readily available – yet still a complex exercise given the differences across jurisdictions. Others, such as the negative effects on fundamental rights, will be far more complicated, given their broad scope (for example risks related to freedom of speech, non-discrimination, or children’s rights). The most challenging category is likely to be the assessment of risks where the effects are evident outside the platform including, for example, impacts on democratic processes or on psychological well-being. 

In short, VLOPs and VLOSEs will need to consider risks observed on the platform, such as the spread of terrorist content or hate speech, as well as risks where the impact is seen outside the platform, such as concerns related to the psychological well-being of users. In practice, this will likely also mean constructing multiple risk scenarios to understand the effects of the interaction between platforms and users.

As for assessing how potential intentional manipulations add to the risks mentioned, the Code of Practice on Disinformation gives some good indication as to what would be expected from VLOPs and VLOSEs while undertaking their DSA risk assessment cycle. 

What happens next: Transparency, audits & access to data 

The risk assessment obligation is only the first step. Once a platform has identified and assessed its systemic risks, it will be required to draw up and implement detailed mitigation plans. In addition to this annual self-assessment obligation (risk assessment results and adjustment of mitigation measures), VLOPs will be required to undergo yearly independent audits of their risk assessments and of the mitigation measures taken. Where the audit report finds deficiencies, the VLOP (or VLOSE) will have only a month to set out a plan for addressing the gaps identified. Once the audit is completed, platforms will be required to make the audit results public.

The verification mechanisms do not stop here – data access provisions in the DSA mean that VLOPs and VLOSEs need to, under specific conditions, provide access to data to regulators as well as third-party researchers, allowing research into the systemic risks impacting the EU. As such, the risk assessments conducted by the platforms internally as well as the mitigating measures drawn are likely to come under significant scrutiny, not only by auditors but also by researchers who may conduct independent risk assessments based on the data they receive access to. 

Below 45 million users but rapidly growing user numbers?

If a platform is on the cusp of reaching 45 million MAU in Europe but is not quite there yet, this is the time to be testing and preparing in-house, as well as watching what the existing VLOPs do to comply, since best practices are likely to emerge. The flexibility of the risk assessment framework and its applicability to a wide variety of services mean that the definition of successful compliance will also evolve. The current VLOPs are likely to set the stage and can be a helpful benchmark for those reaching the threshold later.

In the meantime, such platforms need to keep in mind that the MAU numbers reporting requirement was not a one-off obligation – it is now a recurring duty. The Commission and national Digital Services Coordinators also reserve the right to ask for updated MAU numbers or for explanations about the underlying calculations at any time. 

How can Tremau help?

If you are a very large online platform, or are likely to become one in the near future, you are probably already charting the relevant risk areas where an assessment is required under the DSA. However, risk assessments and the necessary mitigation measures can be a lot for internal teams to deploy alone.

Tremau’s expert advisory team can help you carry out these risk assessments as well as provide support to your internal teams. Further, at Tremau we specialize in assessing your existing processes and practices to flag where the current mitigation measures fall short of the DSA requirements and best practices in order to offer best-in-class remediation plans and long-term compliance.

Feel like you need support now? Tremau can help – check out our advisory services to learn more.

DSA & You: How do you calculate average monthly active users?

The first deadline of the Digital Services Act (DSA) is approaching fast: all online platforms and search engines have only one more month to publish their monthly active user (MAU) numbers.

Getting this number right and keeping it updated is key: the MAU figure will be used by the Commission to determine which services qualify as very large online platforms (VLOPs) or very large online search engines (VLOSEs). If this number surpasses the threshold of 45 million average MAU, your service will be designated as a VLOP or VLOSE, which means the DSA's substantial obligations will apply to you as early as July 2023.

Will you have to publish your monthly active users and how?

If you are an online platform and you are offering your services in the EU, you will have to publish your monthly active user count on a publicly available section of your website. 

An online platform is a hosting service that makes information available to the public. Offering services in the EU means that your service has users in one or more EU countries, that you present your service in a language or currency generally used in the EU, or even simply that your service is available in the app stores of EU countries.

What is a ‘recipient of a service’?

According to the Regulation, ‘recipients of the service’ includes business users, consumers, and all other users. This means that advertisers and traders (anyone selling their product/services online) will also be considered recipients, along with regular users. Crucially, a user does not have to be registered on an online service to count as a ‘recipient’.

Who is an active recipient?

An active recipient is someone who has engaged with an online service. ‘Engaging’ can mean asking a service to host information or viewing information on the online interface of the service. This can be as simple as uploading a photo or seeing a post that someone shared on social media. 

How will active recipients be calculated?

The average MAU will be calculated as an average over a period of six months. This figure should represent the recipients engaging with the platform at least once in a given month: the recipient has interacted with content, provided content, asked to store content, viewed content, or searched for content (in the case of a search engine). They do not have to be a registered user of the platform; as long as they are looking at content on its online interface, they will be counted as an active recipient. It is worth noting that these obligations do not impose additional tracking of individuals online.
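As a rough illustration of the arithmetic only (a sketch using assumed identifiers and a simplified engagement log, not an official methodology), counting unique recipients per calendar month and averaging over the six-month window could look like this:

```python
from datetime import date

# Hypothetical engagement log: (recipient_id, date_of_engagement).
# A recipient_id could be a logged-in account or a pseudonymous identifier for a
# non-registered visitor; the DSA does not require extra tracking to produce this figure.
engagements = [
    ("user-a", date(2023, 1, 3)),
    ("user-a", date(2023, 1, 20)),   # same recipient twice in January: counted once
    ("user-b", date(2023, 1, 5)),
    ("user-a", date(2023, 2, 14)),
    ("user-c", date(2023, 3, 9)),
]


def average_monthly_active_recipients(engagements, months):
    """Average of unique active recipients per calendar month over the reporting window."""
    unique_per_month = {m: set() for m in months}
    for recipient_id, day in engagements:
        key = (day.year, day.month)
        if key in unique_per_month:
            unique_per_month[key].add(recipient_id)
    return sum(len(s) for s in unique_per_month.values()) / len(months)


six_months = [(2023, m) for m in range(1, 7)]   # e.g. a January-June reporting window
print(average_monthly_active_recipients(engagements, six_months))
```

The averaging itself is straightforward; in practice the hard part is deduplicating non-registered visitors across the window without introducing the kind of additional tracking the DSA explicitly does not require.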

Examples for different types of online platforms can include:

What can be excluded from the calculation?

The DSA clarifies that only unique recipients of the service should be counted. This leads to three important exemptions to keep in mind when counting your MAU:

Will the Commission publish a methodology for counting monthly active users? Can I wait until they have?

No, the Commission has not indicated that it will publish any additional guidance on counting monthly active users ahead of the 17 February 2023 deadline. The calculations will therefore need to be made based on the information above.

How can Tremau help?

If you are an online platform, reach out if you need help calculating your monthly active users ahead of the 17th of February. With only 13 months left to comply, contact us at info@tremau.com to explore our full list of advisory services and ensure that you will be ready for all the obligations kicking in in early 2024.

Tech Policy Press: Can Mastodon Survive Europe’s Digital Services Act?

It has been around two weeks since Elon Musk, the world's richest man, acquired Twitter, and already fears about what this means for free speech on the microblogging platform have begun to proliferate. With Musk firing some of Twitter's key personnel, including legal chief Vijaya Gadde, and terminating contracts with outsourced content moderators, many users are looking for an alternative.

A substantial number are migrating to the ‘fediverse,’ and specifically to Mastodon, a similar microblogging platform that has been called “Twitter, with the underlying architecture of email”. Mastodon’s decentralization raises substantial questions about how existing regulatory regimes, such as Europe’s Digital Services Act (DSA), will apply. 

Read the full version of this article on the Tech Policy Press.

By: Konstantinos Komaitis, Louis-Victor de Franssu


An Overview of Transparency Reporting

The Internet has created enormous potential for free, democratic, and open exchanges. Yet, some have abused this incredible tool to propagate harmful and illegal content online. This has resulted in a growing interest in holding big tech platforms accountable for the way in which they moderate content on their services. This may include, but is not limited to, openly sharing information on data collection and access, as well as removal requests. Civil society has been key in putting pressure on big technology companies to be more transparent, which has resulted in the popularity of “transparency reports”. 

Transparency is critical for the well-functioning of democratic and safe exchanges, and a requirement for fair processes, so it is not surprising that this is also becoming a centerpiece of upcoming regulations for online platforms. 

In 2010, Google became the first internet company to publish a formal transparency report, a practice that became more widely adopted by 2013 amidst growing concerns, including about the risks of government surveillance. Since then, a number of principles and frameworks – ranging from civil society initiatives to government policies – have been adopted around transparency reporting. Today, most frameworks target transparency about the moderation of Terrorist and Violent Extremist Content (TVEC) or Child Sexual Abuse Material (CSAM) online; however, upcoming regulations, such as the European Digital Services Act, expand transparency reporting to cover the whole of a company's content moderation processes.

Overview of voluntary transparency reporting frameworks 

The following overview lists the main bodies and principles that guide transparency reporting today, with their scope and what each entails.

Global Internet Forum to Counter Terrorism (GIFCT), 2017 – Terrorist and extremist content on online platforms
• Requires its members to produce transparency reports and produces its own transparency reports.
• Through a multi-stakeholder approach, it defines the elements of meaningful transparency and holds its member tech companies accountable.

Tech Against Terrorism, 2017 – Guidelines on transparency reporting on online counterterrorism efforts, covering terrorist and extremist content on online platforms (targeted at governments & small online service providers)
• Asks governments to detail the processes and systems they use to discover, report, and store terrorist content and activity, and what redress mechanisms they provide.
• Provides transparency reporting guidelines for tech companies and advises on community guidelines enforcement and methods to increase transparency around content moderation processes.

Santa Clara Principles, 2018 – Targeted at online service providers that do content moderation
• Recommendations for steps that companies engaged in content moderation should take to provide meaningful due process to impacted stakeholders.
• Aims to better ensure that the enforcement of their content guidelines is fair, unbiased, proportional, and respectful of users' rights.
• Sets out foundational and operational principles as well as implementation mechanisms.

EU Code of Conduct on Countering Illegal Hate Speech Online, 2019 – Targeted at voluntary industry signatories
• The Code of Conduct was created in 2016 in cooperation with tech companies to respond to xenophobia and racism online.
• Signatories commit to providing transparency reports and ensuring removal requests for illegal content are dealt with in less than 24 hours.

Centre for Democracy and Technology, 2021 – A framework for policymakers
• Focuses on users' speech, access to information, and privacy from government surveillance.

OECD Voluntary Transparency Reporting Framework, 2022 – Terrorist and violent extremist content (TVEC) on platforms
• A response to the proliferation of different frameworks, definitions, and stakes recognized in other transparency reports.
• Sets a standard for baseline transparency on TVEC.
• Launched a portal for submitting and accessing standardized transparency reports from online services.

Tech Coalition (TRUST), 2022 – Targeted at voluntary industry signatories for CSAM
• TRUST is a voluntary industry framework for transparency reporting that focuses on child sexual exploitation and abuse online.
• It takes into account the variety of digital services in this environment as well as differences in company size and maturity.

EU Code of Practice on Disinformation, 2022 – Targeted at voluntary industry signatories
• Created in 2018 and updated in 2022, this Code addresses disinformation, specifically in the context of Covid-19 and the war in Ukraine.
• Requests platforms to provide monthly reports on their efforts to promote authoritative data, improve users' awareness, and limit disinformation and false advertising; it also sets up a Transparency Centre and a Task Force to oversee the implementation of the Code and keep it future-proof.

Regulations on transparency reporting

Aside from frameworks from civil society groups and voluntary codes created in cooperation with industry, many governments have passed, or are in the process of passing, laws around online hate speech that encourage transparency reporting. As mentioned above, the DSA requires all online intermediaries to provide transparency reports, the details of which vary according to the type of service. The proposed Platform Accountability and Transparency Act in the US also aims to address this growing issue and introduce transparency legislation. Similarly, the proposed American Digital Services Oversight and Safety Act of 2022 sets out transparency reporting obligations for content moderation.

Implications for online service providers

With the increasing demand for accountability and transparency from both online platforms and governments, it is not surprising that numerous frameworks for transparency reporting have emerged. Despite the variations, at its core transparency reporting entails keeping a clear and consistent account of requests for the removal or restriction of content.

Conclusion

To ensure alignment with industry best practices and compliance with regulatory requirements for transparency, companies will need new processes and tools that are effective at handling and organizing large volumes of content moderation activity, and that remain continuously aligned with rapidly evolving expectations and requirements. Concretely, this means having the ability to track all actions taken on user content, all notices coming from every potential source, and, further, all complaints about any content moderation decisions taken by the online service. Streamlining and unifying these workflows will be crucial for all players to remain compliant and retain the trust of their users.
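By way of illustration (a minimal sketch with assumed field names, not a prescribed DSA data model), a unified case log that keeps notices, decisions, and complaints linked to the same piece of content is the kind of structure that makes this tracking, and the reporting built on top of it, tractable:

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class Notice:
    notice_id: str
    item_id: str
    source: str                 # "user", "trusted_flagger", "authority_order", "automated_tool"
    alleged_violation: str
    received_at: datetime


@dataclass
class Decision:
    decision_id: str
    item_id: str
    action: str                 # "removal", "demotion", "no_action", ...
    legal_or_tos_ground: str    # the ground cited in the statement of reasons
    decided_at: datetime
    triggering_notice_id: str | None = None


@dataclass
class Complaint:
    complaint_id: str
    decision_id: str
    outcome: str | None = None  # filled once the internal complaint is handled


@dataclass
class ModerationCaseLog:
    notices: list[Notice] = field(default_factory=list)
    decisions: list[Decision] = field(default_factory=list)
    complaints: list[Complaint] = field(default_factory=list)

    def history_for_item(self, item_id: str) -> dict:
        """Everything that happened to one piece of content, gathered in one place."""
        decisions = [d for d in self.decisions if d.item_id == item_id]
        decision_ids = {d.decision_id for d in decisions}
        return {
            "notices": [n for n in self.notices if n.item_id == item_id],
            "decisions": decisions,
            "complaints": [c for c in self.complaints if c.decision_id in decision_ids],
        }
```

Keeping the link from complaint to decision to originating notice in a single log is what later allows statements of reasons, appeal outcomes, and transparency report figures to be produced from one source of truth rather than stitched together across tools.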

To find out more, contact us at info@tremau.com.

Tremau Policy Research Team
