European Commission headquarters in Brussels, Belgium, July 2021. Shutterstock
Since the Digital Services Act (DSA) came into force for all online platforms across the European Union (EU) on February 17, I’ve been frequently refreshing the European Commission’s page to see the list of trusted flaggers.
According to Article 22 of the DSA, a trusted flagger is an entity recognised for its expertise in flagging illegal content and its independence from platforms, and “should be awarded by the Digital Services Coordinator of the Member State in which the applicant is based and recognised by all online platform providers within the scope of the DSA.”
In theory, trusted flaggers are a great idea: they provide a trusted channel through which online platforms can identify and act on harmful content. Leveraging domain-specific organisational expertise should yield higher-quality reports and better enforcement rates than the overwhelming volume of user reports, many of which cannot be acted on.
Indeed, the concept of trusted flaggers is not new, but until now they have usually been designated by the platforms themselves. For example, YouTube runs a Priority Flagger Program that is similar to the DSA’s concept of trusted flaggers. The main difference is that it is a voluntary programme under the platform’s terms: YouTube decides which entities to designate as trusted flaggers and can revoke that designation under certain conditions. Under the DSA, this authority is transferred to the national Digital Services Coordinators (DSCs). The DSCs grant trusted flagger status, and platforms no longer select flaggers on their own initiative.
A publication by Ireland’s media regulator, Coimisiún na Meán, lists 60 illegal content areas that are covered by the DSA, ranging from animal harm to hate speech to copyright infringement, as identified by a subgroup of Digital Services Coordinators. Given the 27 member states and the diversity of content areas, it seems reasonable to expect that there will be thousands of eligible organisations.
Nearly three months later, the list remains surprisingly short: to date, only the Finnish Anti-Piracy Centre has been officially recognised as a trusted flagger.
The expected influx of trusted flaggers has not yet materialized. Why?
Barriers to registration: Some Member States, such as France, Belgium and Germany, have yet to appoint a Digital Services Coordinator, creating a bottleneck: organisations that want to register simply have no way to do so. In other countries, the DSC is still developing its processes for registration and candidate assessment.
Resource constraints: NGOs and civil society organisations, including those that already have established channels with platforms for reporting illegal content, are working at full capacity to fulfil their core mandates. The additional responsibilities of a trusted flagger could stretch these organisations thin.
Incentives and disincentives: What would an organisation gain from becoming a trusted flagger? The role may not align closely with its core mission, and it carries downsides such as reputational risk (allegations of censorship or excessive flagging). The operational burden can also be a significant deterrent, especially if an application might be rejected.
Platform independence: A trusted flagger must maintain independence from online platforms. However, some NGOs and hotlines, such as INHOPE and its national members, receive funding from platforms such as Meta, TikTok, and Google. This raises the question of whether these organisations must forgo such funding to be considered independent; we recommend a legal assessment in such cases. Some argue that receiving funding from a platform does not automatically create dependency, but the situation still raises questions about how dependency is defined and whether NGOs are willing to decline platform funding if that is what independence requires.
Lack of awareness: Larger organisations seem to know about the trusted flagger scheme, but for mid-sized and smaller ones with less awareness of digital regulation, there has been little communication encouraging potential flaggers to register or informing them of the importance and benefits of doing so.
Regulatory support: Some EU countries plan to develop onboarding and training programmes for trusted flaggers, but broader support such as financial incentives, procedural assistance, and reduced reporting burdens appears to be lacking.
Technology and automation: Implementing technological solutions similar to NCMEC’s CyberTipline could streamline the reporting process and ease some of the current barriers to registration and flagging.
Trusted flaggers are a key component of the DSA’s strategy to make the digital space safer. They benefit regulators by making enforcement more efficient, help platforms focus on higher-quality input, and serve the public by speeding up the removal of harmful content.
The missing piece of the puzzle is a compelling set of incentives to encourage more organisations to take on this important role, such as capacity-building funding, financial incentives, and assistance with registration and reporting. Such assistance could be provided by organisations that act as intermediaries, or funnels, between trusted flaggers and the platforms. Jean-Christophe Le Toquin, former president of Point de Contact, a French online reporting platform, confirmed that this approach holds promise for hotlines and overburdened NGOs.
The European Commission, Member State regulators and even the platforms themselves have a shared interest in cultivating a strong pool of trusted flaggers. The potential exists. Regulators and platforms need to work closely with potential flaggers to harness it.