Since Meta announced that it would replace professional fact-checkers with Community Notes in the US, the move has been the subject of heated debate. Some fear that false information could spread more easily. Others see it as an opportunity for more user participation and transparency. But what are Community Notes? And how do they work? That is the topic of the first episode of our miniseries "Click by Click".
What began on X (formerly Twitter) is now coming to Meta. Users of Facebook, Instagram and Threads in the US can attach fact-checking notes to posts. The idea: content is checked not by central editorial teams, but by the community itself.
How do Community Notes work?
Community Notes are a system for collective fact-checking. Users flag posts they consider misleading and add sources - in other words, they do their own fact-checking. Other users then rate these notes. A note appears under a post only if people with different attitudes and perspectives rate it as helpful. An algorithm ensures that notes which receive approval across perspectives become particularly visible - a principle called "bridging". It is intended to make manipulation more difficult and to promote objectivity.
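To make the "bridging" idea concrete, here is a deliberately simplified sketch in Python. It is a toy illustration under our own assumptions, not X's actual algorithm: the production system infers viewpoints from rating patterns using matrix factorization, whereas this toy version assumes rater groups are already known and simply requires majority approval in every group.

```python
# Toy "bridging"-style scorer (hypothetical simplification).
# Assumption: each rating is tagged with a known viewpoint group,
# which the real system does NOT have -- it infers viewpoints instead.

def note_is_shown(ratings, min_ratio=0.6):
    """ratings: list of (viewpoint_group, helpful: bool) tuples.
    The note is shown only if EVERY viewpoint group rates it mostly
    helpful -- approval from one side alone is not enough."""
    by_group = {}
    for group, helpful in ratings:
        by_group.setdefault(group, []).append(helpful)
    return all(
        sum(votes) / len(votes) >= min_ratio
        for votes in by_group.values()
    )

# A note praised only by group "A" is suppressed:
one_sided = [("A", True), ("A", True), ("B", False), ("B", False)]
# A note rated helpful across both groups is shown:
bridging = [("A", True), ("A", True), ("B", True), ("B", False), ("B", True)]

print(note_is_shown(one_sided))  # False
print(note_is_shown(bridging))   # True
```

The design point the toy captures is the same as in the real system: raw vote counts are not enough, because an organized one-sided group can produce many "helpful" votes; agreement *across* differing perspectives is what earns visibility.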
Who can create Community Notes?
Only those who meet certain criteria may participate. On X, you must apply, have been registered for at least six months, and have provided a telephone number. In addition, you must not have violated the terms of use in the last six months. Accepted users first rate notes written by others before they are allowed to write their own. Meta is currently testing a similar process in the US.
What applies to Europe?
Outside the US, Meta is continuing its existing fact-checking program - in Europe it will remain in place because of the Digital Services Act. This means that false information will continue to be evaluated and contextualized by external organizations such as AFP, Reuters or CORRECTIV. These organizations follow the Code of the International Fact-Checking Network (IFCN), which requires independence, transparency, and an open verification process. Important: fact-checkers cannot delete posts - they can only add notes. The platform decides what users see in the feed.
Community Notes vs Fact Checks: A Comparison
The two models take different approaches in the fight against disinformation - and studies have shown both to be effective.
Here are the most important differences at a glance:
Community Notes
- User-centric: The community creates and rates notes.
- Diverse sources: Users decide for themselves which evidence they use.
- Different perspectives: Notes appear only when people with differing views agree.
- Fast & scalable: Many people can rate content at the same time.
- Transparency: Users can view the process directly and participate in it.
- Risk of manipulation: Higher, as organized groups can influence the system.
Fact checks
- Expert supported: Professional fact-checkers analyze content.
- Source-focused: Reviewers rely on established scientific or journalistic sources.
- Centralized: Small teams evaluate content independently.
- Thorough & slow: Limited capacities of fact-checking organizations and thorough research require more time.
- Transparency: The fact-checkers' review process is harder for outsiders to follow.
- Risk of manipulation: Lower, as experts are less susceptible to direct manipulation.
René and Karsten both work in community management at Deutsche Telekom and spend a lot of time on the various platforms every day. We asked them for their expert assessment: "A lot is shared on the internet - but not everything is true. So it's all the better that there are various ways to put information into context. In community management especially, we scroll through, read and comment on hundreds of posts every day. Community Notes are helpful here: fast, direct, and visible to everyone. Fact checks go deeper, take their time and provide reliable assessments. Both approaches have their strengths - which one you trust more is ultimately a matter of taste. What is clear, however, is that dealing with misinformation concerns us all. Both as a community and as a company, we bear responsibility - and we stay in dialogue to promote digital literacy."
Preview: In the second part of our series "Click by Click", we will look at how algorithms shape our reality on the Internet and what is behind it.
Against hate online: For respectful and democratic coexistence
Since 2020, Telekom has been committed to a digital world in which everyone can live together according to democratic principles. The company stands for diversity and participation and takes a resolute stand against opinion manipulation, exclusion and hate on the internet. This commitment is part of Deutsche Telekom's social responsibility. Together with strong partners, Deutsche Telekom empowers society and raises awareness of respectful interaction in the digital world. The company also promotes digital skills with numerous initiatives and offerings, such as Teachtoday.
All information on Telekom's social commitment can be found at
https://www.teachtoday.de/en/