RSF also submitted its own recommendations to the competition authority.
Reporters Without Borders (RSF), an international human rights organization, has come out in support of South Africa's competition authority, which in February published a preliminary report with the first results of the Media and Digital Platforms Market Inquiry (MDPMI).
In the report, the South African Competition Commission accused Google and Meta* of anti-competitive practices in the media market and proposed radical measures to protect the public's right to accurate information. A number of the Commission's initiatives are in line with the recommendations of RSF, which has long called for global regulation of digital platforms and generative AI.
The Commission’s preliminary report, released in late February, offers a damning assessment of how dominant digital platforms operate — highlighting Facebook’s* suppression of news content, Google’s near-total control of the search engine market, a lack of fair remuneration for content creators, and imbalanced negotiations between media organizations and AI developers. Crucially, the Commission acknowledged that journalism produces “positive externalities” for society, and that platforms benefit commercially from the presence of trustworthy news content, which helps them capture and retain user attention.
This view stands in sharp contrast to Google’s position, which has claimed the commercial value of journalism is minimal within its ecosystem.
“The South African Competition Commission has delivered an inspiring preliminary report that clearly recognizes the value of journalism in the online ecosystem. It demonstrates a deep understanding of the challenges digital news outlets face in building viable business models amid the dominance of a few unregulated actors,” said Vincent Berthier, Head of the Technology and Journalism Desk at RSF. “Its conclusions pave the way for a democratic and ambitious regulatory framework for social media and generative AI, not only in South Africa but globally.”
While RSF has welcomed the Commission’s findings, the organization also warned that tech companies may attempt to water down the legislative process — as was seen during the negotiations surrounding the European AI Act’s Code of Practice.
RSF Recommendations to the South African Competition Commission:
1. Oblige platforms to increase the visibility of reliable information sources
Major digital platforms play a central role in access to information. They therefore carry a responsibility to the world’s democracies: to ensure the visibility of trustworthy information sources. Their commercial logic must not obstruct the dissemination of quality journalism — especially on matters of significant public interest.
2. Promote outlets certified by the JTI
Platforms should prioritise visibility for sources recognised as reliable based on transparent criteria, such as the requirements outlined by the Journalism Trust Initiative (JTI). This ISO-type certification developed by RSF identifies media outlets that uphold professional and ethical standards. Promoting such systems strengthens public trust and the quality of online information.
3. Redistribute advertising revenue
Dominant platforms must share part of their advertising revenue with media outlets that produce independent, public-interest journalism. This redistribution is essential to support a diverse media ecosystem and ensure citizens’ right to quality information.
4. Ban unilateral experiments to remove press content
Platforms must not be allowed to reduce the visibility of press content without consultation — for instance, by removing it from search results or news feeds — especially in response to proposed legislation. These practices — already carried out by Meta in Australia and Canada and by Google in nine countries, including seven in Europe — harm media diversity and the public’s right to reliable information. In France, RSF successfully joined the union for press magazines (SEPM) in filing a lawsuit to stop such an experiment.
5. Require explicit consent for the use of journalistic content in AI systems
News outlets must have the option to choose whether their content is used in databases that train AI systems. An opt-in model must replace the current default opt-out.
6. Require collective negotiation for fair revenue distribution
AI developers must be required to negotiate with media outlets as a collective to define fair compensation for the use of journalistic content.
7. Require that sources be referenced and linked to
Chatbots and AI tools should only access journalistic content if they clearly redirect users to the original source via a clickable link.
8. Require platforms to promote credible and diverse sources
AI systems must be designed to prioritise content from a plurality of professional, credible sources, reflecting the diversity of opinion in democratic societies.
9. Require platforms to undergo independent risk assessments
AI developers must conduct independent assessments of the social and democratic risks posed by their products, especially regarding the reliability and diversity of the information they distribute.
10. Encourage ethical AI use in newsrooms
Media outlets should adopt the ethical principles laid out by the Charter on AI and Journalism, created by UNESCO and RSF. AI tools are a welcome means of enhancing journalism, provided their use is governed by clear usage policies, public transparency, and human oversight at every stage of production.
*banned and designated as extremist in Russia
Source: RSF