“It seems like impunity is the rule of the game nowadays,” said Nobel Peace Prize laureate and journalist Maria Ressa at the opening of the Social Media Tribunal in Berlin, an initiative of the Court of the Citizens of the World (CCW).
Convened by the CCW in March 2025, the Tribunal is a non-binding ‘People’s Tribunal’ designed to spotlight human rights violations. It brought together judges, prosecutors, and defense attorneys from Germany, India, and the United States. Social media platforms, including but not limited to Facebook, X, TikTok, Snapchat, and Instagram, were “on trial” for their complicity in digital harms. While the court’s verdicts carry no legal authority, the Tribunal allowed victims, families of victims, experts, and even whistleblowers to testify, highlighting the unregulated nature of the digital sphere and its role in fostering human rights abuses and eroding democracy.
Twenty-four witnesses from countries including the United States, Brazil, the United Kingdom, Myanmar, and Romania provided testimonies that consistently pointed to the platforms’ awareness of harmful content and their failure to remove it, even when it violated their own policies. The Tribunal ultimately concluded that social media platforms prioritized revenue over adherence to their own policies and to international human rights standards. While stopping short of directly blaming the platforms, the court found that their persistent inaction has facilitated harmful and even criminal conduct. The verdict noted: “Large commercial enterprises dominate most platforms…these platforms often fail to serve the important interest of supporting freedom, peace, and human rights.” These failures put democracy at risk, endanger children and youth, deepen societal polarization, and, in cases like Myanmar, have contributed to the mass displacement of over 700,000 people.
BiH and the Digital Space
Although no cases from Bosnia and Herzegovina (BiH) were presented at the Tribunal, the harms and crimes discussed, such as hate speech, disinformation, threats to election transparency, and the endangerment of journalists, mirror the issues facing BiH and the Western Balkans more broadly. Lamija Kovačević of Mediacentar Sarajevo explained that the harmful content circulating online includes hate speech that is homophobic or misogynistic, or that glorifies war criminals. Disinformation, often spread by bots in comment sections, is also prevalent. All of these factors shape public discourse, deepen polarization, and limit access to credible information, as social media becomes an increasingly significant source of information for the public in the country.
Regarding elections and hate speech, she cited a 2024 case during local elections in Kalinovik, where a mayoral candidate was fined for spreading hate speech and glorifying war criminal Ratko Mladić. While legal amendments clarifying how election laws apply online are a positive step, she emphasized that implementation remains a challenge.
Within Bosnia’s complex political structure, the High Representative has led the way in introducing digital accountability measures against hate speech and genocide denial. Continued implementation will require government coordination, which may vary across entities and cantons. Three decades after the signing of the Dayton Peace Agreement, unregulated social media platforms risk undermining institutions, deepening societal fractures, and eroding human rights standards, as the testimonies at the Tribunal confirmed.
Social Media Platforms and Civil Society
One such testimony came from Sophie Zhang, a former Facebook employee and whistleblower. “Facebook responds to public pressure,” Zhang said, which raises the question of what happens when such issues fail to attract enough attention to elicit a response from the platform.
Kovačević emphasized the need to build partnerships between tech companies and local organizations. She explained that platforms like Meta should consult with grassroots actors, especially ahead of elections, to understand risks and anticipate problems. These relationships, she argued, “should be consistent and should act preemptively.”
“While communication with platforms has improved, it varies across companies. Media organizations often struggle to know who to contact because transparency is limited,” she explained, adding that language barriers can further complicate communication.
Global Lessons from the Tribunal
The Tribunal demonstrated that Bosnia’s challenges are not unique. Witnesses detailed Facebook’s failure to act during Myanmar’s ethnic cleansing of the Rohingya. As Nay San Lwin testified, “Facebook’s failure to act allowed the violence to continue, as hate speech was promoted, and the platform became a tool for the spread of genocide.”
Families of young victims of online harm, including sextortion cases that ended in suicide, also testified. “There are victims who do not survive this…Victims are blamed when in reality the blame belongs to the perpetrators, the platforms, and the systems that allow this to continue,” said one anonymous witness.

Locally, Kovačević observed that hate speech, denialism, and other harmful content such as online misogyny reflect the societal divisions in BiH: “Social media largely reflects the language of the public sphere, which is in a post-conflict, deeply divided society.”
To address existing shortcomings, UNESCO BiH formed a Coalition for freedom of expression and content moderation, currently led by Mediacentar Sarajevo. The alliance will enable civil society to participate in decisions about online content regulation and create de-escalation channels for the most severe cases.
Along with Mediacentar, several other local organizations in BiH and the Western Balkans are advocating for and addressing these issues. In May 2025, Sarajevo’s annual POINT conference brought together activists, journalists, and civil society organizations to discuss transparency, digital security, disinformation, and platform accountability. A notable topic was the 2022 EU Digital Services Act (DSA), which strengthens regulatory guardrails. However, because BiH is still only an EU candidate country and not a member state, it remains outside the DSA’s reach.
To conclude, Maria Ressa warned: “Big Tech has hacked our biology, changing the way we feel, act, and vote—without accountability. Without guardrails, truth dies, and democracy crumbles. We need urgent regulations to reclaim facts, restore trust, and protect our future.”