11/09/2023
Live streamed femicide in Bosnia and Herzegovina sparks calls for urgent improvement of content moderation
Following a horrific case of femicide in Gradacac on August 11th, which was livestreamed on Instagram and subsequently shared on other platforms, the Coalition for Freedom of Expression and Content Moderation wrote a letter to the social media company Meta, demanding an explanation of why the video was left online for more than three hours after the event and urging the company to improve its content moderation practices. The letter was signed by 18 organisations and individuals who are members of the Coalition.
Read the entire letter below.
On August 11, 2023, a horrifying femicide took place in the town of Gradacac, Bosnia and Herzegovina. The brutal murder of a woman, shot by her former partner in front of their nine-month-old daughter, was streamed on Instagram Reels. The video first appeared online at 11:20 a.m. and stayed up for over three hours. During that time, the murderer killed two more people and wounded three others, while continuing to upload content to Instagram, including a live address in which he confessed to the additional killings.
During the time it was online, the video was seen by tens of thousands of people, and was easily accessible to vulnerable groups such as minors and people who are themselves victims of violence. No label ever appeared on the video warning users that it contained extreme violence. The video was liked by hundreds and downloaded by an unknown number of people before it was taken down. By the time the perpetrator's Instagram profile was shut down, it had gained thousands of new followers. The two videos – one showing the murder of Nizama Hecimovic, the other showing her killer boasting about the murders he committed – have since been re-uploaded to other video sharing and social media platforms countless times, multiplying their horrifying effects and disturbing and retraumatizing the victims' family members and the general public.
The level to which the public has been appalled and traumatized by this event is unprecedented, and the social media companies' role in that cannot be overlooked. It is highly disturbing that such extremely violent content remained online for three hours despite being reported through all available mechanisms – by the platform's users, by civil society organizations from Bosnia and Herzegovina, and by local authorities. The local authorities, who are currently investigating the criminal responsibility of those who supported and cheered on the murderer on social networks, have already stated publicly that Meta's response time was too long given the severity and urgency of the situation. Furthermore, there appears to be no clear and efficient cross-platform mechanism to identify and stop the spread of this content on any and all social networks. The videos have been repeatedly uploaded to TikTok, Telegram and YouTube, as well as Facebook and Instagram, alongside a new type of disturbing content that uses AI to generate testimonials from both the perpetrator and the victim. We have already seen a similar proliferation of content that de facto turns perpetrators into celebrities in another unprecedented case that shook the entire region: the mass shooting at a Belgrade elementary school, whose perpetrator – himself a minor – gained an almost cult following on video sharing platforms like TikTok. All of this can contribute to the "contagion effect" that raises the risk of further violence – something we have already witnessed and fear will happen again.
As a Coalition of civil society organizations, academic institutions, journalists' associations, self-regulatory media bodies, media associations, activists, media and digital literacy organizations, human rights activists, critical thinkers and experts, we send you the following requests:
- Meta needs to inform the public in Bosnia and Herzegovina why it took more than three hours to remove such extreme content from its platform, and what concrete steps the company will take in the aftermath of this case to improve its response time in such situations. The public and the families of the victims have the right to know why the mechanisms for reporting problematic content failed to deliver a swift response when it was needed most. The platform's users also need to be provided with efficient reporting mechanisms that guarantee proportionate and timely results, reducing the negative impact this kind of content can produce.
- Social media companies on whose platforms this content has been uploaded and re-uploaded need to provide insight into the resources they allocate to reacting to harmful content published in the languages spoken in Bosnia and Herzegovina (Bosnian, Serbian, Croatian), as well as detailed information on how reports of harmful content are processed. We have strong reasons to believe that this situation would have been far less severe had the content been uploaded in a different, English-speaking market, where the reaction could be expected to be faster, more systematic and more efficient. With that in mind, we believe the public in Bosnia and Herzegovina and the region is entitled to information about the actual content moderation capacities in this region, including:
  a. the number and roles of local language native speakers on the teams that make decisions on takedowns or other actions on content created or shared in this region;
  b. their locations with respect to time zones that may hamper response times in Bosnia and Herzegovina and neighboring countries.
- In light of this event, we urge that moderation mechanisms be immediately improved and that the human component of content moderation be strongly reinforced. This case is a painful and alarming indicator that the currently established mechanisms and allocated resources do not suffice, and that reliance on automated moderation has proven entirely ineffective in a situation where a quick and proportionate response was of utmost importance. Ineffective content moderation mechanisms; inconsistent application of Community Standards and of companies' other proclaimed policies; lack of transparency in decision-making on the moderation of harmful content – these problems have long been recognized by multiple actors in Bosnia and Herzegovina's civil society, and they continue to enable online harassment, create security threats to individuals, and in some cases severely affect freedom of expression. The traumatizing case we recently witnessed is a long overdue alert that the situation must change, and that context-sensitive, appropriate, consistent and transparent mechanisms must be established to tackle these issues.
- As a Coalition committed to safeguarding freedom of expression, we demand a direct line of communication – a designated contact person working in the CET time zone within the moderation teams of Meta, YouTube, TikTok and Telegram – whom we can alert swiftly, confident that our reports and queries will be immediately processed and answered by qualified personnel. By understanding the above aspects of content moderation, we as a Coalition can contribute more effectively to reducing the impact of harmful content in similar situations.
We also urge you to consider the above requests in light of the upcoming implementation of the Digital Services Act (DSA), most notably its requirements for countering illegal content through effective notice-and-action mechanisms, cooperation with national authorities and trusted flaggers, and transparency obligations. As a European country and an EU candidate country, Bosnia and Herzegovina and its citizens should benefit from the same level of commitment to protecting their fundamental rights online as the rest of the continent.
The Coalition for Freedom of Expression and Content Moderation in Bosnia and Herzegovina was established to foster multi-stakeholder engagement and cooperation with social media platforms, other tech companies and state institutions, in order to ensure a free, healthy and safe online environment for all citizens of Bosnia and Herzegovina. It is an informal, voluntary coalition of 18 members: civil society organizations, academics, journalists' associations, self-regulatory bodies, and human rights and media freedom activists and experts committed to safeguarding freedom of expression and improving content moderation:
Analiziraj.ba
Balkan Investigative Network in BiH (BIRN BiH)
Balkan Investigative Regional Network (BIRN HUB)
BH Journalists Association
Center for Investigative Reporting (CIN)
Civil Rights Defenders
Fedja Kulenovic
Foundation Infohouse Sarajevo
Lejla Huremovic
Marin Culjak
Mediacentar Sarajevo
Post-Conflict Research Center
Press Council of Bosnia and Herzegovina
Sarajevo Open Centre
University of Banja Luka
Vuk Vucetic
Youth Centre KVART
Zasto ne Association