Language: English
Content moderation on major social media platforms (Facebook, Instagram, TikTok, X, and YouTube) has been shown to be biased against content on sexual and reproductive health and rights (SRHR). Organizations and activists in the field face multiple restrictions, most notably post removals, ad rejections, and account suspensions. Platforms often attribute these restrictions to violations of their community guidelines and advertising policies, especially those related to “adult” and “sexual” content.
This session is essential because research on this topic remains limited, which hinders advocacy efforts to push tech companies to amend existing policies and algorithms.
The session presents and discusses the results of a six-month research project conducted across SMEX’s units, “From Sharing to Silence: Assessing Social Media Suppression of SRHR Content in WANA.” The project investigated social media moderation processes and policies through credible examples, supported by desk analysis, survey input, and quotes from qualitative interviews.
The research finds that platforms restrict innocuous content, even content shared for educational, scientific, and artistic purposes. This contradicts the platforms’ own stated exceptions for such content when enforcing their policies on “sexual” and “adult” material. Interviewees accused platforms of bias against content from the region, particularly when posted in Arabic. These restrictions push organizations and activists to self-censor and to adopt content-creation tactics that evade platform censorship.
In WANA, discussing a range of SRHR topics remains taboo and can carry serious consequences, including prosecution, imprisonment, and harassment. Platform censorship further hinders the ability of organizations, activists, health professionals, artists, and others active in the SRHR field to express themselves freely, share and spread essential information, reach their target audiences, and raise awareness.
As the Technology Unit Lead at SMEX, Samar is a network and software engineer with over seven years of experience in tailoring software and providing integration, implementation, and support. She manages SMEX's Digital Security Helpdesk, offering support to activists, journalists, and lawyers on digital security matters, and has led the Helpdesk's establishment, developing its processes, procedures, handling protocols, and digital wellbeing policies. She also leads various projects within the unit, including assessing digital wellbeing for partner organizations and establishing a Digital Forensic Lab.
Moussa is a digital and human rights activist whose work focuses on wellbeing and adapting feminist approaches.
Nate works as a research coordinator for SMEX. His main areas of interest include global politics, current events, spyware, and internet accessibility.
