2024-06-13, Room D – Water Studio
In 2022, ECNL and SocietyInside developed a framework for meaningfully engaging external
stakeholders, especially civil society and affected communities, in developing and using AI
systems as well as assessing their impacts on human rights. Our framework for meaningful
engagement (FME) is a valuable tool for digital platforms, not only for including external
stakeholders as they build and deploy AI systems internally, but also for meeting their broader
human rights due diligence and compliance obligations.
The goal of our collaboration is to test the relevance and usefulness of the FME by applying
it practically to Discord’s real-world use of AI. Discord is a voice, video, and text chat app
used by tens of millions of people ages 13+ to talk and hang out with their communities and
friends; the company builds reliable technology for staying close. The pilot will focus on the needs of
Discord’s Safety ML team as they build models and integrate large language models (LLMs)
into child safety content flagging, user education, and moderation.
Our goal is to ensure that LLMs are used for content governance in a way that protects and
promotes civic space and human rights.
As stakeholder engagement is at the core of this pilot, we want to utilise Mozfest’s unique
participant base to conduct a practical on-site consultation and gather participants’ feedback
on the key issues. We are especially excited about engaging a diverse set of
stakeholders globally, from experts in digital rights and AI to marginalised groups and those
with lived experience.