"A feminist dictionary in AI"
Anastasia Karagianni;
Discussion
In this session, the deliverable of the Trustworthy Artificial Intelligence working group's project 'A feminist dictionary in AI' will be presented. Specifically, participants will be introduced to the dictionary toolkit of gender-inclusive terms and concepts that can be used as a reference in the AI era. Participants with both legal and technical backgrounds will then be engaged in the session's case scenario: the design and training of a gender-inclusive algorithm.
More particularly, participants will be split into two break-out rooms (the legal and the technical break-out room, respectively). Each group will work further on the project by providing feedback from its own perspective (e.g. what term/concept is missing, what can be explained further, etc.). This part will last 20 to 30 minutes. After that, all participants will meet in a plenary session and collaborate to apply the gender equality and non-discrimination principles to a specific case scenario, the design and training of a gender-inclusive algorithm, using the dictionary. They will have to find ways to overcome any difficulties during this process. In this way, gaps between legal theory and practice will be bridged.
"Africa Mradi Town Hall"
Noémie Hailu, Chenai Chair, Kofi Yeboah, Roselyn Odoyo, Shandukani Mulaudzi;
Discussion
This session will include Africa Mradi program colleagues who will briefly share their work and the goals of the Mradi project. We will then invite the community to ask questions, share the insights they have on the overall project focus, and offer comments on the innovation ecosystem in Africa.
"African Women in Artificial Intelligence Project"
Favour Borokini;
Discussion
Today, conversations about the significance of Artificial Intelligence have expanded from a narrow focus on its importance to the economy to the potentially negative impact of the technology on social justice and equality.
In Africa, a major aspect of the conversation involves the exclusion of women from the development of AI and the resulting impact on the lives of African women. The absence of alternative voices in an AI industry dominated by white men also speaks to the problem of exclusion and the deprioritisation of the needs of African women. This lack of proximity means that products and services being created through Artificial Intelligence do not centre the (pressing) needs of African women, since they are not represented in decision-making spaces.
The goal of this session is to further discussion on Pollicy’s African Women in Artificial Intelligence Project (AWiAIP) which launched in October 2021 to spark conversation about Artificial Intelligence and Gender Equality in Africa.
The project aims at initiating conversations with AI hub leaders and innovators from the private sector, academia, and civil society organizations across Africa as well as gender equality leaders on the continent.
Through this project we plan to provide insight, through our research, into the state of women's participation in the African Artificial Intelligence development space and the beneficial and harmful effects of Artificial Intelligence integration on women.
"Afro Algorithms"
Anatola;
Social Moments
You're invited to the world premiere of "Afro Algorithms". This 3D animated short film in the Afrofuturist genre explores the topics of AI and bias. In a distant future, an artificial intelligence named Aero is inaugurated as the world’s first AI ruler. But Aero soon learns that important worldviews are missing from her databank, including the experiences of the historically marginalized and oppressed. A slate of well-known Black artists lend their voices to the film, including Robin Quivers, Hoji Fortuna, and Ava Raiin.
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
Join the cast + creators of Afro Algorithms following its world premiere by registering for the interactive panel discussion on Thursday, March 10th, 2022: https://schedule.mozillafestival.org/session/QU8PSP-1
"Afro Algorithms Panel Discussion"
Anatola, Stephanie Dinkins, Robin Quivers, Ava Raiin, Hoji Fortuna, Duru Azubuike, Nkosana Ngwenya;
Discussion
Join the cast + creators of the 3D animated short film, “Afro Algorithms”, following its world premiere at the Mozilla Festival. Moderated by superstar AI artist, Stephanie Dinkins, this interactive discussion will explore how Afrofuturism can be used to shift socio-cultural narratives, the art of 3D animation, and how we can use emerging technology such as artificial intelligence to create a future we truly want to see.
About “Afro Algorithms”: In a distant future, an artificial intelligence named Aero is inaugurated as the world’s first AI ruler. But Aero soon learns that important worldviews are missing from her databank, including the experiences of the historically marginalized and oppressed. Learn more at www.afroalgorithms.com.
Moderated by: Stephanie Dinkins
Featuring: Anatola Araba, Robin Quivers, Ava Raiin, Hoji Fortuna, Duru Azubuike, and Nkosana Ngwenya
"A Future of Internets: From Energy Literacy to Fossil-Free Imaginaries"
FIBER;
Discussion
As a kick-off of FIBER’s Reassemble Lab ‘Natural Intelligence’, this public programme will invite artists, technologists, researchers and policy makers to reflect upon the possibilities of a fossil-free internet based on ecological ethics and regenerative principles.
What role can design, creative coding and artistic research play to envision and prototype a fundamentally different way of adapting our technological demands to natural cycles? What does it mean to learn from, adapt to, and work with natural intelligence, and how to imagine, articulate, practice and connect initiatives towards fossil-free futures?
This first part of the programme will discuss the themes of Energy Literacy and Fossil-Free Imaginaries with Marloes de Valk (software artist, writer and PhD researcher at the London South Bank University), Michelle Thorne (Sustainable Internet Lead at the Mozilla Foundation and co-founder of Branch Magazine), Shayna Robinson (futurist, tech environmentalist, Program Officer for the Internet Society Foundation’s Research and Innovation Programs) and Abdelrahman (Abdo) Hassan (data science practitioner, activist and poet).
Watch the session in the embedded feed. Or go directly to https://www.youtube.com/embed/OOppK1or6RY
If you want to join the discussion via the chat, please access the event via YouTube here: https://www.youtube.com/watch?v=OOppK1or6RY
"AI Reckoning: A Dialogues & Debates Panel"
Chenai Chair, Adam Bly, Anasuya Sengupta, Sasha Costanza-Chock, Roxann Stafford;
Community Plenary
How do we refocus and realign AI as a mechanism for systems thinking and manifestation, revealing the correlation and connectedness of all things? Can AI ever be the people’s tool? Can AI hold the intelligence of abolition, or are these machines beholden to learning from/within existing systemic structures? How can we invest in communities who are ensuring that the global majority thrives, that earth thrives? Is it time for an AI reckoning?
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"AI Reform: A Dialogues & Debates Panel"
Solana Larsen, Mohamad Najem, Dr. Samantha Bradshaw, Odanga Madung;
Community Plenary
In 2022, elections are planned in Lebanon, Kenya, Brazil, Philippines, France, US and beyond. What has become of elections and democracy in the age of digital platforms? Are we entering a new era of governance that will be shaped by the way platforms drive and impact campaigns, movement building, national narratives and political discourse? Will democracy survive AI?
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"Analyzing shadow-ban on TikTok: the "TikTok Observatory""
Salvatore Romano;
Extended Workshop
The workshop will show the participants how to collect and analyze data to investigate TikTok's algorithm.
In the U.S., there is evidence that TikTok has become a hub for political disinformation. Even more concerningly, there are suspicions that the algorithm is explicitly biased to align with the political interests of the Chinese Communist Party (CCP).
These accusations are serious, but they only rely on leaked information or anecdotal evidence, and the company denies those claims. The system remains entirely opaque, and there are currently no tools or mechanisms for civil society and regulators to collect reliable evidence to hold the platform accountable.
The TikTok Observatory is an open-source platform to monitor TikTok's recommendation algorithm behavior and personalization patterns.
It enables researchers and journalists to investigate which content is promoted or demoted on the platform, including content regarding politically sensitive issues.
With the browser extension installed, every TikTok video watched from that browser is saved on a personal page, as well as the suggested videos. Later on, the investigator can retrieve the evidence in CSV format by using the public API to compare two different profiles.
In the tutorial, we'll collect data and then demonstrate our toolchain for data analysts (based on Gephi and Python notebooks), or a more straightforward tool such as LibreOffice.
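As a rough illustration of the kind of comparison the CSV exports make possible, here is a small Python sketch that loads two exported profiles and measures how much their suggested videos overlap. It is not part of the TikTok Observatory toolchain itself; the file names and the "videoId" column are assumptions made purely for illustration.

```python
# Hypothetical sketch: compare two TikTok Observatory CSV exports.
# The file names and the "videoId" column are assumed for illustration only.
import csv

def load_video_ids(path: str) -> set:
    """Read one CSV export and return the set of video IDs it contains."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["videoId"] for row in csv.DictReader(f)}

def jaccard(a: set, b: set) -> float:
    """Overlap between two sets of recommendations (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

if __name__ == "__main__":
    profile_a = load_video_ids("profile_a_export.csv")  # placeholder file names
    profile_b = load_video_ids("profile_b_export.csv")
    print("Only recommended to profile A:", len(profile_a - profile_b))
    print("Only recommended to profile B:", len(profile_b - profile_a))
    print("Jaccard overlap:", round(jaccard(profile_a, profile_b), 2))
```

A low overlap between two freshly created profiles that behaved differently is the kind of signal that can point to personalization; the same sets can also be loaded into Gephi or LibreOffice for further analysis.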
"A New Open GLAM Program at Creative Commons"
Brigitte Vézina, Camille Françoise;
Discussion
GLAMs’ public interest mission is squarely aligned with the open access ethos. Indeed, making their collections as openly accessible, shareable, and reusable as possible is the best way for GLAMs to achieve their mission as they digitize and offer their collections online. But only a tiny fraction of the world’s GLAMs share their collections through open access initiatives, and they face a host of barriers to embracing open access. At Creative Commons, we want to help GLAMs overcome these barriers and unlock universal access to knowledge and culture. Our Open GLAM Program will build a coordinated global effort to help GLAMs make the content they steward openly available and reusable for the public good.
"An Overview of Digital Rights in the Arabic-speaking Region by SMEX"
Mohamad Najem;
Discussion
During this session, Social Media Exchange (SMEX), a Beirut-based organization, will provide an overview of the digital rights landscape in the Arabic-speaking region. Over the last 5 years, we’ve witnessed a dramatic increase in threats to digital rights and freedoms as more legislation has sought to shrink the space for free expression online in many repressive countries. At the same time, partnerships between big tech companies and authoritarian regimes, and the latter’s growing investment in new surveillance technology, have rendered the digital sphere dangerous for human rights defenders and civil society at large.
In 45 minutes, we will focus on a primary project in our research department, Cyrilla.org: an open database of digital rights laws from around the world with a focus on the Arab region. We will also introduce the Arab Alliance for Digital Rights, which supports people’s efforts to use the internet as a means to organize and advocate for their online rights and freedoms. In addition, we will highlight the role of our Digital Security Helpdesk, which supports members of civil society facing digital threats. By the end of this session, participants will leave with more insight into our advocacy work and how they can contribute.
"Assessing representation and diversity in movies and books at scale"
Julie Ricard;
Workshop
In this session, we will discuss options for building systems that allow us to systematically assess representation and diversity in books and films. We work under the hypothesis that diverse representation in the authorship and content of books and films is paramount to enriching perceptions and enhancing positive participation in social and political processes. To that aim, we created Eureka (eureka.club), the first online book and film club turned social media platform, with a social impact goal: activate minds to transform the world. Eureka is designed to leverage both literature and cinema as vehicles to promote discovery and learning about key social, political and environmental topics. In this session we use Eureka as a motivation for and example of the broader need to systematically assess and promote representation in books and films, and to think collectively about feasible technical approaches to achieve it.
"A Zine for more Sustainable Food Systems"
Sarah Kiden, Jaewon Son, Georgine Obwana;
Extended Workshop
The workshop is inspired by ongoing research by the co-facilitators on the role of technology in contributing to food waste reduction and climate change solutions through community-based practices. Technology has been playing a role in the discussion about climate change, and particularly in reducing food loss and waste. There have been various technology solutions: blockchain for tracing the food supply chain, the Internet of Things (IoT), sensors and big data for providing insight into farming practices, drones for surveying land, and artificial intelligence (AI) for detecting diseases in crops.
Among the diverse measures to curb climate change impacts, food waste is one of the easiest problems to address. There are various solutions that can be implemented through simple practices and choices at a small scale (as individuals) or within our neighborhoods by encouraging food sharing.
In this zine-making workshop, participants will work in small groups to co-create a digital zine that shares sustainable practices that can be adopted as individuals, in our homes or communities to reduce food waste. In the share back session, groups will share their ideas, which will be compiled into a zine that will be shared with all participants after the workshop.
"Blockchain Fairy Tales: What if Happily Ever After is not Guaranteed?"
Char Simpson, Lance Weiler;
Extended Workshop
The mission of the Columbia Digital Storytelling Lab is to explore new forms and functions of storytelling. In that spirit, Blockchain Fairy Tales is an immersive storytelling experience made by many. The project explores a shared visioning of the future through collective world-building and decentralized technologies, inviting participants to step into an enchanted world powered by collaborative storytelling, play and design. In this special session at the Mozilla Festival, those formerly known as the audience will become architects of the futures. Together they will craft a world that connects seven generations over space and time. Over the course of two hours, their collective ideas will come to life as a series of co-created stories as they face the question: what if Happily Ever After is not guaranteed?
"Building Ancestral AI: A Dialogues & Debates Panel"
J. Bob Alotta, Toshi Reagon, L. Franklin Gilliam, Amelia Winger-Bearskin, Céline Semaan;
Community Plenary
If we lay down the burden of a future-only presumption of what technology is and can be, are we more likely to center people and the environment and develop technologies through which we can survive and thrive? How do we develop strategies for the futures we desire and/or will likely face? In this panel we invite artists, activists, movement strategists, and scientists who are deeply rooted in cultural traditions and ancestral technologies while creating robust present and future tenses.
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"Building civil society capacity for combating disinformation in Eastern Africa"
Catherine Muya;
Discussion
ARTICLE 19 recently conducted a study on content moderation in Kenya, focused on identifying the main challenges in dealing with information disorder. The study highlighted gaps among different stakeholders in identifying and dealing with disinformation. Focusing on civil society, we shall highlight these challenges as well as share ideas on how to bridge these gaps. The discussion will also feature our partners in the region, who will share their work, experiences, and best practices in dealing with disinformation. At the end of the session, we hope participants will be able to identify helpful resources, including tools that can be used in research or advocacy strategies to amplify their work on disinformation.
"Building Data, Together"
Sydette Harry;
Extended Workshop
Data sets, and how they work, are a pillar of our lives and of most businesses. Mozilla Rally is part of a growing movement of researchers, technologists, activists, and others looking to build new tools for creating data sets.
Using selected case studies, workshop participants will learn principles and practices of data collection to understand how data is collected and analyzed for products and platforms. With members of the Rally team, attendees will co-design scenarios and data sets that are focused on consent and utility for participants.
"Censura, desinformación y manipulación en el ciberespacio venezolano"
David Aragort, María Virginia Marin, Adrián González;
Discussion
This session will analyze the system of censorship, disinformation and manipulation that the Venezuelan regime maintains on the Internet, and the importance of supporting initiatives and projects that expose and confront these practices in order to overcome the regime's communications hegemony and ensure that citizens can access truthful and timely information, allowing them to make responsible decisions and contribute to the country's democratic recovery.
"Cities for Digital Rights Helpdesk & Governance Framework"
Milou Jansen, Paula Boet;
Workshop
During MozFest 2020, the CC4DR hosted a session on the concept of a Digital Rights Helpdesk for Cities. One year later, this has evolved into a fully planned project that kicks off in 2022, with a first version of a digital rights framework open for feedback. The project is now also in search of advisors/experts who can assist the cities with their questions.
"Closing Circle"
Sarah Allen, Dzifa Kusenuh;
Community Plenary
Join us as we reflect on what happened over the last 5 days, and say thanks to those who helped make this event happen.
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"Co-creating and citizen science with AutSPACEs"
Georgia Aitkenhead;
Workshop
AutSPACEs is a community-led citizen science project to investigate how sensory processing differences affect the ways autistic people navigate the world. It is supported by The Alan Turing Institute, Open Humans, and Autistica. The platform is designed for autistic individuals and their supporters to share their experiences of the world so that the autistic community, allies, and researchers can understand relationships between sensory processing and autism. This session will discuss ways in which open-source technologies can support the development of a citizen science project while prioritizing the wellbeing and empowerment of its neurodiverse community.
3 members of the AutSPACEs community will run a discussion session centring around diverse inclusion and empowerment. After an overview of AutSPACEs, we will each present on an area of the project. We will then facilitate a parallel series of discussions allowing participants to follow their interests according to the following prompts:
- What is genuine empowerment?
- How can existing open-source tools be used to support inclusive co-created projects?
- What considerations need to be made to ensure that open-source projects maintain the safety, privacy, and wellbeing of neurodiverse communities?
- How do we create a welcoming, inclusive online space for neurodiverse people?
- How can we develop infrastructure to ensure that open-source platforms and their content are used in the ways that the communities on those platforms would wish?
- What needs to be done to support informed consent and data ownership within neurodiverse communities interacting in online spaces?
"Co-Creating Network-Centric Resources"
Dirk Slater, Heather Leson;
Extended Workshop
When we build knowledge assets and resources specifically for communities or networks, how do we ensure they will be useful, useable and used? During this session, we will explore co-creation methods and processes that give our products and content users and vitality well beyond launch.
"Code a game using MakeCode Arcade"
Pam O Brien, Chris Reina;
Workshop
In this hands-on practical workshop, participants will learn how to design and code a game using the MakeCode Arcade development environment. They will be introduced to game concepts such as sprite creation and movement, game physics and collision detection. No prior knowledge of coding is required; participants will be guided through the process of designing, developing and testing a game.
"Come and Sing at MozFest!"
James Sills;
Social Moments
Experience the joy of singing with James Sills of The Sofa Singers.
James will be leading a 60 minute session in Spatial Chat where participants will learn two classic songs with some optional harmonies/backing parts.
The session will include a quick warm-up, and with everyone singing on mute, there’s no need for nerves or to feel self-conscious. You can truly sing like no one is listening but feel part of a large choir when you see your fellow singers! No experience is necessary and participants will leave feeling empowered and energised. Sing while you cook or work, or just enjoy the experience as you are!
Our songs, chosen by popular vote, are Whitney Houston's 'I Wanna Dance with Somebody' and Bob Marley's 'One Love'.
"comun.al: A History of Endless Possibility"
Alex Argüelles;
Discussion
Like many other countries in the world, Mexico was hit by severe impacts of the pandemic, which also aggravated the social, economic, and digital divides among its population. As gender-based violence rates rose, as LGBTQ+ people took to the streets to protest, as feminist movements looked for spaces to embrace their communities, as grassroots human rights defenders opposed militarization, and as people kept looking for justice in different parts of the country, technologies became a crucial part of how we can imagine other possibilities for decentralization, community building, resilience, and transformative justice movements.
comun.al, a digital resiliency lab created in 2021, has been looking for and accompanying some of these people willing to harness community and create more nurturing, collaborative, and fair futures. Join us as we share our story. Let’s dream together of endless possibilities!
"Content moderation on the Internet: let’s play “Social Media Against Humanity”"
Federica Tortorella, Augusto Mathurin;
Workshop
The host will present about 10 controversial social media posts (real or invented), for example a tweet with unverified information about COVID-19.
These posts are presented individually: the host gives a quick introduction and then the poll starts. Each participant from the audience will have to choose one of the following 4 options:
A) Although controversial, the content is licit and should be kept online.
B) The content shouldn’t be taken down, but it should have some warning label, or its author should get a strike.
C) The content shouldn’t be online. It must be deleted.
D) The content must be deleted and its author should be reported to the authorities.
Then, we show the results, and participants have some minutes to express their opinions about how similar content should be treated and moderated.
At the end, we will present the posts that received the most divided answers, and there will be time for an open mic about the challenge of doing content moderation when every person has different opinions and views about what should be kept online and what shouldn't.
The facilitators represent different sectors and backgrounds in the Internet ecosystem, and they will express their opinions throughout the session. This will encourage participation from the audience. In this session, everyone will have the same opportunity to turn on their mics and raise their voices.
The workshop will be conducted in English, but we will offer material and instructions to let Spanish speakers participate in the game.
"Corona Tech - Behind The Scenes"
Siham El Yassini, Reinout Hellenthal, Beryl Dreijer, Anne-Maartje Douqué, Paul Bontenbal, Theo Kremer, Coen Bergman;
Social Moments
The documentary gives a behind-the-scenes insight into the Municipality of Amsterdam's use of digital technologies at the beginning of the COVID-19 pandemic. It is followed by a Q&A with the documentary maker.
"Creating a Fair Data Future"
The Data Values Project, Karen Bett, Janet McLaren, Kate Richards, Amy Leach, Tom Orrell, Leonida Mutuku, Yudhanjaya Wijeratne;
Workshop
Imagine it’s 2035, and we live in a world where everyone, everywhere has true agency over their data. Rather than corporations or governments, people determine what data is collected on them and how it is used. In this future, people are harnessing data and AI to create more inclusive and fair societies.
This workshop will explore routes to individual and collective data agency*, envisioning a world where this is realized. We will co-create a future world scenario that reflects diverse perspectives on what ideal relationships between people and data look like, and then look back to create paths from this future to the current day.
By framing this conversation around tomorrow rather than today, we aim to uncover creative ideas that match the complexity and urgency of reshaping our data agency. Through our facilitators and small group discussions, we will create an open dialogue which actively centers the perspectives of those who are excluded, marginalized, or harmed through current data practices.
Insights and ideas shared here will contribute to the Data Values Project, an ongoing consultation with people and organizations in global development to define a common agenda on data ethics, rights, and governance. We encourage participants to engage with the project beyond the workshop through contributing to ongoing focus groups, publications, and more.
*By agency, we mean that people—as individuals or as groups—shape and control what personal data is collected and how, for what purposes, and how and with whom it is shared and used.
"[Creative Media Awards] Rethinking Data Governance Models Through Arts and Media"
Kofi Yeboah;
Discussion
AI systems deployed in our societies are known to perpetuate existing societal inequalities. This is because the data gathered, stored, managed, and used reflect the priorities and biases of those with the power to extract and own the data. In this session, Mozilla’s new Creative Media Award awardees will highlight how their projects can address this problem. Their projects include web-based games exploring personal data values and the data power imbalance between online sex workers and tech companies; using ‘natural intelligence’ rather than datasets to make decisions; and animated video exploring how datasets might be ‘rewilded’ to overcome prejudicial biases. The Awardees come from Zambia, Italy, the UK, Belgium, the USA, South Africa, and Australia.
"Cultivating Collective Intelligence: Mis/Disinformation Edition"
Sophia Bazile;
Fringe Events
Join us for this session in cultivating collective intelligence as we scan together and collect signals about everything mis/disinformation. We'll be introduced to a scanning framework and use some tool(s)/methods that enable us to take on different perspectives, look at current and emerging trends, and peel back some layers to better understand what underlies this current milieu of "alternative facts" and "truths".
Please note: This session is best accessed from a laptop/desktop as we'll be using Zoom and a Mural as our virtual whiteboard.
Our scan hits and discussion notes will contribute to the Myths and Metaphors of Mis/Disinformation session taking place on March 9th. Join us to continue building!
"Data commons: rule patterns and infrastructure"
Robert Goené, Keye Wester, Socrates Schouten, Karina Meerman;
Extended Workshop
With the growing awareness of the effects of the privatisation of data, the quest for data governance models that are designed with public values in mind is thriving. The work on the governance of the commons provides a promising point of departure for the discussion on the collective management of data.
In this session, we will focus on the operational questions concerning data commons: what are possible rules for gathering, transforming, accessing and sharing data?
Stakeholders determine the rules for a specific setting collectively, and the process for determining the rules in specific circumstances is central in the study of the commons. We will leave the organisational and legal questions aside and focus on the rule patterns for the data commons. This is a more technical matter, especially where the enforcement of these rules in data infrastructure is concerned.
We welcome contributions from different angles: work on data access rules, but also broader work on distributed data and computation infrastructure can be highly relevant.
"Decolonized AI Futures - Space Event"
Decolonized AI Futures Wranglers;
Social Moments
The goal of the event is to foster a meaningful dialogue on how to decolonize the development, usage, and understanding of AI. AI is being deployed and adopted globally; however, there needs to be more AI research by individuals and groups outside the Western world.
Some of our space facilitators will host lightning talks to raise awareness on their sessions during the festival.
We have a Miro Board too. This will be one of the outcomes for the Decolonized AI Futures space. Kindly join us on the Miro board to share your thoughts, comments, and ideas. Using your preferred format (text, images, gif, video sticker, emoji, etc), we invite you to share a response to the following prompt questions on the board.
- With AI systems embedding Western values by default, how can more diverse peoples and values change the future of AI?
- How does this impact our lives around the world?
- How can the global majority be equally included in AI policy and consumption?
"Defeating Deceptive Design: Getting Control of Our Online Lives"
Kaushalya Gupta, Ame Elliott, Carlos Iglesias, Georgia Bullen;
Discussion
Deceptive designs, or “dark patterns”, are tricks built into the interfaces of apps and websites designed to obscure or impair consumer autonomy or choice and alter decision-making to lead us towards actions we might not otherwise take. Like when companies make it easy to subscribe to a service but near-impossible to cancel. Or when you have to jump through endless hoops to tell a service not to scoop up and sell your personal data.
• Who is this session for? This session invites everyone — policymakers, civic tech practitioners, private sector, civil servants, UX designers, and the people of the internet.
• Session structure: We will share findings from our initial research on deceptive design as a baseline. We invite you to share your lived experience, discuss opportunities and barriers to change, and brainstorm interventions for addressing deceptive design globally using our human-centered approach.
• Workshop goal: We are using this tested approach to better understand the many challenges around deceptive design and co-create product and policy solutions that promote trusted design patterns.
Deceptive Designs go to the heart of people's ability to live their lives online with dignity, autonomy, and a sense of trust in the products and services. Join us to help shape the priorities of the Web Foundation’s Tech Policy Design Lab on Tackling Deceptive Design and Moving Towards Trusted Design Patterns. Learn more https://techlab.webfoundation.org/deceptive-design/overview
"Developing Agency over Media and Media Technologies"
Carolyn Malachi, Dr. Stacey Patton, and Dr. Christine McWhorter;
Discussion
Digital media technologies have been vital in the creation and dissemination of narratives. Portrayals of African Americans, their experiences, and their histories have been disregarded. Moreover, college students' immersion in these technologies has compromised their ability to critique societal structures and the media texts that portray them. Digital, financial, and media literacies among African American students are necessary, as development of these literacies enable them to create meaning in their lives and communicate their histories. This discussion will examine the process of enfranchising students of color to dismantle destructive hegemonic structures by developing agency over media and media technologies.
"Did AI Do That 2? Mitigating the potential harm of algorithmic amplification to users."
Kyla-Marie Greenway, Thiago, Jillian York, Anima Anandkumar, Ronaldo Lemos;
Discussion
This highly interactive session will focus on what can be done, from a content governance perspective, to mitigate the potential harm of algorithmic amplification of content to users. The workshop will have 4 discussion leads: Ronaldo Lemos (Oversight Board Member), Anima Anandkumar (NVIDIA), Thiago Oliva (Senior Policy Officer, Oversight Board), and Jillian York (EFF). The discussion leads will introduce themselves and a key update that they want to explore in breakout groups. Each breakout group will report back at the end of the session.
Potential breakout group questions include: how to address the potential harm of algorithmic amplification; whether platforms should be more transparent with users about content that has been amplified; and how to ensure the fair treatment of users given the potential threat that amplification/de-amplification poses to it.
"Digital Extractivism in the Global South: Where do we go from here?"
Neema Iyer;
Workshop
Digital extractivism is a term that describes how the colonial practice of extracting and exploiting colonies' valuable resources continues today through technology. Today, digital extractivism is visible in the promotion of neoliberal policies of privatization and commodification, in partnership with governments, much to the detriment of local economies and populations.
And just as colonial powers once built transportation networks that offered the quickest means of moving valuable resources to the metropoles, today Big Tech corporations erect similar virtual and physical structures, extracting value in the form of data, and often talent, to boost foreign economies.
Today, Africans and citizens of the Global Majority are awakening to the impact and significance of this sort of exploitation, but more still needs to be done to spread awareness about the issues and the solutions necessary to challenge the status quo.
What?
How can we combat the current state of events, as it relates to exploitation and extractivism being carried out through technology today?
Objectives
- To generate and stimulate discussion about the state of technology exploitation today, citing case studies from the Digital Extractivism report.
- To discuss challenges and opportunities for partnerships between civil society, government and the private sector working on this topic.
- To discuss potential policy and regulatory approaches to digitally enabled exploitation.
- To highlight case studies of successful resistance to digital extractivism.
"Digital Infrastructure Funder's Toolkit"
Ngọc Triệu;
Discussion
In this cafe-style session, we will share the Digital Infrastructure Funder's Toolkit - a project supported by the Open Collective Foundation's Digital Infrastructure Grant - with the community at MozFest. The toolkit is an implementation framework for funders of digital infrastructure with guides, programming, and models, along with narratives of funding in open source and digital infrastructure, and regional contexts about funding digital infrastructure in different parts of the world.
First, we will give an overview of the toolkit. Next, we will break out into groups with our participating global regional partners who will share out and lead discussions on digital infrastructure funding from their regional contexts.
"Digital Rights Talk - On Identity Online"
Jake Blok, Servaz van Berkum;
Extended Workshop
The topic of identity online has never been so relevant. We all live in a world in which we need to register for more and more services if we want to benefit from being online, not only when we go online via our laptops or phones but also when we want to access public spaces and even bars and restaurants. What happens with my data? Can I control who uses my data? Can I determine who I am online, and what does being anonymous actually mean in today’s digital world? With the rise of digital identity tooling, all sorts of governmental regulations in the making, large corporations providing the most popular applications, and privacy breaches everywhere, the dialogue on identity online needs to be taken further. Join us during this Digital Rights Talk. You can also access the session here: https://dezwijger.nl/programma/on-identity-online?flush=true
"Digital tools to fight gender violence"
Fabiola Maurice, Martha Tudón;
Discussion
In this session we will talk about “Circulo”, a safe digital communication channel for female journalists and human rights defenders (HRDs). Circulo was developed based on feedback from female journalists and HRDs in Mexico, Guatemala and Honduras, where these professions are very dangerous, especially for women, who face physical and digital gender violence on a daily basis.
We want to share the logic and challenges behind its development: first, the social, participatory and human rights approach we take when talking about gender violence against women journalists; and then, how we turned those insights into a technology tool designed from and for the people it seeks to help.
With the participants, we would like to explore the problems that arose in the contexts where the app is currently implemented, the main threats and technological limitations we identified during the research process, and the features we incorporated into the app as a result of the research and the feedback provided by our partners. Participants will also:
- Come up with alternative solutions to those challenges.
- Think of vulnerable groups facing gender violence or inequality in their regions and how Circulo could be incorporated into their safety protocols.
"Disinformation as a political weapon"
Julie Ricard;
Discussion
Mis- and disinformation are multifaceted phenomena, associated with technical, human and sociopolitical propensity factors. We will start the session with a brief overview of the multiplicity of such factors, and then dive into the sociopolitical ones. In particular, we will discuss the weaponization of disinformation by public authorities and the emergence of “technopopulism”, with a focus on the case of Bolsonaro’s Brazil. Since the 2018 presidential elections, the Brazilian public sphere has been prone to widespread mis- and disinformation. We will focus on campaigns that involve and/or have been endorsed by the highest levels of government, including the current president himself, ranging from the fires in the Amazon rainforest to Covid-19 and hydroxychloroquine. We will discuss tactics (e.g. WhatsApp), political goals and dangers ahead of the 2022 elections.
"’Do no harm’ in the algo age – the intersection of AI, peacebuilding and humanitarian expertise"
Justyna Nowak, Rakesh Bharania, Adrien Ogee;
Discussion
The potentials of emerging technologies and data science to advance peacebuilding and humanitarian action are immense, and various stakeholders have been turning to data and algorithms to advance their work. However, these methods and tools come with an extreme risk to both the privacy and lives of vulnerable populations if the data/AI is misused or used inappropriately. Although these risks exist across different contexts, the sensitive nature of conflict or violence affected areas uniquely exacerbates these challenges. In order to “do no harm,” we must be able to understand and tackle both technical and ethical issues of working with algorithms and vulnerable populations.
‘Do no harm’, preventing the outbreak, escalation, and continuation of conflict and harm while using new tools, is a huge task, and a more integrated, strategic, and coherent approach across different sectors and actors is needed to sustain peace and protect vulnerable populations. This session will offer a place for practitioners from both the AI and peacebuilding fields to discuss how existing ‘do no harm’ and ‘conflict sensitivity’ mechanisms can be “upgraded” to the algo age, to help the ethical programming of organizations planning to utilize AI and related technologies in their programs in conflict- and violence-affected countries.
"Education in a Mixed AI Reality: A Dialogues & Debates Panel"
Crystal Lee, Judith Okonkwo, Maize Longboat, Angela Chen;
Community Plenary
Upcoming generations are growing up in a mixed reality where they develop a powerful command of internet platforms. TikTok is a great manifestation of the nuances of existence through short, catchy, layered video content. Gaming is another space where the rules of the real world are interwoven with those of the game. What, then, is the role of education in this cross-existence: how do these interactions shape young folks in their interactions with and interrogations of technology?
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"Ethical Dilemma Cafe 2.0"
Sarah Allen, Ian Forrester, Jasmine Cox;
Workshop
We created the Ethical Dilemma Cafe for MozFest in 2014, where we explored the ethical dilemmas faced every day online but rarely imagined in our physical lives. These dilemmas manifested in a series of installations and workshops at the time.
Now in 2022, join us to re-explore and re-imagine these real-life dilemmas for today, 8 years later. In our session we will discuss the blend of online and offline data use, and how it can be used to surprise, provoke and pose real questions.
What assumptions does your data say about you?
Do you ever read those Terms and Conditions and the types of access you have just agreed to? Have you spared a thought how Big Tech profiles you from your online purchases or likes on a profile? Or even comprehended what advertisers predict the future you will be like?
Together we will create and map a series of ethical dilemmas we face in our everyday lives.
After the virtual MozFest, we are reopening Ethical Dilemma Cafe 2.0, live from Manchester, UK, so the dystopian views built during this virtual session will inspire the dilemmas and discussions we will present in Manchester.
"Exploring Trustworthy AI in Mozilla Data-powered Products"
Shai Waldrip, Rebecca BurWei, Chelsea Troy, Oluwatomi Ladipo, Kiandra Smith, Eva Andrews;
Extended Workshop
The session is a design thinking workshop on the criteria and considerations needed to develop Trustworthy AI. How might we provide convenient and easy-to-understand features that will inform users and other stakeholders of the use case of machine learning within Mozilla product offerings?
"Facial Recognition Technology and the Spectre of Police Bias in India"
Anushka Jain, Shivangi Narayan, Nayanatara, Nikita Sonavane;
Discussion
Facial recognition technology has been touted as the solution to crime in India with massive amounts of expenditure being incurred to procure this technology by State Police departments as well as the National Crime Records Bureau in India. However, is this technology capable of solving the historical issue of bias in policing? Experts have repeatedly cautioned against buying into the myth that Artificial Intelligence is free of bias. Research has shown that contrary to the opinion that Facial Recognition Technology will help in policing, due to its inaccuracies and inherent biases, it will lead to further discrimination against marginalised communities. This is especially so for those communities that have been historically over-policed and discriminated against.
"Fair Voice: What happens when your voice becomes your ID?"
Wiebke Hutiri, Lauriane, Michaela, Casandra;
Workshop
The session takes place in SPATIAL CHAT -> Session Specific Rooms -> Fair Voice
https://spatial.chat/s/mozilla-festival-2022?guest&room=8YoigrnYdDGinEHdU5Zp
Your voice contains a lot of personal information: your identity, age, sex, health, stress, emotion, ethnicity and intoxication are just some of the things that can be deduced from your speech signal, without even knowing the content of what you said. Speaker verification is a voice-based biometric technology that identifies speakers from their voice. It is used in voice assistants, call centers, and forensics, and relies on vast amounts of personal data to train advanced machine learning algorithms. There are many parallels between the history and technologies of face recognition and speaker verification. However, while bias and discrimination are recognised as significant challenges in face recognition, little attention has been paid to them in speaker verification. This is a problem, as speaker verification systems are increasingly used in sensitive domains (e.g., financial services, pensioner proof-of-life verification, health and elderly care).
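To make the decision step concrete: most modern speaker verification systems compare a stored enrollment embedding of a speaker's voice with the embedding of a new recording, and accept or reject based on a similarity threshold. The sketch below is a generic illustration with synthetic vectors, not the Fair EVA code; the embedding size and threshold are arbitrary assumptions.

```python
# Generic illustration of the verification step in speaker verification:
# cosine similarity between an enrolled voice embedding and a probe embedding,
# thresholded into an accept/reject decision. Real systems compute embeddings
# with a trained neural network; random vectors are used here purely as stand-ins.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
enrolled = rng.normal(size=192)                             # stored at enrollment
same_speaker = enrolled + rng.normal(scale=0.3, size=192)   # new recording, same voice
impostor = rng.normal(size=192)                             # recording from someone else

THRESHOLD = 0.7  # arbitrary; in practice tuned on evaluation data
for label, probe in [("same speaker", same_speaker), ("impostor", impostor)]:
    score = cosine_similarity(enrolled, probe)
    decision = "accept" if score >= THRESHOLD else "reject"
    print(f"{label}: score={score:.2f} -> {decision}")
```

Bias enters exactly at these points: if the embedding model and the threshold are tuned on unrepresentative speech data, error rates can differ sharply across groups of speakers.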
In this session we host a dialogue to introduce speaker verification and examine what is at stake when the technology is deployed at scale for voice-based authentication. We have invited panelists with diverse expertise in speech science, algorithmic activism and fairness, media and art to guide us through an interactive session with MozFest participants. The session is facilitated by the Fair EVA project of the Trustworthy AI working group. We will share what Fair EVA is doing to combat biased voice biometrics, and what you can do to contribute. Join live or watch the session later!
Panelists: Halsey Burgund, Johann Diedrick, Kathleen Siminyu, Wiebke Toussaint
"From Arts & Culture to AI and back: how can we build a two-way street?"
Randi Cecchine, Philo van Kemenade;
Discussion
From enhancing creative practices to making large cultural collections more accessible, Arts & Culture have a lot to gain from developments in Artificial Intelligence. But we often get so excited about new capabilities unleashed by AI techniques, that we forget about the important contributions artists and archivists make to the field.
What could a reciprocal relationship between Machine Learning and Cultural Heritage look like? How can those employing and applying AI technologies embrace the powerful perspectives offered by artists critiquing the status quo? What can machine learning engineers learn from the way museums, libraries and archives collect and maintain data?
This discussion will open up questions around the mutual benefits between arts & culture and AI development, raising questions not only about how galleries, libraries, archives and museums benefit from AI, but also how AI development benefits from artists and cultural heritage.
"From wooden frames to beekeeping robots - making layers of beekeeping technology accessible"
Gertje Petersen;
Workshop
As the only pollinator that can be managed at scale, honeybees play a vital role in sustainable food production. They are currently farmed in a wide range of systems, from traditional hollow logs to large-scale commercial beekeeping businesses in highly developed agricultural sectors. With increased public awareness of the role that insect pollinators play in food security, there has been a rise in novel technologies marketed to beekeepers, ranging from phone apps using machine vision for disease detection to beehive telemetry equipment and even full beekeeping robots.
However, even the most basic of beekeeping technologies, the wooden box hive with movable frames, is not present in all parts of the world, and both prices and return on investment for novel technologies severely limit their accessibility.
Limited accessibility of beekeeping technology creates barriers to the further development of beekeeping industries, especially in developing countries, which would greatly benefit from the opportunities created by establishing honey production sectors. In these environments, even the introduction of beehives with movable frames (which allow non-lethal honey harvest) can greatly improve the sustainability of the industry and the quality of the product. Beyond that, open-source digital tools could greatly improve productivity, sustainability and market access for honey producers and pollination providers alike.
"Future Focus: Money and Environment"
Irini Papadimitriou, Ismail Erturk;
Discussion
Future Focus: Money and Environment is a discussion and digital debates programme at FutureEverything bringing together scholars, artists, economists and other professionals in creative exchange, conversation and networking around topics related to economics, technology and environmental change, exploring how finance impacts the environment and how finance could be reimagined to contribute to the environmental and social agenda.
From examining the complex and sometimes unknown ways in which money and environment might be related throughout history, to opening up discussions around issues surrounding tax havens and offshore economies, algorithms, labour, opaque markets and climate change, we are investigating ideas and questions including whether money and finance can be green or regulated for environmental change, what strategies we can create for ecological good and what knowledge systems we might need and from/with whom.
Co-curated and facilitated by Ismail Ertürk and Irini Papadimitriou.
"Getting a grip on disinformation"
Mieke van Heesewijk, Stijn Peeters, Merel Borger, Erik van Zummeren;
Workshop
With the call 'Getting a grip on disinformation', SIDN Fund invited proposals for projects involving the development of tools and instruments for getting a grip on disinformation (deliberately misleading information distributed with a view to manipulating public opinion).
The coronavirus doesn't exist, 9/11 never happened, and Bill Gates wants to control all of us through a 5G cell tower. Disinformation, and especially its online scalability, may pose a threat to our open, democratic society. The concern is that the deliberate dissemination of misleading information feeds widespread mistrust: of the media, government and among citizens. The rise of disinformation is a potential threat in relation to a well-informed society, democratic processes, polarisation and healthy public debate.
It's important that we get a better grip on disinformation. We therefore called for proposals involving the development of tools and instruments that can help in that regard – by, for example, shedding light on questions such as:
How does the technology that amplifies opinions on platforms work?
Who is engaged in the production, dissemination and amplification of disinformation?
How do the algorithms used by the various platforms work, and who controls them?
What motivates certain groups of people to disseminate disinformation and how do they go about it?
What tools can internet users and journalists employ to improve their grip on the disinformation phenomenon?
In this session we will showcase a couple of funded projects and would like to discuss what else is needed to tackle this wicked problem.
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"Governing the data commons: from principles to practice"
Anouk Ruhaak, Greg Bloom;
Discussion
When a group of people decide to collectively govern the collection and use of a shared data resource, a data commons arises. Data commons exist in all shapes and sizes, ranging from collaborative mapping initiatives to health data pools and from open knowledge libraries to agricultural data coops.
But data commons tend to struggle with various challenges and dilemmas, from maintenance to accountability to conflict resolution; governance is hard!
Fortunately, there are known principles of “institutional design” – most prominently described by Elinor Ostrom in Governing the Commons – that can aid communities in the process of deciding how to share resources. In this session, members of a workgroup hosted by the Ostrom Workshop will introduce a draft of a guide to governing data commons, modelled on Elinor Ostrom’s framework for institutional analysis and development. We will engage participants in analysing this governance framework and together develop “Questions to Ask Frequently” that can support communities in developing governance methods that work.
"Hackers to Humanitarians, leverage your tech skills for social good!"
Cynthia Lo, Stefan Edwards, Bart Skorupa, Margaret Tucker, Poesy Chen, Rachel Wells, Ryan McHenry, Rhiana Spring;
Workshop
Join us for an interactive session on how to use your technical skills for social good! We'll hear from GitHub's Social Impact Skills-Based Volunteering Program Manager on how GitHub matches employees with social sector organizations to assist with projects, programs, and more through this program. Following the discussion, we'll be joined by a social sector organization to hear about their mission, and we'll help solve a project right then and there! Learn how to leverage your skills to make a positive contribution to the world.
"Hack the Planet: building real tech for the world’s languages"
Hillary Juma, Josh Meyer;
Extended Workshop
For those of us who use English regularly, voice technologies can be a daily part of our lives. We set timers with Alexa, ask Siri for the weather, and make calendar events with Google Home. For those of us living with vision impairments, voice technologies allow us to be a part of a connected, digital world. We might listen to Wikipedia articles read by synthetic voices or send emails via dictation. If you speak English, you might think that this technology is the same as other apps you use, like Twitter or Slack. But, there’s a fundamental difference. For most of the world, people don’t get to use their native language to connect to a digital world. There exists a very tangible gap between languages with and languages without resources to develop voice tech. This gap is a political and economic divide, but as voice tech becomes more democratized and more Open we have the opportunity to create wonderful and necessary applications in more and more languages. The onus is on us to take the step and make the tech.
We can’t sit back and wait for Big Tech to make apps for the rest of the world’s languages. Let’s Hack the Planet! During this workshop, participants will develop applications of their choice incorporating Speech-to-Text and Text-to-Speech for the language(s) of their choice, using Coqui’s existing voice tech which has been trained using data from the Common Voice project.
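As a taste of what the hands-on part might look like, here is a minimal sketch that wires Speech-to-Text and Text-to-Speech together in Python, assuming Coqui's stt and TTS packages and models trained on Common Voice data. The model paths and the model name are placeholders, and exact APIs can vary between package versions, so treat this as an illustration rather than the workshop's official starter code.

```python
# Hypothetical sketch: transcribe a spoken question and speak a reply,
# assuming Coqui STT (package "stt") and Coqui TTS (package "TTS").
# Model files and the TTS model name below are placeholders.
import wave
import numpy as np
import stt                   # Coqui STT Python bindings
from TTS.api import TTS      # Coqui TTS high-level API

def transcribe(wav_path: str, model_path: str) -> str:
    """Transcribe a 16 kHz, mono, 16-bit WAV file with a Coqui STT model."""
    model = stt.Model(model_path)
    with wave.open(wav_path, "rb") as w:
        audio = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    return model.stt(audio)

def speak(text: str, out_path: str = "reply.wav") -> None:
    """Synthesize speech to a WAV file; the model name is only an example."""
    tts = TTS(model_name="tts_models/en/ljspeech/tacotron2-DDC")
    tts.tts_to_file(text=text, file_path=out_path)

if __name__ == "__main__":
    heard = transcribe("question.wav", "model.tflite")  # placeholder files
    print("Heard:", heard)
    speak(f"You said: {heard}")
```

The same pattern applies to other languages: swap in an STT model and a TTS voice trained on that language's Common Voice data.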
"Health: The Most Important Tech Tool"
Cassandra Faris;
Discussion
Working in the tech industry often involves spending long hours sitting down, staring at a screen, consuming copious amounts of pizza and caffeine. The work is mentally demanding and can be stressful. In the rush to get everything done, it can be easy to neglect our health. But a healthy body and mind are necessary for effective performance. Based on HR training, research, and personal experience, this session provides realistic suggestions for managing your well-being at work. It covers the connection between physical and mental health, as well as how to discuss these topics with your employer. You’ll leave with a better idea of how to take care of yourself and be a happier, healthier, more productive person.
"How to enhance peer2peer learning with peer2peer payments?"
Philo van Kemenade;
Workshop
With technical, social and political systems changing faster and faster, lifelong continuous learning is more important than ever before. Learning with others can be great to pick up new skills, stay motivated and integrate practice into daily life. But are we optimally equipped for social learning in this day and age?
In this workshop we will consider how modern web infrastructure can equip us for meaningful peer2peer learning at scale. Particularly, we will look at Web Monetization, which is being proposed as a W3C standard at the Web Platform Incubator Community Group.
Most compensation of online content creators currently relies on privacy invading advertisement models. Web Monetization provides an alternative by enabling the streaming of money from web visitors to content creators via an open protocol.
Together we will explore how this model can support equitable social learning exchanges and offer new models for peer2peer learning on the web.
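To make the mechanism concrete, here is a toy sketch (an illustration under stated assumptions, not part of the session materials) of the declaration side of the Coil-era Web Monetization draft, in which a page advertises an Interledger payment pointer in a meta tag so that a provider can stream micropayments while the page is open; the payment pointer shown is a placeholder.

```python
# Toy sketch of the declaration side of Web Monetization (assumption: the
# Coil-era draft, where a page advertises an Interledger payment pointer in a
# <meta> tag; the pointer below is a placeholder, not a real wallet).
PAYMENT_POINTER = "$wallet.example/your-pointer"


def monetize(html: str, payment_pointer: str = PAYMENT_POINTER) -> str:
    """Insert a Web Monetization meta tag into the <head> of an HTML page."""
    tag = f'<meta name="monetization" content="{payment_pointer}">'
    return html.replace("<head>", f"<head>\n  {tag}", 1)


page = "<html><head><title>Peer learning</title></head><body>A lesson</body></html>"
print(monetize(page))
```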
"Ideas, Conversations and Community Building: Understanding technology and its impacts from an intersectional lens"
Shivangi, Varini G, Lian Joseph;
Social Moments
The purpose and ultimate goal of the session would be to provide a dynamic and interactive space that participants and attendees can use to network with each other and further their understanding of the role gender, race, sexual orientation, caste, geographical location and other social factors play in tech policy and the regulation of AI. robos of Tech Law and Policy (r-TLP), through its blog, has worked on identifying and foregrounding key issues at the intersection of tech and gender, and provides a platform for people of marginalised genders to publish their work and engage with others interested in this space. The discussion would help further the social movement around tech policy from both a global and an individual stakeholder perspective, and would aim to connect people who are interested or passionate about this space with like-minded individuals from diverse backgrounds and with their own enriching experiences, at MozFest and beyond.
"Images of Government: Representation and Bias in Image Search"
Jennifer Miller;
Workshop
This session will be a hands-on exploration and solution-generating session focused on political and ideological bias in image search. We will briefly introduce the topic with some examples of two problems: 1) partisan over-representation in searches related to topics like government, elections, and voting and 2) explicitly partisan images in searches for unrelated, non-political topics. Small groups of participants will then examine and discuss additional case studies in breakout rooms and share their thoughts on possible root causes and potential solutions. After sharing back these case studies, participants will help quantify the scope of the problem by conducting some standard searches for their local geographies and completing a table to summarize relevant findings. We will close by revisiting proposals for potential solutions and next steps.
"IMPAKT Presents: Futures of Control: AI in Criminal Investigation"
Rosa Wevers, Robert Glas, Lotte Houwing, ahnjili, Gerwin van Schie;
Discussion
An IMPAKT programme curated by Rosa Wevers.
From predictive policing systems to biometric detection software, AI systems are increasingly changing the field of crime investigation. Artificial intelligence does not only assist in tracing people who have committed a crime but is also used to predict where crimes are likely to occur. While appearing to be objective, AI systems have become instruments of power that use data from the past to influence the future. As researchers such as Ruha Benjamin have shown, such systems tend to function as ‘mirrors’ of our biased societies, reflecting existing inequalities and programming them into the future. Among other things, this results in the overpolicing of marginalised communities. Why are these systems so popular, despite the ongoing range of concerns and issues that come to light? What kind of future do they promise to provide? And how can these technologies, and the people who use them, be controlled and held accountable?
For this session, IMPAKT brings together experts and artists to discuss the political implications of predictive policing and biometric surveillance. We will explore how AI is used in crime investigation, how it impacts our freedoms and rights as citizens, and how AI can also be used as a tool to create awareness about inequality and fight police brutality.
Speakers:
- Gerwin van Schie, lecturer at Utrecht University
- Lotte Houwing, policy advisor at Bits of Freedom
- Ahnjili, data scientist, Ph.D. candidate, artist, and science communicator
- Robert Glas, artist
"Insights from the Accountability Case Labs project"
Borhane Blili-Hamelin, Jillian Powers, Mrin Bhattacharya;
Workshop
Accountability Case Labs is a MozFest Civil Society Actors for Trustworthy AI Working Group project. We are prototyping an open, cross-disciplinary community dedicated to understanding what makes building an ecosystem of accountability for AI so disorienting. During our prototyping phase (Nov 2021 to March 2022) we built our MVP of a case-study-based workshop on bias bounties (January 2022), and conducted qualitative research on the challenges faced by different actors and professionals in the AI accountability space.
The session will start with a short presentation about our open community and our research. From there, we will engage in collaborative activities (collaborative writing, breakout rooms, and discussions) to build shared insights on questions like: What makes AI accountability so disorienting? What is accountability for? What tools are available for greater accountability? Who does accountability affect? How can we better support the many different communities of social and technical actors, researchers, and builders involved in tackling algorithmic accountability?
Our open community is just getting started. This session will also be an opportunity for participants to shape where we go next, and to get involved!
"Kiswahili Common Voice Contribute-a-thon"
Britone Mwasaru, Hillary Juma, Abu, Dr. Lilian, Joan, Donald, Robert, Mark Irura Gachara;
Fringe Events
Thursday, March 3rd, 2022: 9am CET // 11am EAT
Mozilla Common Voice and MozFest are working to improve the representation of Kiswahili in Common Voice, an open, crowd-sourced database designed to help people create voice recognition applications that better understand a diverse set of languages and accents.
The goal of this weeklong contribute-a-thon will be to increase participation for varying dialects and accents of Kiswahili that are typically underrepresented in voice databases globally. Partners include: Devstoriesafrica, Hekaya Initiative, Utafiti Hub, Pwani Teknowgalz, Swahilipot Hub, GIZ FairForward, and KenCorpus.
"Low-resource languages, and their open source AI/ML solutions through a radical empathy lens"
Subhashish Panigrahi, Sailesh Patnaik;
Discussion
Since the inception of the internet 30 years ago, its open infrastructure has fueled the current growth of AI/ML. AI/ML, in turn, has been influential in diversifying the internet in terms of linguistic and other forms of digital access. The caveat, however, is that the AI/ML infrastructure is largely business driven as opposed to civil society driven. That is one of the key reasons why the majority of minoritized (indigenous, endangered and low-resource) languages are sidelined. In its current state the field has become a labyrinth: for anyone who wants to become a first-generation digital language activist, it is difficult to understand where to start.
This session features a MozFest Trustworthy AI working group project.
"Media literacy: interrogating a skills-based approach to tackling misinformation"
Valensiya Dresvyannikova, Jesus Lau, Lisa Janicke Hinchliffe, Anu Ojaranta, Oskar Laurin;
Discussion
One of the proposed ways of tackling the challenges of mis- and dis-information puts user competencies at the heart of the response – namely, by striving to equip them with the media and information literacy skills to navigate the digital information ecosystem more confidently.
This session aims to explore and facilitate a conversation around the practical implications and lessons learned from the implementation of such skills-based solutions. Both recent theory and practice around this raise important considerations: what skills-based solutions can be scalable and replicable? What effective ways are there to reach underserved user groups? How to address possible skills-motivation gaps – e.g. engaging users who normally would not be interested in a media literacy training, or avoiding an “overconfidence” user fallacy? And, crucially, how does a skills-based response compare to other approaches to mis- and dis-information enabled by today’s tech landscape (e.g. content moderation, flagging and labelling, etc.)?
Join this interactive dialogue with a small team of experts from academic, practice and policy backgrounds in media literacy - and share your questions, ideas or insights on ways to leverage media and information literacy against misinformation!
"Mitigating bias and discrimination in AI systems through design"
Ahmed Razek;
Extended Workshop
The Information Commissioner’s Office, the UK’s data protection regulator, is running a workshop to better understand the role of design in promoting fairness in the context of personal data and AI. Under the UK General Data Protection Regulation (UK GDPR) organisations must ensure AI systems that process personal data are fair and do not lead to unjust discrimination.
When discussing ways of mitigating bias and discrimination in AI, teams often consider technical solutions like addressing representation in training data sets or measuring model accuracy. Less is understood about how to design experiences and interactions between people and AI that prevent discriminatory outcomes. Designing interactions that better explain automated decisions or provide effective flagging and reporting mechanisms in interfaces could play a greater role in helping people understand AI systems and hold organisations that use them to account.
In the workshop, we will discuss examples of how using personal data and AI can lead to unfair outcomes, assess the role of design in identifying and mitigating these harms and share best-practice approaches for creating AI driven user experiences that are fair by design.
"Movement-building after MozFest session"
Chad Sansing, Dirk Slater;
Workshop
Join MozFest Facilitator coaches Chad Sansing and Dirk Slater to discuss how you can stay connected to one another and the amazing projects shared at MozFest 2022. Leave with clear next steps to take after MozFest to continue your engagement with like-minded participants dedicated to internet health and trustworthy AI.
"MozFest Science Fair 2022"
Temi Popo, Alexis Pauline Gumbs, Toshi Reagon;
Social Moments
** If there is an error message when joining the room, it is because we are at capacity. We encourage you to join one of the overflow rooms to watch the performance with other members of the community: **
- Overflow Room 1
- Overflow Room 2
- Overflow Room 3
This year, the MozFest Science Fair theme is inspired by Octavia E. Butler’s Parable of the Sower - An Opera. This marks the beginning of a two-year collaboration with composer and librettist Toshi Reagon. Drawing from the core thematic focus and narrative direction of this powerful book, we will translate and emulate these themes through projects, people and programs from across the internet health movement. The event will be held in Mozilla Hubs in order to build an experimental virtual and interactive imagining of the Parable narrative and world.
We will demo projects that align with the story, and present manifestations of the themes: sustainability, decolonization, planting seeds, and emergence.
Exhibiting projects include:
URSOR - A search engine and browser for kids,
Pollicy,
Internews ADAPT (Advocating for Data Accountability, Protection, and Transparency) Project,
Internet Freedom Foundation,
Climate Refugees,
Waste as a Public Good.
<img alt="MozFest Science Fair 2022 Image" src="https://pretalx.com/media/mozfest-2022/submissions/LH7PUP/MozFest_Screenshots_02_o0eGFhA.jpg" />
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"MozFest Studio - Daily Update"
Dzifa Kusenuh;
Community Plenary
The last MozFest Studio Daily Update! For the last time we dive into one of the nine spaces; this time it's Rethinking Power and Ethics with Borhane Blili-Hamelin.
Question of the Day | What is your main take away from MozFest 2022?
Community Spotlight | Rethinking Power and Ethics with Borhane Blili-Hamelin
MozFest Perspective | Odanga’s Twitter Research: “Disinformation on Twitter” with Odanga Madung
Important links from the broadcast:
• Naturescape Restorative Meditation 2: https://schedule.mozillafestival.org/session/FQXDQR-1
• Education in a Mixed AI Reality: A Dialogues & Debates Panel https://schedule.mozillafestival.org/session/YQL9CZ-1
• Zine Fair: https://schedule.mozillafestival.org/session/GTVWQG-1
• Science Fair: https://schedule.mozillafestival.org/session/LH7PUP-1
• Closing Circle: https://schedule.mozillafestival.org/session/B8XQUA-1
"MozFest Studio - Daily Update 1"
Dzifa Kusenuh;
Community Plenary
We’ve passed the halfway mark of MozFest. In this MozFest Studio we dive into the space Digitizing Cultures and Languages with Sadik Shahdu.
Question of the Day | Describe what the internet will be like in five years, 10 years, 50 years.
Community Spotlight | Digitizing Cultures and Languages with Sadik Shahdu
MozFest Perspective | Common Voice: “Hey Google, Can You Understand Me?” with Hillary Juma
Important links from the broadcast:
• MozFest Science Fair: https://schedule.mozillafestival.org/session/LH7PUP-1
• MozFest 2022 Collective Foto Album: https://www.kudoboard.com/boards/Qpe72ts5
• From Art to Culture to AI and back: https://schedule.mozillafestival.org/session/QEYH3B-1
• Mozilla Research Mixer: https://schedule.mozillafestival.org/session/FU8PVF-1
• Poetry slam: https://schedule.mozillafestival.org/session/CC9QYE-1
• AI Reform: A Dialogues & Debates Panel https://schedule.mozillafestival.org/session/3BXVCJ-1
"MozFest Studio - Daily Update 1"
Dzifa Kusenuh;
Community Plenary
Today’s MozFest Studio spotlights the incredible experiences happening in the Sustainability and Climate Change Space and shares what is happening in Mozilla’s Data Futures Lab.
Question of the Day | Imagine you just met someone who has never seen the internet. What is the first thing you would show them?
Community Spotlight | "Blockchain Fairy Tales: What if Happily Ever After is not Guaranteed?" with Facilitators Lance Weiler and Char Simpson
MozFest Perspective | "Mozilla's Data Futures Lab: PLACE” with Chelsea Eversmann
Important links from the broadcast:
• MozFest 2022 Collective Foto Album: https://www.kudoboard.com/boards/Qpe72ts5
• Opening Circle: https://schedule.mozillafestival.org/session/TBCGMH-1
• The Grand MozFest Monetization Experiment: https://schedule.mozillafestival.org/session/REXFA3-1
• AI Reckoning: A Dialogues and Debates Panel: https://schedule.mozillafestival.org/session/HS3GCX-1
"MozFest Studio - Daily Update 1"
Dzifa Kusenuh;
Community Plenary
It is day two of our festival, and during MozFest Studio we look at ‘Decolonizing AI Futures’ with Uffa Modey in our Community Spotlight and learn about Afro Algorithms.
Question of the Day | What ‘smart’ item will you never buy?
Community Spotlight | Decolonizing AI Futures with Uffa Modey.
MozFest Perspective | Creative Media Awardee: Afro Algorithms with Anatola Araba
Important links from the broadcast:
• Creating a safe kid-first model for the Internet: https://schedule.mozillafestival.org/session/9LTQUX-1
• Getting a grip on disinformation: https://schedule.mozillafestival.org/session/RA7V89-1
• Strengthening Data Ecosystems for Ed-tech: https://schedule.mozillafestival.org/session/LRXJAD-1
• Different Types of Thinking are Good and Important!: https://schedule.mozillafestival.org/session/T8779M-1
• Building Ancestral AI: A Dialogues & Debates Panel: https://schedule.mozillafestival.org/session/VBQ8ZN-1
"MozFest Studio - Daily Update 1"
Dzifa Kusenuh;
Community Plenary
The fifth MozFest Studio will focus on ‘Creating neuroverse wellbeing’. In the Community Spotlight we’ll discuss the topic with Julia Simon.
Question of the Day | If Trustworthy AI had a smell or a touch, what would it be?
Community Spotlight | Creating neuroverse wellbeing with Julia Simon
MozFest Perspective | AI in Health Care with Pierce Gooding
Important links from the broadcast:
• The challenge of online hate speech: Can AI really help?: https://schedule.mozillafestival.org/session/FJRZPV-1
• Dialogues and Debates: Venture Capital, Digital Rights, and the Future of Responsible Tech Investing: https://schedule.mozillafestival.org/session/WQ9KEC-1
• New Realities: https://schedule.mozillafestival.org/session/ZE8SGX-1
• Mozilla Plenary: metaverse or Metaverse™?: https://schedule.mozillafestival.org/session/GYUL9J-1
"MozFest Studio - Daily Update 2"
Dzifa Kusenuh;
Community Plenary
Today’s MozFest Studio spotlights the incredible experiences happening in the Youth and Futures Space and shares all about the Internet Health Report.
Question of the Day | What does a “healthy internet” mean to you?
Community Spotlight | Youth and Futures with wranglers Joseph Tomas and Dervla O’Brien
MozFest Perspective | “Internet Health Report” with Eeva Moore
Important links from the broadcast:
• Family Resources: https://www.mozillafestival.org/en/family-resources/
• Lightning Talks: https://schedule.mozillafestival.org/lightning-talks
• Meaningful AI transparency for builders: https://schedule.mozillafestival.org/session/S8ZHJL-1
• Use data science to defend rhinos from poachers: https://schedule.mozillafestival.org/session/J7QQ83-1
• The Grand MozFest Web Monetization Experiment: https://schedule.mozillafestival.org/session/REXFA3-1
• <AI & Equality> A Human Rights Toolbox: https://schedule.mozillafestival.org/session/BN7NPN-1
• Creating Wellness Breaks in Your Online Work and Learning Day 1: Relaxing and Healthy Breaks: https://schedule.mozillafestival.org/session/WFZSSM-1
• Youth & Futures Hackathon kick-off: https://schedule.mozillafestival.org/session/A8W7DH-1
"MozFest Studio - Daily Update 2"
Dzifa Kusenuh;
Community Plenary
It is day three of our festival and during MozFest Studio we look at ‘Gender Tech and Intersectionality’ with Lade Ganikale and Sapni G. K. in our Community Spotlight.
Question of the Day | Which movement is missing in the conversation about technology/internet?
Community Spotlight | ‘Gender Tech and Intersectionality’ with Lade Ganikale and Sapni G. K.
MozFest Perspective | Perfect Match At A Price with Kevin Zawacki
Important links from the broadcast:
• Lightning Talks: https://schedule.mozillafestival.org/lightning-talks
• Mindful Art: https://schedule.mozillafestival.org/session/CNZQ8F-1
• Drag Queen Bingo: https://schedule.mozillafestival.org/session/EKYYY8-1
• Infrastructure of care and affection: https://schedule.mozillafestival.org/session/SALQCT-1
• Ecological Imperatives & the Decentralized Web https://schedule.mozillafestival.org/session/QVKYWF-1
"MozFest Studio - Daily Update 2"
Dzifa Kusenuh;
Community Plenary
Today’s MozFest Studio spotlights the incredible experiences happening in the Rethinking Power & Ethics Space and shares all about Responsible Computing.
Question of the Day | What is your favorite session you’ve attended so far?
Community Spotlight | Rethinking Power & Ethics with Irini Papadimitriou
MozFest Perspective | Responsible Computing with Crystal Lee
Important links from the broadcast:
• Misinfocon: https://schedule.mozillafestival.org/misinfocon
• Venture Capital, Digital Rights, and the Future of Responsible Tech Investing: A Dialogues & Debates Panel: https://schedule.mozillafestival.org/session/WQ9KEC-1
• Building Healthy Online Communities - the Creators' Challenge: https://schedule.mozillafestival.org/session/ZETHB7-1
• Tales from the magic forest: https://schedule.mozillafestival.org/session/J8GQPK-1
• Blockchain Fairy Tales: What if Happily Ever After is not Guaranteed?: https://schedule.mozillafestival.org/session/UPGZHH-1
"MozFest Studio - Daily Update 2"
Dzifa Kusenuh;
Community Plenary
Curious to know about Privacy and Digital ID? In this MozFest Studio we have a talk with Lewis Munyi in the Community Spotlight, and Jen Caltrider talks about the Privacy Not Included Buyers Guide.
Question of the Day | Do you use a password manager? Which one? (Leave a comment.)
Community Spotlight | Privacy and Digital ID with Lewis Munyi
MozFest Perspective | Privacy Not Included Buyers Guide: “Privacy Not Included” with Jen Caltrider
Important links from the broadcast:
• Web Monetization Experiment: https://schedule.mozillafestival.org/schedule?query=experiment
• Failing Successfully: https://schedule.mozillafestival.org/session/9ZS3PE-1
• Future is Intersectional Conversation: https://schedule.mozillafestival.org/session/ULM7BC-1
• Social Media Platform Accountability: https://schedule.mozillafestival.org/session/X99HLX-1
"MozFest Zine Fair!"
Kiwako Sakamoto, Zannah Marsh;
Social Moments
Join us for our first-ever Zine Fair, held in the interactive, social platform Spatial Chat. Zine makers from around the world are participating, sharing zines on topics as varied as DNA testing and databases, why Wikipedia matters, and coding intersectional AI. Join us to browse all the zines, chat with those zine makers who are present, and hear talks by some of the makers. Or just relax in our zine oasis and listen to chill music prepared for you by our DJ.
For details, please see our blog post :)
Participating zine makers from MozFest Zine Exhibit include:
Estudio Repisa / Sandra Marín, Hyperlink Press & Yellow Pearls Zine, internet teapot, internet teapot & Nushin Isabelle Yazdani, Kelly Wagman & Nicole Wagman, Khushbu Kshirsagar, Lilian Abou Zeki, Leigh Montavon & Sarah Ciston, Sarah Ciston/Creative Code Collective, Sarah Mirk, Sugimotogu, Tactical Tech, WWW.THECOPYRIOTS.COM/ F vandenBoom, Zara Rahman & Jason Li, Sarah Kiden, Jaewon Son & Georgine Obwana, Tiny Tech Zines/Jamie Renee Williams, Tyler Yin, Rachel Simanjuntak, Khadijah Williams, C.X. Hua, Sophie Wang, ann haeyoung, LaToya Strong, Shane Jones, Sarah Sao Mai Habib, and Cella Monet Sum.
"Mozilla Plenary: metaverse or Metaverse™?"
Imo Udom, J. Bob Alotta, Rebecca Ryakitimbo, Apryl Williams;
Community Plenary
As the internet evolves, what will it look like — and who will it benefit? Will it be a Metaverse™, owned and built by only a few and perpetuating the ills of today’s internet? Will it be a metaverse, built and governed collectively? Or, is the premise of a metaverse inherently flawed — a shiny trope that distracts us from confronting the colonial, extractive nature that pervades so many of our digital technologies?
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"Mozilla Research Mixer"
Eeva Moore;
Social Moments
Mozilla researchers are covering a lot of ground. They’ve discovered partisan influencers evading TikTok’s ad policies, shone a light on YouTube’s secretive algorithms, reviewed the privacy policies of connected devices, identified alternative data governance models, shared stories of AI around the world, and so much more. Join Mozilla researchers to learn about their work and how you can get involved.
The link to Mozilla Hubs for this session will appear on the right side of this page 15 minutes before the session is due to begin.
"Mozilla Senior Fellows Panel"
Amy Schapiro Raikar;
Discussion
Join the conversation and learn about the new cohort of Mozilla Senior Fellows.
"Mystique Afrique Amsterdam @ MozFest"
Temi Popo, Lys Mehouloko, Mystique Afrique;
Social Moments
Mystique Afrique explores Blackness in global cities through art, community, and culture. It is a multi-sensorial, multimedia experience that celebrates the African Diaspora with curated conversations, food, music, fashion, and dance. This year's theme is Ancestral Intelligence: Building in the Black Box. Join us in the metaverse on March 10th for an experience like no other!
<img alt="Mystique Afrique Amsterdam @ MozFest Image" src="https://pretalx.com/media/mozfest-2022/submissions/DKDGAL/Mystique_Promo_Banner_2_Muypstu.png" />
"Naturescape Restorative Meditation"
Natalie Matias;
Social Moments
Immerse yourself in a restorative 30-minute meditation session guided by Global Mindfulness and Meditation Coach, Natalie Matias as a small retreat in your day for self-care. Discover the restorative nature within us through an exploration of nature soundscapes and a mindful meditation through sight, sound, and our breath.
"Naturescape Restorative Meditation 2"
Natalie Matias;
Social Moments
Immerse yourself in a restorative 30-minute meditation session guided by Global Mindfulness and Meditation Coach, Natalie Matias as a small retreat in your day for self-care. Discover the restorative nature within us through an exploration of nature soundscapes and a mindful meditation through sight, sound, and our breath.
"Navegación anónima con Tails"
Luis Fernando Arias;
Extended Workshop
The right to privacy makes browsing the internet safer, but it is very hard to browse the internet without leaving traces: most operating systems store information that makes us identifiable, websites use trackers, our internet providers can see which sites we visit, and governments can impose blocks at critical moments.
To avoid this we can use Tails. In this workshop we will look at what it is, how to install it, and what advantages it gives us for avoiding censorship and surveillance.
"OnBoardXR: Live Performance in Mozilla Hubs"
brendanAbradley;
Social Moments
We will present 6 world premiere live performance prototypes in Mozilla Hubs at Mozilla Festival 2022! The hour-long performance is limited to 10 Active/Avatar Participants, 10 Ghost Participants and unlimited livestream viewers.
Watch Recording of Live Performance here: https://youtu.be/gn098KXnMM0?t=1557
More showtimes are available for purchase:
https://www.eventbrite.com/e/onboardxr-4-port-of-registry-tickets-272469743337
Our Theater Lobby will be an integrated part of the Mozilla Festival's virtual MAIN HALL and can be experienced at any time to learn about the performances and installations. Please enter the Hubs Main Hall and make a left at the end of the row of cafe tables to find a persistent link to our port of entry.
More information at http://theaterfestival.online
"Open History Map"
Marco Montanari;
Discussion
OpenHistoryMap is an Italian non-profit association (APS) founded by a team of archaeologists and software developers based in Bologna, Italy. Our aim is to create a web-GIS platform containing spatial historical and archaeological data and to build an ecosystem for historical information to be gathered and exposed. Since 2016 we have been defining standards, publishing (wherever possible Open Access) papers and developing open source applications for the creation of the ecosystem. In the last two years the work has become more and more structured in order to define several instruments that are part of the infrastructure: the map, the data, and several specific uses of the data itself are collected in our system, where users can explore the past and the evolution of mankind. Our map is at https://map.openhistorymap.org/ and the meta-description of the data stored in the system is at https://index.openhistorymap.org/. We are working heavily towards the definition of tools to create structured datasets based on Digital Humanities Semantic Formats, enabling digital historians to prepare common open packages to be reused in various situations.
"Opening Circle"
Sarah Allen, Toshi Reagon, Cassandra Faris, Ayesha Abduljalil عائشة عبدالجليل, ليليان أبوزكي Lilian Abou Zeki, J. Bob Alotta;
Community Plenary
Our opening circle officially kicks off MozFest.
Join us as we open our 5 day festival with Mozilla, hear from the community on why MozFest is so important as a place to connect and collaborate, and what you need to prepare to actively participate according to your own schedule and needs while joining from home.
As is MozFest tradition, you will have the opportunity to participate in our MozFest group photo!
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"Orbits: survivor-centric, trauma-informed and intersectional interventions to tech abuse"
Naomi Alexander Naidoo;
Discussion
Orbits is a field guide to intersectional, survivor-centric and trauma-informed interventions to technology-facilitated gender-based violence. Produced over the course of 2020 by Chayn and End Cyber Abuse, the guide focuses on interventions in research, technology design and policy.
At MozFest 2020, we ran a workshop to generate ideas for the guide, and since then we've developed and co-created the guide with input from practitioners and experts from around the world. In this session, we'll share the final guide and lead a discussion on how we can implement its suggestions to end tech-facilitated gender-based violence.
"Q&A and Viewing New film You are Your profile - Personal considerations"
Jake Blok, Jerry de Mars;
Discussion
Q&A with filmmaker Jerry de Mars and/or team member.
Synopsis of film
The film ‘You are Your profile - Personal considerations when online’ (YaYp) is an exploration of the considerations people make when online. In this film a number of personal stories of young people are presented. The notion is that we are all active online in today’s increasingly connected society. However, are we aware of the considerations we make when interacting with others and logging on to online platforms? How do we deal with the wrongdoings that might happen to us online? And are we aware of what is actually happening with our personal data, and can we really control what happens to it?
This film is part of a growing documentary project focused on the importance of being able to control your own personal data online. Every year a new theme is chosen; the 2021 theme is ‘personal considerations when online’. Gradually, almost every person's life has become connected to the Internet, which they use daily. But how aware are we of the personal data we share when we are connected to the Internet? This can have safety implications and can harm our wellbeing and dignity. The film aims to surface answers and to openly debate the future of our lives in an online connected society.
"Reclaiming AI Futures - Interaction"
divij;
Workshop
This session will be a space for interacting with the contributors to the art exhibition 'Reclaiming AI Futures', curated by Mozilla Fellow Divij Joshi. The contributors and the project curators will discuss their work, the larger ideas motivating the project, and ways to build on and take forward the ideas of reclaiming AI futures.
"Reimagining Consent: A Discussion & Gallery Exhibition"
Georgia Bullen;
Discussion
Every day we make decisions about platforms and tools, making choices about how we consent to our data and information being used by third parties. New privacy regulations in various jurisdictions have had a drastic impact on that consent experience — some positive, some negative. We will host a discussion and virtual gallery to showcase examples of consent user experience from around the world — both current realities (good + bad) and exploratory ideas and concepts. See what’s been submitted to the Consent Gallery (https://airtable.com/shrZSumhBM6DSyILb/tblkHBbRqiAaV22gs) so far and submit something today (https://airtable.com/shr3xzoqGYuVgg42n) to include in the discussion! Need inspiration or want more info? View the guidelines for submissions (https://simplysecure.org/blog/open-call-submissions-for-reimagining-digital-consent-a-discussion-gallery-exhibition-at-mozfest-2022/).
"Rethinking the Digital Space (Metaverse) through a Non-Digital Means: Philosophy"
Stella Teoh, Malaysian Philosophy Society;
Workshop
Decide how life should be in the Metaverse before the Metaverse decides for you.
"Metaverse" is a buzzword you have probably heard by now. However, with all the hype, would the Metaverse really turn out to be what we are expecting? With the possibility of living whatever kind of life we want in the Metaverse, would the experience in Metaverse be more inclusive and comparatively better than the physical world? Or would it turn into a space that breeds more sexism, racism, inequalities and other social illnesses with the stripping of morality and boundaries?
As citizens of the new world of the internet, be among the first to discuss the many important questions that surround the Metaverse before we are drowned by the tides of change. We will discuss the nature of this new reality, digital justice and morality, as well as the changes in modes of living and knowing the Metaverse can bring.
Join us, be ready for this new technological evolution with an instrument that has persisted even before the technological age, philosophy.
"Revitalization of the endangered indigenous African languages"
Olushola Olaniyan, Satdeep Gill (WMF), Amrit Sufi;
Discussion
The impact of colonization in Africa has played a major role in determining the vitality and endangerment of many of the continent's indigenous languages.
Language is one of the core values of a people and the identity of any nation. The United Nations has established that “every two weeks, at least one indigenous language vanishes, leading to two language extinctions each month”.
This panel intends to gather language enthusiasts from across the movement to brainstorm on the role of Wikipedia and its sister projects in supporting various activities aimed at preserving indigenous languages.
"Rhizome of Babel Workshop"
Kosisochukwu Nnebe, Lucas;
Extended Workshop
The Rhizome of Babel is an interactive digital platform that re-examines the biblical tale of the Tower of Babel from an anti-imperialist and ecological perspective. Where the original story positions a unified language as that which enables humankind to breach into the heavens, this platform explores an alternative telling that sees in the tale evidence of human hubris and parallels to imperialist conquest and epistemicide (the destruction of indigenous knowledge systems and languages). Rather than punishment, the installation envisions the abundance and diversity of human languages – an ecosystem in its own right – as an alternative vision of prosperity that privileges biodiversity over monoculture, opacity over transparency, relationality over hierarchy. Rather than a tower, what is aspired to here is a rhizomatic network that stretches and expands sideways, that embraces and respects incommensurability, that asks that we develop new ways of understanding one another.
The platform will be produced in three stages: the creation of a micro platform to collect audio submissions, a co-creation workshop with participants, digital theorists and creative technologists during MozFest 2022, and finally the development of an interactive audio-visual environment built using Mozilla Hubs.
"Screening of “Courage جراه”"
Yasmine Rifaii, Caroline Creton, Dayna Ash;
Social Moments
Courage is an experimental short film comprised of eight (8) acts written by Dayna Ash, and directed by Malak Mroueh that tackles identity, freedom, and queer being in the Middle East, where the dominating discourse aims to dim and erase their existence.
The "closet" is portrayed as a labyrinth of societal norms and traditions that bodies must navigate to survive, rendering coming out and the spectrum in-between as equally courageous and bold acts. The production of such a film in the Middle East is in and of itself an act of resistance. With the social and systematic persecution of LGBTQI individuals, the variables of coming out are not simply based on the acceptance of the family and friends; but are accompanied by fear of intolerance and violence manifested through formal authorities and informal traditional norms.
The dominating discourse is one that promotes coming out of the closet as the desired goal for LGBTQI individuals, to the exclusion of those who, through a courageous act themselves, do not come out. Their life is not framed by the reality of a closet; instead, they use other cultural frameworks to construct their queer reality. Courage engages with what it means to live a queer existence in the Middle East.
"So you've started a data commons. Now what?"
Anouk Ruhaak, Emily Jacobi, Hays Witt, Erik Forman;
Discussion
In this session we will discuss the intricacies of data governance within data commons and cooperatives. What happens when groups of people come together to collectively govern their data? How do they make decisions about that data? And how do they decide who has a seat at the table? We will explore the obstacles and questions facing such organisations, as well as the approaches they have found to tackle them.
You will hear from Mozilla's Data Futures Lab grantees - Digital Democracy, Driver's Coop and Driver's Seat - who will share their experience building data governance models in the wild.
"Spanglish taller de fanzines / Spanglish zine-making workshop"
Sarah Mirk, Sandra Marín;
Fringe Events
In this hands-on bilingual Spanglish zine workshop led by Sandra Marín (Chile/Argentina) and Sarah Mirk (U.S.), participants will learn about the power of zines for self-expression. We will look at zines from around the world and learn how to make our own. Each participant will draw their own mini-zine, which can be shared online or shared at MozFest Zine Fair & Exhibition.
En este taller bilingüe de zine dirigido por Sandra Marín (Chile/Argentina) y Sarah Mirk (EE.UU.), los participantes aprenderán sobre el poder de los zines para la autoexpresión. Veremos zines de todo el mundo y aprenderemos a hacer nuestros propios. Cada participante dibujará su propio mini-zine, que puede ser compartido en línea o compartido en la feria virtual zine de Mozilla Festival.
Note: We are also going to make a Google form that participants can fill out to upload a photo (or scan) of one page of their zine. Sandra and Sarah will take these pages and make them into a collective zine, which will be available as a PDF and also mailed out in print to anyone who shares their address.
"Spelman College Presents: An Intersectional Discussion on AI Mediated Microaggressions"
Jaye Nias, Princess Sampson, Jainaba Seckan, Ratziel Ogburu-Ogbonnaya;
Discussion
In this sibling circle for women or non-binary people of the African Diaspora, we will deep dive into a riveting Black Paper authored by a research team at Spelman College. There has been previous scholarship that highlights the impacts of microaggressions felt by Black women. The presence of African-American women as situated in the US requires navigation of many oppressive constructs rooted in both systemic racism and gender bias.
As much of our world shifts interactions, both social and transactional, to technology mediated engagements, algorithmic and automated systems are largely unavoidable as gateways to economic and social capital. Many of these systems often exacerbate or further perpetuate biases prevalent in our society.
The work of this group seeks to identify and explore how automated biases create microaggressions that adversely impact the quality of the lives of Black women.
We invite everyone to read the Black Paper, but are asking that only those who identify as women or non-binary people of Africa and the Diaspora join this sibling circle.
"Strengthening Data Ecosystems for Ed-tech"
Niyoshi, Babitha, Zeina Abi Assy;
Workshop
Some aspects of education as we know it have changed, been left behind, or improved with new technologies, and folded within this change is a big question: what happens to a learner’s data? If you are a parent, educator, academic, administrator, reformer or simply an engaged adult, do join our workshop. We welcome people from diverse backgrounds to build a multidimensional perspective on data governance in education.
Through this session we will use creative exercises to start a conversation on e-learning in schools; data stewardship; students’ data privacy; informed consent and more. Where do we currently stand? What are our hopes and concerns for the future of education, and educational data? How might we bring our preferred futures into being?
We have another workshop on the same topic for young people too. You can access its details here: https://schedule.mozillafestival.org/session/NPQCQL-1
"Teaching Responsible Computing, an Interdisciplinary Approach"
Kathy Pham;
Discussion
Slide Deck with Resource Links: https://docs.google.com/presentation/d/1f4na0bI48ZIIru5ZBz40MAUuRQ9Uod9DfPqX0kBoT3E/edit?usp=sharing
Ethics and social responsibility are topics that have existed in computing curricula for decades. These topics are even more critical in computing as technology becomes more ubiquitous throughout society and in individual lives. Mozilla’s Responsible Computer Science Challenge team and funding partners believe computing programs must collaborate with experts across disciplines to thoroughly and thoughtfully integrate ethics and responsibility in curricula. https://foundation.mozilla.org/en/what-we-fund/awards/responsible-computer-science-challenge/
We invite all who are interested in integrating ethics and responsibility in computing curricula with a focus on cross-discipline and honoring different expertise. Grantees from Mozilla’s Responsible Computer Science Challenge will showcase the interdisciplinary approaches to curricula in their own computing programs that they have developed in the last three years. Session participants will be invited to share their own experiences with teaching responsible computing and will learn about opportunities to join our global Community of Practice (https://foundation.mozilla.org/en/what-we-fund/awards/responsible-computer-science-challenge/resources/) and to contribute to the Teaching Responsible Computing Playbook (https://foundation.mozilla.org/en/what-we-fund/awards/teaching-responsible-computing-playbook/).
Presenters:
Irina Raicu (Santa Clara University)
Kimberly Boulden, Kenneth Joseph, Dalia Muller, Atri Rudra (University at Buffalo)
Ron Cytron (Washington University in Saint Louis)
William Cochran, Jenna Donohue, Trystan Goetze (Harvard University)
Cathryn Carson (University of California, Berkeley)
Crystal Lee, Kathy Pham (Mozilla)
"Tech Funding Landscape"
Esra'a Al Shafei;
Discussion
Hear from funders on how they are supporting projects and trends that will impact the community in the future. In addition, hear about the current state of the Internet Freedom funding ecosystem, and what they are working on to improve the sustainability of the space in the long term. Speakers: Laura Cunningham (Open Technology Fund), J. Bob Alotta (Mozilla Foundation), Jac sm Kee (Numun Fund)
"The Disinformation Ecosystem in 2022: New, Authentic, and Recurring Actors"
Rachael Levy, Ari Ben Ami;
MisinfoCon Discussion
What does the disinformation ecosystem of 2022 look like?
- There are many new actors engaged in disinformation campaigns: domestic political actors, ideologically motivated activists (from hacktivists, terrorist organizations, and religious sects), and individuals motivated by financial gain.
- Inauthentic activity is increasingly being orchestrated by authentic accounts. Information operations are often synonymous with botnets and fake accounts, but as social media platforms become more adept at identifying and actioning these networks, the nature of information operations is evolving and becoming increasingly complex, utilizing authentic actors and/or deep avatar accounts.
- Both new actors and authentic accounts are recurring and prolific. Whether it's Russia, China or Brazil, each geography has a set of prolific and recurring threat actors. Who are they? Who stands behind them? And what is their modus operandi? Often, these actors are not bound by their geographies; they overlap and run parallel operations, making it even more difficult to pinpoint attribution.
In this session, we'll discuss the importance of the burgeoning threat actor ecosystem, the key prolific disinformation actors, and how they operate. Join ActiveFence's Head of Geopolitical Risk, Rachael Levy, to learn what you can do to counter these threats to information integrity.
"The Future is Intersectional Conversation: Future of Media & Technology"
Cheryl Finley, Kamal Sinclair;
Community Plenary
Join a conversation between Cheryl Finley and Kamal Sinclair about the future of media and technology and who holds the power to shift the cultural landscape. The conversation is a continuation of The Future is Intersectional series in partnership with Spelman College.
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"The Grand MozFest Web Monetization Experiment - an Introduction"
Erica Hargreave, Chris Lawrence;
Workshop
Welcome to Part 1 of our Series on 'The Grand MozFest Web Monetization Experiment'.
Join us in learning about and discovering how you can participate in the Grand MozFest Web Monetization Experiment at MozFest 2022. In this introductory session to the experiment, we will explore what the Web Monetization Standard is, the philosophies behind it in building towards a more equitable web, the basics to how it works, the opportunities it presents and current challenges that it faces.
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"The Grand MozFest Web Monetization Experiment - Creating an Action Plan"
Erica Hargreave, Chris Lawrence;
Workshop
Welcome to Part 3 of our Series on 'The Grand MozFest Web Monetization Experiment'.
As the Grand MozFest Web Monetization Experiment gets underway, we invite you to discover how creatives are Web Monetizing their resources and assets through a variety of case studies. Then join us in brainstorming and troubleshooting as you and your fellow attendees begin to create your own action plan for Web Monetizing your MozFest resources and assets to support an internet health initiative of your choosing. We will then show you where to post your plans, in order to have your MozFest resources and assets added to the MozFest Web Monetized Galleries of work.
"The Grand MozFest Web Monetization Experiment - Creating an Action Plan"
Erica Hargreave;
Fringe Events
Welcome to Part 3 of our Series on 'The Grand MozFest Web Monetization Experiment'.
As the Grand MozFest Web Monetization Experiment gets underway, we invite you to discover how creatives are Web Monetizing their resources and assets through a variety of case studies. Then join us in brainstorming and troubleshooting as you and your fellow attendees begin to create your own action plan for Web Monetizing your MozFest resources and assets to support an internet health initiative of your choosing. We will then show you where to post your plans, in order to have your MozFest resources and assets added to the MozFest Web Monetized Galleries of work.
"The Grand MozFest Web Monetization Experiment - Getting Started"
Erica Hargreave, Chris Lawrence;
Workshop
Welcome to Part 2 of our Series on 'The Grand MozFest Web Monetization Experiment'.
In this session, we will help you get started on Web Monetizing your MozFest resources and assets, enabling you to take part in the Grand MozFest Web Monetization Experiment at MozFest 2022 and support an internet health initiative of your choosing. Discover how to get your pre-paid Coil account with free MozFest tips to give out, and learn to Web Monetize your MozFest resources and assets including websites, videos, forum discussions, events, shared workspaces, games, and more, with payment pointers directed to internet health initiatives of your choosing.
"The great importance of design for alternative technology"
Mieke van Heesewijk, Martijn Dekker, Ted, Paul Francissen, Matthijs Hoekstra, Rogier Klomp;
Discussion
Alternative digital tools are not known for their user-friendliness. Although the open source aspect is often considered more important, design does play a role in accessibility for new users. How do you lower the threshold for the general public, or for specific target groups?
In the commercial world there is the urgency to make money. This ensures that the design fits well with the end user. In the world of alternative technology, people often think only in terms of technical possibilities, not design. The two communities often simply do not find each other. The discussion is important: even when the designer wants something that may not be technically possible, it can give the technician a different view.
The design knowledge of alternative communities mainly lies with programmers who design a bit themselves. It's important for designers to feel more attracted to participating in the development phase.
In the call 'Developer meets designer' last year, together with Cultuur Eindhoven, SIDN Fund created a call for proposals for designers who want to use their expertise to increase the impact of promising - but mainly technically oriented - projects. The process resulted in eight successful matches. Developers and designers have worked together on the project plans to improve the design, user-friendliness and/or accessibility of the projects.
During this workshop we would like to facilitate a discussion between developers and designers.
As inspiration, a few projects will show how this collaboration between developers and designers took place and what it resulted in.
"The Green Edge"
Elena Poughia, Christian Buggedei;
Discussion
Data has been described as the new oil. In fact, it’s more like the new oil spill. The overwhelming percentage of data is unusable, or “dark data”. These information assets are collected, stored, and processed for regular business activities, but at great expense, and often at greater risk than value. Every year the amount of data increases exponentially and is typically measured in zettabytes. To put this into perspective, a zettabyte is a unit of information equal to one sextillion bytes, or a billion terabytes. That’s a whole lotta bytes. A more concrete way of putting this would be to think of Netflix: if you can imagine all the movies, documentaries, and TV shows on that platform - then multiply it by a thousand - you wouldn’t even begin to approach the annual onslaught of data we’re facing. Now add the amount of energy it takes to store this information tsunami on central servers. The numbers here are equally startling. And what do you get from this steaming heap? Remember, the majority of this is dark data: useless information stored at great expense with energy that could be saved or put to more sustainable use. At polypoly [cf. https://polypoly.coop/] we believe we've come up with a solution: storing and processing our personal data where it originates, on our end devices. This talk will detail how polypoly is developing its Edge technology and data cooperatives with a wise eye to sustainability.
"The Impromptu Poetry Slam: MozFest Edition"
Hillary Juma;
Social Moments
The Impromptu Poetry Slam provides space for you to reflect on your experiences at MozFest. The session will guide you through poetry making with the use of prompts. You can optionally perform or submit your own creations to the Common Voice Sentence Collector. Common Voice is an initiative to make voice technology better understand languages and accents.
"The intersectionality of AI and LGBTQIA+ advocacy in the MEA region"
Hatem Haddad, Khawla;
Workshop
With the rise of social media, citizens of the MEA region started using the internet as a way to amplify their voices, discussing diverse topics and defending the causes they believe in. In doing so, they have broken taboos and claimed a virtual space in which to exist despite their differences. In Tunisia, the LGBTQIA+ movement and organizations utilized this boom to thrive. However, Hate Speech against this community arose in parallel on different social media platforms.
On the most commonly used social media platforms, the surge of cyberviolence against the LGBTQIA+ community in the MEA region has become so common that, more often than not, it translates to actions on the ground. Local dialects, which automated tools fail to detect, are used to target Queer people and activists; attacks range from those against gay and lesbian people to transgender individuals and anyone with a non-normative identity. Consequently, manual identification of Hate Speech on social media has become impossible, making the need for AI-based automatic identification assistance vital.
iCompass, an AI startup, and “the Initiative Mawjoudin We Exist for Equality”, an LGBTQIA+ NGO, both Tunisia-based, propose to lead a workshop on how to build an automatic system based on AI, ranging from collecting high quality data and methodologies to annotate the data, to training AI models.
This workshop will be focused on dialects in the MEA region, and more specifically the Tunisian dialect as a use case. The aim is to bring attention to the lack of dialect-based tools that automatically detect Hate Speech in addition to developing community-adequate solutions.
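For readers unfamiliar with what "training AI models" on such data involves, the sketch below shows a deliberately simple baseline text classifier using scikit-learn with placeholder examples; it is an assumption-laden illustration, not the pipeline iCompass and Mawjoudin will present, which involves dialect-specific data collection, annotation guidelines, and stronger models.

```python
# Minimal baseline sketch of a hate-speech classifier (assumptions: a small,
# annotated corpus of dialect text with binary labels; the texts below are
# placeholders, and real systems need far more data and careful annotation).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "example of a hateful post",       # 1 = hateful
    "example of an everyday post",     # 0 = not hateful
    "another targeted, abusive post",  # 1
    "a friendly conversation",         # 0
]
labels = [1, 0, 1, 0]

# Character n-grams tend to cope better than word tokens with the spelling
# variation typical of written dialects.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

print(model.predict(["a new post to moderate"]))  # array of 0/1 labels
```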
"The Metaverse vs the Publicservice internet"
Ian Forrester;
Discussion
This is an open discussion about the public values of new technologies like the potential of a Metaverse.
Do they serve a greater purpose or only benefit the privileged few?
They say technology is neutral; through this discussion we will put this to the test.
Starting with an overview of the BBC's public service internet work-stream before a discussion with guests and participants about different technologies. We will mark each one between 0-10 for its potential to support marginalized communities and elevate unheard voices in a non-exploitative, sustainable manner.
"The Rhizome of Babel Launch"
Kosisochukwu Nnebe, Lucas;
Fringe Events
The Rhizome of Babel is an interactive digital project created by Kosisochukwu Nnebe, in collaboration with Lucas LaRochelle, that re-examines the biblical tale of the Tower of Babel from an anti-imperialist and ecological perspective. The final product will be an interactive audio-visual environment in Mozilla Hubs (with spatialized audio) that envisions the abundance and diversity of human languages – an ecosystem in its own right – as an alternative vision of prosperity that privileges biodiversity over monoculture, opacity over transparency, relationality over hierarchy.
Visit the Rhizome of Babel installation in Mozilla Hubs by clicking here.
<img alt="RoB" src="https://pretalx.com/media/mozfest-2022/submissions/LB7KSC/RoB__eu2Kh86.jpg" />
"The role of licenses in Trustworthy AI"
Bogdana Rakova, Megan Ma;
Discussion
Many issues related to digital privacy, the lack of human agency, corporate accountability, and transparency of AI-enabled technology are rooted in decisions surrounding data collection, annotation, ownership, processing, and erasure practices. Prominent scholars have proposed that AI developers could use licenses as a way of preventing unintended and irresponsible use of their systems. Furthermore, indigenous communities have proposed bottom up data sovereignty licenses that aim to protect their identity and cultural heritage.
We set out to engage in a discussion on the role of contracts and licence agreements in AI, centered on the question of how we enable adequate consent that ensures a trustworthy relationship between all parties, while enabling contestability of algorithmic outcomes.
It’s long been said that “I agree to the Terms-of-Service” is the biggest lie on the Internet. The legal industry, as well, has struggled to author contracts that allow for more granular levels of understanding and sufficient flexibility to accommodate for the unique circumstances of individuals. To remedy that, scholars have proposed computational contracts, built through a contract definition language that is sufficiently expressive to allow for improved human agency.
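As a purely illustrative toy (not any existing contract definition language or legal standard), the sketch below hints at the kind of machine-readable, contestable licence term such proposals point towards: a data-use permission that can be checked programmatically and withdrawn by the licensor.

```python
# Purely illustrative toy sketch of a machine-readable data-use licence term
# (an assumption for discussion; not an existing contract definition language).
from dataclasses import dataclass, field


@dataclass
class DataUseLicense:
    licensor: str
    allowed_purposes: set = field(default_factory=set)
    consent_revoked: bool = False

    def permits(self, purpose: str) -> bool:
        """A use is permitted only for an allowed purpose and while consent stands."""
        return not self.consent_revoked and purpose in self.allowed_purposes


terms = DataUseLicense(licensor="community-archive", allowed_purposes={"research"})
print(terms.permits("research"))   # True
terms.consent_revoked = True       # the licensor withdraws consent
print(terms.permits("research"))   # False: the earlier outcome is now contestable
```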
In this session, we hope to demonstrate the relationships among these active areas of research and engage participants in a discussion on what new tools and interfaces (social, legal, technological, and other) could enable a more trustworthy relationship between the public and consumer tech companies.
"Transgender Healthcare and Wellbeing in the Digital Age"
Rin Oliver, Jasmine Henry, Kim Crawley;
Discussion
There are a variety of applications that offer telehealth transition related services to transgender and nonbinary individuals. These exist in the US, UK, and in other locations worldwide. Depending on one’s location, the transition-related services they can access via telehealth are limited. In some cases, telehealth is more expensive than traditional in-person services. Are medical providers aware of this? Many subscription-based models charge far more for services than in-person services do. Is this taking advantage of a vulnerable population?
As developers coding telehealth applications, how can we ensure that we are doing so in a secure way? In this presentation, we will explore and brainstorm the ways in which healthcare and insurance providers, or public health initiatives, can better serve transgender, gender-diverse, and nonbinary individuals accessing care.
Attendees will also iterate on the software development side of this issue. In particular: are telehealth applications being developed securely? How can we ensure that DevSecOps best practices are followed with regard to handling people's personally identifiable information?
We will be brainstorming and hacking on ways to develop software in a secure fashion that considers the whole person. Often, transgender, gender-diverse, and nonbinary individuals have co-occurring concerns such as depression, anxiety, or chronic pain, or are disabled or neurodivergent, and need a unique combination of mental, physical, and emotional health services.
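As one concrete example of the kind of practice we might brainstorm, here is a minimal sketch, assuming a Python backend and the widely used cryptography library, of encrypting personally identifiable fields before they reach storage, so that a database leak alone does not expose a patient's identity. The field names and key handling are illustrative assumptions, not a reference implementation.

```python
from cryptography.fernet import Fernet

# Illustrative only: in a real deployment the key would come from a
# secrets manager or KMS, never be generated ad hoc or committed to code.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical PII fields for a telehealth intake record.
record = {
    "legal_name": "Example Patient",
    "chosen_name": "Ex",
    "pronouns": "they/them",
    "clinical_notes": "transition-related care plan",
}

# Encrypt each PII field before it is written to persistent storage.
encrypted_record = {
    field: cipher.encrypt(value.encode("utf-8")) for field, value in record.items()
}

# Decrypt only at the point of authorized use.
chosen_name = cipher.decrypt(encrypted_record["chosen_name"]).decode("utf-8")
print(chosen_name)  # "Ex"
```

Field-level encryption is only one layer; a DevSecOps practice would add access controls, audit logging, and automated checks that PII never appears in logs or test fixtures.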
"Trustful neighborhood relations with intelligent things in cities"
Iskander Smit, Maria Luce Lupetti;
Extended Workshop
More and more intelligent systems and objects are becoming part of our lives and of the city's infrastructure. They can be partners for citizens in dealing with the growing complexity of our connected world, and they can open up new opportunities for creating services. The city as an AI also challenges the agency we have in our own lives.
The neighborhood is an important place for exploring these topics. In two field labs, Cities of Things is working on research-through-design projects with citizens. What neighborhood services could improve logistics and create circular waste flows? How can an intelligent system become a shared resource, and how can it become a platform for local entrepreneurship?
We now work on these projects in neighborhoods in the cities of Rotterdam, Amsterdam, and Munich, and would like to use the broad MozFest community for reflection and inspiration on developing citizen-proof concepts. In the session, we focus on how cities differ in engaging neighborhood communities and in the relations between citizens.
In the session, we will sketch the future challenges of living together with technology that is more an initiating partner than a tool. Maria Luce Lupetti will share relevant research. We invite participants to map one of the case studies onto their own neighborhood and discuss what the drivers would be for designing cities of things that are resourceful and responsible.
The learnings will be input for the shaping of the neighborhood projects that Cities of Things is involved in.
"Trustworthy AI Education Toolkit"
Moh Tahsin, Kassandra Lenters, Andy Forest, Brenda Shivanandan;
Discussion
Teachers of all subjects, students of all ages, come to this session on a collection of resources for introducing Trustworthy AI literacy into any classroom!
If we want the current generation to grow up having the societal discussions needed to advocate for a Trustworthy AI future, our educators will have to distill and share their best practices.
Educators have found that establishing "computational thinking" literacy as early as possible makes it much easier to understand coding itself later in life. Coding education tools such as Scratch, MakeCode, and others have enabled K-12 educators without computer science backgrounds to introduce coding into their classrooms.
AI education tools have started to appear, along with AI education frameworks that describe the thinking skills that need to be nurtured. The Mozilla Trustworthy AI Education Toolkit working group has assembled a guide for non-computer science educators to introduce AI education into their classrooms. This toolkit consists of 4 main parts: Components of AI, Principles of Trustworthy AI, Educational Activity Examples and Assessment. The parts of this toolkit were collected from materials and best practices created by AI researchers and AI education organizations from around the world.
We not only want to share this toolkit, we want your input! If you’re new to AI, what catches your interest, and what’s preventing you from getting started in your classroom or life? If you are learning about AI, what hooked you? What do you see shaping youth’s future around AI?
"Trustworthy AI Working Group Plug and Play"
Temi Popo, Chad Sansing;
Workshop
You can meet the leads of most of the AI Builder & Civil Society Actor Working Group Projects at this part-demo, part-networking event for the MozFest Building Trustworthy AI Working Group.
AI Builder Projects:
We think of AI Builders as people who create Trustworthy AI and machine learning (ML) products, like data scientists, developers, engineers, and product managers.
- Visualizing Internet Subcultures on Social Media
- Developing a Trustworthy AI Scorecard for Mozilla Data-Powered Products
- Low-Resource Languages, and their Open-Source AI/ML Solutions through a Radical Empathy Lens
- Fair Voice: What Happens When Your Voice Signal Becomes Your ID?
- Youth for a Safer Internet: How Youth can Shape Artificial Intelligence for Online Safety (MoSafelink)
- Truth of Waste as a Public Good
Civil Society Actor Projects:
Launched in 2021, the Civil Society Actors stream is made up of people outside government and industry who work in their local and global communities through art, journalism, policy-making, research, scholarship, activism, and technical literacy efforts, and who care about the impact of artificial intelligence.
- AI Governance in Africa
- A Feminist Dictionary in AI
- Accountability CaseLabs
- Trustworthy AI Education Toolkit
- Black Communities Data Cooperatives
- Audit of Delivery Platforms
- Harnessing the Civic Voice in AI
Want to interact with these projects and our incredible community? Join us!
"Truth or Dare: Understanding marketing claims about security and privacy"
Liz Steininger, Rand Hindi, PhD, Matthew Hodgson;
Discussion
In seeking more secure and privacy-enhancing features in the services we use and the products we create, we often rely on the claims made in marketing materials. However, we know that simplified explanations of security and privacy features may not convey the nuance we require. This panel will explore how users can interpret the variations of these claims and better understand what risks they are accepting when using tools with such security and privacy features.
Directly after this session, join us in Spatial Chat. We will have team members available to answer questions and continue the conversation from the panel. In addition to the Q&A, we will highlight features and give a description of our soon-to-be-released data storage service, PrivateStorage.
"Turing Coding Challenge"
Swathi Dharshina, Jose Alves Durand Neto, Vanky Kataria, Sam Adekunle, Surabhi Chandra, Khalid Syed;
Workshop
Turing, in partnership with MozFest, is hosting an interactive session on March 9th. Audiences will get a glimpse of the Turing Coding Challenge as they join the live coding session with our expert Jose Durand. Durand will break down the myths and processes of attempting a coding challenge, using the Turing Coding Challenge as a template.
There are prizes to be won at the end of the session. Book your slots!
"Under Surveillance: A Palestinian Story"
Mona Shtaya l منى شتيه, Nadim Nashif, Sophia Goodfriend, Mohammad Al-Maskati;
Discussion
This session will provide a space for the panelists to discuss how the Israeli Surveillance State impacts Palestinians' day-to-day lives. Each panelist was selected to provide a unique perspective. Nadim will give context to the history of surveillance against Palestinians in all of its many forms and explain how surveillance has shaped recent events in Palestine, specifically the COVID-19 pandemic and the May 2021 uprisings. Sophia will speak to the realities of video surveillance in East Jerusalem, citing her recently published report, for which she spoke with many Palestinian residents of East Jerusalem; this will highlight the level of surveillance many Palestinians are under on a daily basis. Mohammad will provide a regional perspective across MENA, drawing on his experience as a former political prisoner in Bahrain.
"Urgent Adaptability"
Yasmine Rifaii, Caroline Creton, Dayna Ash;
Discussion
Haven for Artists is a feminist cultural organization (NGO) based in Beirut, Lebanon, working at the intersection of art and activism since 2011.
A revolution, a failed state, a pandemic and a blast. The past 2 years were times of communal loss, times that made us rethink our identity, our belief systems, our strategies, and the future as a whole, times that forced us to adapt urgently to the constantly evolving situation.
Due to COVID-19, we have been forced to develop digital alternatives. We had to rethink our means of action and of connection to each other in the digital sphere. The financial and political instability created the need to reaffirm our roots, mission, and goals. We produced the film "Courage", which tackles identity, freedom, and queer being from the point of view of Arab queers. We organized protests and marches against GBV and sexual harassment in Beirut. We launched advocacy and awareness campaigns. Following the Beirut Port Blast, Haven transformed itself to respond to the emergency: we turned our offices into a shelter and raised funds that were distributed in cash, through multiple channels, to queer and women's communities.
Whether it’s digital or physical, we always adapt to keep on offering a safe collaborative space, for artists, activists, and creators to work and produce the art that will always stand as the best witness of time and an active tool for change.
"Venture Capital, Digital Rights, and the Future of Responsible Tech Investing: A Dialogues & Debates Panel"
Shu Dar Yao, Michael Kleinman, Jon Zieger;
Community Plenary
How can technology founders, civil society and policy makers engage traditional VC investors to actively foster a responsible and intersectional tech ecosystem that disrupts for good, and seeks to build rather than break? Is it fair to say that traditional VC is fundamentally at odds with a vision of tech that is healthier for all?
Decisions made by major US-based VC funds have ripple effects felt around the world, not only because they invest across global markets, but also because many entrepreneurs worldwide seek to replicate the models US VC investors drive to scale. Ownership and promotion of responsible tech among the most influential VCs have the potential to change the entire future tech landscape for the better. This solutions-oriented session seeks to foster collaborative discussion among advocates working towards the public good and tech executives who sometimes straddle the divide between doing what is right for their broader community of stakeholders and doing what will maximize returns for their VC investors.
Read these articles to learn more: [Venture Capital Undermines Human Rights](https://techcrunch.com/2021/08/04/venture-capital-undermines-human-rights/), [Managing the Unintended Consequences of Your Innovations](https://hbr.org/2021/01/managing-the-unintended-consequences-of-your-innovations), the Amnesty report [Risky Business: How Leading VC Firms Ignore Human Rights When Investing in Tech](https://www.amnestyusa.org/reports/risky-business-top-10-leading-venture-capital-firms-failing-in-their-responsibility-to-respect-human-rights/), and the book [Intended Consequences](https://intendedconsequences.com/).
This session is being livestreamed and can be viewed from the main page of the MozFest Plaza.
"What does it mean to host a Digital Rights Event Online"
Sarah Allen, Nikki Gladstone, Mohamad Najem, MARS Marshall;
Fringe Events
Continuing the conversation from Bread&Net Festival in November 2021, Nikki Gladstone from RightsCon, Moussa Saleh of Bread&Net, and Mars Marshall from Allied Media Conference join Sarah Allen at MozFest to discuss how community spaces have transitioned online and explore what’s in store for the future of events.
Join live on Twitter Spaces.
"What is the future of open source community building ?"
Hillary Juma, Abigail Mesrenyame Dogbe, Michael Hoye, Matt Germonprez;
Discussion
This session will provide space for reflection on open source community building:
- How might we balance reward and enablement?
- How do we know when a dynamic becomes exploitative?
- How might we balance financial sustainability and imbalance of power in funded open source projects?
- How might we ensure that these projects undo the harms of capitalism, rather than feed them?
- How could praxis such as design justice and ubuntu AI play a role in informing community building?
"With what words should I speak? Impact of Voice technology on Language Diversity"
Hillary Juma, Josh Meyer, Subhashish Panigrahi, Katri Hiovain-Asikainen, Dewi Jones, Omolabake Adenle, Francis Tyers;
Discussion
Voice technology touches many domains of our lives, from virtual assistants and access to financial services to healthcare and automatic subtitle generation. Unfortunately, most languages in the world are not equally represented in voice technology. Language isn't just about words - it's about livelihoods, trans-generational knowledge, and culture. According to the Digital Financial Services Lab and Caribou Digital in 2018, some commercial voice processing tools don't favour "languages of the poor".
We want to bring together language activists from low-resource language communities, computational linguists, and you to reflect on your experiences of voice technology. The session will include guest speakers who are influencing and shaping voice technology and who will reflect on thought-provoking questions such as:
What are the successes and challenges in bringing the world's languages into the internet age?
What narratives do our communities have on voice technologies such as voice assistants?
What support and resources do low-resource language communities use to digitize their languages in voice technology?
In digitizing languages through voice technology, how are we ensuring that language communities benefit the most from it?
Considering the ecological influence on language diversity, how can we sustainably develop voice technology without further putting people and communities at risk?
By the end of the session, we would like to create, together with participants, a manifesto for a future of healthy voice technology that digitizes languages equally and equitably.