MozFest 2022

Mitigating bias and discrimination in AI systems through design
Language: English

The Information Commissioner’s Office (ICO), the UK’s data protection regulator, is running a workshop to better understand the role of design in promoting fairness in the context of personal data and AI. Under the UK General Data Protection Regulation (UK GDPR), organisations must ensure AI systems that process personal data are fair and do not lead to unjust discrimination.
When discussing ways of mitigating bias and discrimination in AI, teams often consider technical solutions like addressing representation in training data sets or measuring model accuracy. Less is understood about how to design experiences and interactions between people and AI that prevent discriminatory outcomes. Designing interactions that better explain automated decisions or provide effective flagging and reporting mechanisms in interfaces could play a greater role in helping people understand AI systems and hold organisations that use them to account.
In the workshop, we will discuss examples of how using personal data and AI can lead to unfair outcomes, assess the role of design in identifying and mitigating these harms, and share best-practice approaches for creating AI-driven user experiences that are fair by design.


What is the goal and/or outcome of your session?:

The ICO is looking to develop practical support to help product teams create AI services and experiences that mitigate discriminatory harms resulting from personal data use. We wish to engage the design and tech community to understand the key challenges they face in identifying and mitigating bias and discrimination in AI, the impact this has on product design decisions and practices, and what further support or clarification product teams might need.
We also want to raise awareness in the design community of service and UX design approaches that prevent discriminatory harms in AI products and services.

Why did you choose that space? How does your session align with the space description?:

The session looks at real-life examples of when bias and discrimination in AI systems have occurred and the practical measures teams can take to identify and prevent these harms throughout the design and development process. The workshop will take discussions of fairness beyond abstract principles to concrete examples that focus on the design of AI interfaces and the tools designers give people to understand and object to unfair automated decisions.

How will you deal with varying numbers of participants in your session? What if 30 participants attend? What if there are 3?:

We will take a flexible approach to running the workshop to cope with varying numbers. With a small group, we will run the session as a single facilitated discussion; if we get a large number of attendees, several facilitators will be on hand to run discussions in breakout rooms.

What happens after MozFest? We're hoping that many efforts and discussions will continue after MozFest. Share any ideas you already have for how to continue the work from your session.:

The ICO is developing guidance to help practitioners understand and comply with the fairness principle in the UK General Data Protection Regulation when using personal data in AI systems. The outcomes of the festival will feed into the ICO’s ongoing work to develop practical support for product teams to consider data protection, privacy and fairness by design.

What language would you like to host your session in?:

English