Workshop on Privilege and Ethics in Data
04-19, 11:40–13:10 (Europe/Berlin), A05-A06

Data-driven products are becoming increasingly ubiquitous. Humans build data-driven products, and humans are intrinsically biased. This bias flows into the products, confirming and amplifying the original bias. In this tutorial, you will learn how to identify your own (often unperceived) biases and reflect on and discuss the consequences of unchecked biases in data products.


Data-driven products are becoming increasingly ubiquitous across industries. They are built by humans, and humans are intrinsically biased. This bias goes into the data-driven products, which then amplify it. As a consequence, power imbalances in a data-driven world tend to grow rather than shrink, usually unintentionally. This effect is particularly prevalent in the tech sector, where teams are often not diverse.

One obvious solution is to build diverse teams, but when considering all the intersections of diversity, achieving full diversity is practically impossible. We therefore see education and awareness as foundational steps towards a more equitable data world.

This tutorial has two parts. In the first part, we will revisit our own privileges as a tool to identify our individual, often unperceived, biases.
In the second part, we will examine what happens when these biases occur at a group level and go unchecked into our data products, drawing on the book Data Feminism and enriched with our own experiences as data professionals.

Education about privilege and ethics in the data-driven world can only improve how we see and work with data, and help us better understand how our work with data can affect others.


Abstract as a tweet

Data-driven products are built by humans. Humans are intrinsically biased. This bias goes into the products, amplifying the original bias. In this tutorial, you will learn how to identify your biases and reflect on the consequences of unchecked biases in data products.

Public link to supporting material

https://miro.com/miroverse/ethics-in-data-science-workshop/?social=copy-link

Expected audience expertise: Python

None

Expected audience expertise: Domain

None

This speaker also appears in:

Paula's love for working with data brought her from her natural-sciences background to the field of Data Science. Her love of teaching brought her to work as a Data Science coach. Her enjoyment of public speaking brought her to the PyData community, where she enjoys sharing, learning, and most recently organizing. She now works as Head of Data Science at Spiced Academy, although technically she is still on parental leave.