Laura Summers
Laura is a very technical designer™️, working at Pydantic as Lead Design Engineer. Her side projects include Sweet Summer Child Score (summerchild.dev) and Ethics Litmus Tests (ethical-litmus.site). Laura is passionate about feminism, digital rights and designing for privacy. She speaks, writes and runs workshops at the intersection of design and technology.
Session
Fairness is fundamentally not tractable to classic optimisation techniques. It's not a state of the world; it's an experience of it. No technology is fair in a vacuum: fairness can only be understood when a technical system collides with humans.
We're seeing a wave of off-the-shelf libraries that measure bad behaviours in LLM outputs, often simplified versions of older fairness metrics. They can catch obvious failure modes, like slurs, but that is only one failure mode among many. Installing a library and calling the job done is fairness washing. The harder, more fruitful approach is to explore the space of failure modes, consider what an ideal world would look like, and design measures, mitigations, and feedback loops accordingly.
This is a talk for people who suspect we can't optimise our way to human dignity.