The elusiveness of ethics: encoding fairness in an unfair world


While the problems of data bias and algorithmic bias have hit the mainstream, the mathematical approaches to tackling them remain largely confined to rarefied academic discourse. In this talk you'll get a bird's-eye view of the state of ML fairness, from checklists and frameworks to developer tools and mathematical definitions of bias and fairness. We'll wrestle with the elephant in the room: How do we encode fairness into our models when we can't precisely define our ethics as a society, a team, or sometimes even as individuals?
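By way of illustration (this sketch is not from the talk itself), one of the simplest mathematical fairness definitions is demographic parity: a model's positive-prediction rate should be roughly equal across groups. A minimal Python sketch, with hypothetical column names:

    # Minimal sketch of demographic parity (illustrative only; column names are hypothetical).
    import pandas as pd

    def demographic_parity_gap(df: pd.DataFrame,
                               prediction_col: str = "prediction",
                               group_col: str = "group") -> float:
        """Largest difference in positive-prediction rates between any two groups."""
        rates = df.groupby(group_col)[prediction_col].mean()
        return float(rates.max() - rates.min())

    # Example: two groups with different positive-prediction rates.
    data = pd.DataFrame({
        "group":      ["a", "a", "a", "b", "b", "b"],
        "prediction": [1,   1,   0,   1,   0,   0],
    })
    print(demographic_parity_gap(data))  # ~0.33: the model favours group "a"

Even this simple metric forces judgement calls: which groups to compare, and how large a gap counts as unfair, which is exactly the kind of question the talk takes up.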
Laura Summers is a multi-disciplinary designer researching technology ethics and building tools to promote fair machine learning. She's passionate about feminism, digital rights, and designing for privacy. Laura is the human behind fairxiv.org and the Melbourne Fair ML reading group (groups.io/g/fair-ml), and she's now working on debias.ai. She speaks, writes, and runs workshops at the intersection of design and technology.