A Google Images search for “professor” or “CEO” returns overwhelmingly pictures of white men. While white men do hold these jobs more often than other groups, the search results exaggerate that disparity, severely under-representing women and people of color. This has been called out as an ethical problem in various outlets; however, the problem persists.
In this workshop, we use this case as an example of how to structure an ethical problem and its underlying principles before attempting to solve it. Through the game-like structure of the Mapping method, the workshop engages participants and helps them develop essential tools for deciding on ethical solutions that are also technically feasible. Collaborating with one another, participants test the strength of their ideas and work gradually toward solutions to this real-life problem, then analyze how their solutions would hold up in other relevant cases, such as voice assistant responses and other search result categories. The Mapping brings abstract ethical arguments down to the ground—in a very literal sense, since the Mapping takes the form of a physical ground game.
Who can attend:
Anyone with an interest in the ethical problems we encounter in technology and who wants to solve them in ways that are both ethically sound and technically feasible. No prior knowledge or expertise in the area is required. We welcome all backgrounds and especially encourage AI practitioners and those from relevant fields, such as philosophy, computer science, data science, design, law, and psychology, to join us!
For more information: http://aiethicslab.com/the-mapping/
“The Democracy Center is partially wheelchair accessible. The Mandela room, where we are holding the event, is accessible, but there is no accessible bathroom on site (participants are, however, welcome to use the accessible restroom next door at Daedalus). Contact [masked] or [masked] as needed.”