FLINTA Talk Night: Towards More Responsible AI


Details
Due to the BVG strike, we will start the event later, at 19:00. Please don't worry about being late, and join us if you can!
You’ve probably heard a lot about AI—how it’s changing industries, making decisions, and even shaping our daily lives. But did you know that AI can also be biased? Join us for an insightful talk about bias in AI and tools like Sweet Summer Child Score that can help mitigate the harm.
**IMPORTANT**
Please sign up via Luma on Ape Unit's event page: https://lu.ma/e636qkuc?utm_source=eit
✍️ Event description
This meetup is hosted by Empowered in Tech together with Ape Unit. Our goal for this event is to make the topic of bias in AI more accessible to everyone—regardless of background or expertise. We’ll explore what AI bias is, the real-world impact it has, and what we can do to reduce its harm. You don’t need to be an expert or work in tech—all perspectives are welcome!
Talk: Sweet Summer Child Score
Sweet Summer Child Score is an open-source library for identifying potential AI harms. A truism in tech is that we're good at asking "can we do it?" but not "should we do it?". To help tackle the latter, the library offers a system scan that quickly surfaces potential harms and builds capacity for relative risk assessment.
SSCS does not explore the specifics of your stack or technical implementation. Instead, it takes a step back to look at the ecosystem your technology will be deployed in and the implementation choices that define the seam between your system and the broader world. Put simply, it is an attempt to see the forest, not the trees.
Links:
The project and GitHub repos are online at https://summerchild.dev
[SECOND TALK CANCELLED:
Talk: Exploring fairlearn and practical strategies for assessing and mitigating harm in AI systems
As AI becomes a more significant part of our everyday lives, ensuring these systems are fair is more important than ever. In this session, we’ll discuss how to define fairness and the potential harms our algorithms can have on people and society. We’ll introduce fairlearn, a community-driven, open-source project that offers practical tools for assessing and mitigating harm in AI systems. We’ll also explore how to discuss bias, different types of harm, the idea of group fairness and how they all relate to fairlearn’s toolkit. To make it all concrete, we’ll walk through a real-world example of assessing fairness and share some hands-on strategies you can use to mitigate harm in your own ML projects.
Links:
- website: https://fairlearn.org/
- repository: https://github.com/fairlearn/fairlearn
- LinkedIn: https://www.linkedin.com/company/fairlearn/]
🏢 Location
We are kindly hosted by Ape Unit.
📝 Sign Up
This time the sign-up is handled by our host Ape Unit via Luma: https://lu.ma/e636qkuc?utm_source=eit
⏰ Agenda
18:30 Doors open, we'll mingle and have some snacks
18:45 Intro and welcome from your hosts
19:00 Talk by Laura: Sweet Summer Child Score (+ Q&A)
19:45 Time for Networking
21:15 Doors close
🗣 Speaker
Laura Summers
Laura is a very technical designer™️, working at Pydantic as Lead Design Engineer. Her side projects include Sweet Summer Child Score (summerchild.dev) and Ethics Litmus Tests (ethical-litmus.site). Laura is passionate about feminism, digital rights and designing for privacy. She speaks, writes and runs workshops at the intersection of design and technology.
👋 About Us
Stay in touch with the community! Join our Slack:
https://bit.ly/EmpoweredInTechSlack
We are a local community in Berlin dedicated to empowering FLINTA (women, lesbians, intersex, non-binary, trans and agender) people to excel in their tech journey. Our events offer study groups, technical workshops, hackathons, networking events, panel discussions, lightning talks, and social events.
☂️ About the FLINTA label
This event is labeled as FLINTA (women, lesbians, intersex, non-binary, trans and agender) as we make it a mission to empower women and other underrepresented minorities in the tech community. That being said, we don't exclude anybody!
Cis men are welcome to join us 😊, but please bear in mind that the topics will be discussed from this perspective.
💗 Code of Conduct
We are dedicated to providing a safe and welcoming experience for everyone who participates in our events.
Our events aim to empower diverse women, and we welcome everyone who identifies as a woman, as a member of another underrepresented group in tech, or as an ally.
We follow the Berlin Code of Conduct for our events: https://berlincodeofconduct.org/
📸 Media Consent
We may take photos at this event for social media posts.
If you do not want to be photographed, please let the organizers know at the event. We will make sure to respect your privacy.
