
What if AI bias isn’t simply a technical flaw—but a reflection of the systems and structures behind the data we use?

Join us in Calgary for an engaging discussion on how AI can unintentionally reinforce systemic inequities—and how we can build more responsible, community-centered technologies.

Christian “ZacaTechO” Ortiz, an Afro-Indigenous decolonial social scientist, technologist, and creator of Justice AI GPT, will share insights from his work advancing data sovereignty, accountability, and community-driven innovation in AI.

This session will introduce Justice AI GPT and the Decolonial Intelligence Algorithmic (DIA) Framework—a practical model for designing AI collaboratively with communities rather than merely for them.
Rather than treating bias as an issue to fix after deployment, the DIA Framework focuses on:
• Examining power structures within AI systems
• Prioritizing intersectional community impact
• Embedding community-led governance
• Reimagining data sourcing, labeling, evaluation, and deployment practices

The session will also feature a real-world “before and after” case study demonstrating how bias can emerge in AI outputs—and how applying a DIA-based approach can lead to more equitable and meaningful outcomes.
This event is ideal for:
→ Data for Good volunteers
→ Data scientists, analysts, and technologists
→ Community organizers and nonprofit leaders
→ Anyone passionate about ethical and inclusive AI

Come connect, learn, and join the conversation about building AI systems that work more equitably for everyone.
