Algorithmic Injustice: Bias in Machine Learning and Criminal Justice


18 people went


Details

Biases inherent in training data can cause machine learning algorithms to discriminate against certain populations. Judges and other decision-makers in the criminal justice system use such algorithms every day to set bail requirements, determine criminal sentences, and make parole recommendations.

James Foulds, Assistant Professor of Information Systems at UMBC, has developed an approach that demonstrates the bias in current systems and presents a path toward more equitable tools through his concept of “differential fairness.” Sonia Kumar, Senior Staff Attorney at the ACLU of Maryland, represents clients throughout the region who must confront unjust outcomes driven by the perceived “objectivity” of predictive software.

Please come out and join James and Sonia for a discussion of the legal and technical challenges of machine learning algorithms, and of potential solutions to a fundamental issue for the future of the criminal justice system.
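To give a flavor of the idea, here is a minimal, simplified sketch of how a differential-fairness-style measure can be computed. It assumes a binary decision (e.g., a risk tool flagging defendants as “high risk”) and considers only each group’s positive-outcome rate; the group names and rates below are hypothetical, and this is an illustration of the general concept rather than Professor Foulds’ actual implementation.

```python
import math

def differential_fairness_epsilon(positive_rates):
    """Smallest epsilon such that, for every pair of groups (i, j),
    exp(-epsilon) <= p_i / p_j <= exp(epsilon),
    where p_g is the rate of positive outcomes for group g.
    An epsilon of 0 means all groups are treated identically;
    larger values indicate greater disparity between groups."""
    eps = 0.0
    rates = list(positive_rates.values())
    for p_i in rates:
        for p_j in rates:
            if p_i > 0 and p_j > 0:
                # |log ratio| bounds the disparity between this pair.
                eps = max(eps, abs(math.log(p_i) - math.log(p_j)))
    return eps

# Hypothetical per-group rates at which a tool flags defendants "high risk".
rates = {"group_a": 0.30, "group_b": 0.45, "group_c": 0.28}
print(differential_fairness_epsilon(rates))
```

Here the measure is driven by the most disparate pair of groups (0.45 vs. 0.28), which is what makes this family of definitions sensitive to harms against any single population rather than only an average.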