Predictive Policing: Bias In, Bias Out

Data & Society - A podcast by Data & Society

Kristian Lum will elaborate on the concept of “bias in, bias out” in machine learning with a simple, non-technical example. She will then demonstrate how applying machine learning to police records can result in the over-policing of historically over-policed communities. Using a case study from Oakland, CA, she will show how predictive policing not only perpetuates the biases previously encoded in the police data but, under some circumstances, actually amplifies them.
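To give a flavor of the feedback loop the talk describes, here is a minimal toy sketch (not Lum's actual model; all names and numbers are illustrative). Two neighborhoods have the same underlying crime rate, but one starts with more recorded incidents. If patrols are allocated wherever past records are highest, and crime is only recorded where police are present, the initial gap compounds over time:

```python
import random

# Toy simulation of the "bias in, bias out" feedback loop.
# Illustrative sketch only; TRUE_RATE and the starting counts are assumptions.
TRUE_RATE = 0.5                 # identical underlying crime rate in both areas
recorded = {"A": 60, "B": 40}   # A is historically over-policed

random.seed(0)
for day in range(1000):
    # The "predictive" model sends patrols where past records are highest.
    patrolled = max(recorded, key=recorded.get)
    # Crime occurs at the same rate in both areas, but it is only
    # *recorded* in the area where police are present to observe it.
    if random.random() < TRUE_RATE:
        recorded[patrolled] += 1

# A accumulates every new record while B's count never grows:
# the model's predictions confirm, and then widen, the original bias.
print(recorded)
```

Because the model treats "recorded crime" as "crime", each round of patrol allocation generates the very data that justifies the next round, which is the amplification mechanism the Oakland case study examines.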