Family Surveillance by Algorithm: The Rapidly Spreading Tools Few Have Heard Of


The child welfare system uses algorithms, risk assessment and predictive analytics to monitor families and determine whether children are at risk of maltreatment.



Top Stories of 2019: Race and Bias in New York

We’re counting down 10 of the biggest stories The Imprint published in 2019. Each day, we’ll connect readers with a few links to our coverage on a big story from this past year.


Mi Hermana’s Keeper Toolkit: Uplifting Latina Voice

Latina youth involved with the juvenile justice system are at high risk of trauma compared with their non-system-involved peers. With the percentage of Latina youth rising nationally, and a growing number of them system-involved, attention must be devoted to understanding their experiences and addressing their needs.


Report: How the Broken Criminal Justice System Fails LGBTQ Youth

A report released by the Center for American Progress and the Movement Advancement Project explores the profiling, bias and maltreatment of lesbian, gay, bisexual, transgender, and queer (LGBTQ) youth in the juvenile-justice system.



ProPublica Exposes Racial Bias in Predictive Analytics

A recent story from the nonprofit in-depth journalism site ProPublica quotes a warning issued in 2014 by then-Attorney General Eric Holder to the U.S. Sentencing Commission. His warning concerned a fad spreading through the criminal justice system.


Research on Reasoning and Bias in Child Welfare Decision-Making

By Ankita Mohanty

As a poor person of color, are you more likely to have your child taken away from you than your wealthier white counterpart? Research out of Israel published earlier this year points to something long understood in American child welfare: race matters.