Over the past 18 months, The Imprint has published at least 20 stories on the merger of big data and child protection.
Other outlets, including Forbes and Bloomberg, have also covered the trend. The latest national outlet to do so is CNBC, whose Dina Gusovsky published a thoroughly reported piece today.
Please read and watch the article and video HERE.
The CNBC piece centers on Los Angeles County’s exploration of a predictive analytics tool that would help the county’s Department of Children and Family Services better determine which children are at heightened risk of subsequent abuse.
That tool, dubbed AURA, was developed by SAS, the world’s largest privately held software firm. Drawing on a mix of data including, but not limited to, prior child abuse referrals, involvement with law enforcement, mental health records, and alcohol and substance abuse history, SAS statisticians created a risk score from one to 1,000, where higher numbers denote higher risk.
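AURA’s internals are proprietary, so the following is purely an illustrative sketch of how a tool of this kind might map case features to a 1-to-1,000 score: a logistic model whose output probability is rescaled to that range. Every feature name and weight below is hypothetical, not anything drawn from the actual SAS model.

```python
import math

# Hypothetical feature weights -- purely illustrative, NOT the actual
# AURA model, whose internals are proprietary.
WEIGHTS = {
    "prior_referrals": 0.8,          # count of prior child abuse referrals
    "law_enforcement_contacts": 0.5, # count of law enforcement involvements
    "mental_health_flag": 0.6,       # 1 if mental health records are present
    "substance_abuse_flag": 0.7,     # 1 if alcohol/substance abuse history
}
BIAS = -4.0  # baseline term keeps low-feature cases near the bottom of the scale

def risk_score(features: dict) -> int:
    """Map case features to a score on a 1-to-1,000 scale via a logistic model."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0) for k in WEIGHTS)
    probability = 1.0 / (1.0 + math.exp(-z))  # squash to the interval (0, 1)
    return max(1, round(probability * 1000))  # rescale to 1..1,000

# A case with several prior referrals scores far higher than a first contact.
print(risk_score({"prior_referrals": 5, "substance_abuse_flag": 1}))
print(risk_score({"prior_referrals": 0}))
```

The point of the sketch is only that such a score is a deterministic function of recorded history, which is exactly why critics worry: records of system contact are not evenly distributed across communities.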
While the idea of using big data to ascribe risk is alluring to child welfare administrations struggling to assess risk through less precise methods, the specter of “prediction” does not sit well with everyone.
The central tension explored in the CNBC piece, and in many of the stories we have published in The Imprint, is the fear that data analytics could spawn – in the worst case – racial profiling.
In July of last year, I moderated a panel put on by Los Angeles County’s Office of Child Protection. Among the guests was Emily Putnam-Hornstein, a researcher at the University of Southern California’s School of Social Work.
Putnam-Hornstein, who is part of a team that is developing a data analytics tool for the child welfare administration in Allegheny County, Pennsylvania, pointed out that such tools could also cut the other way.
While they can flag certain families for intensified scrutiny, they can also identify families that are at lower risk.
So, if a child abuse investigator walks into a house and assumes a child is in a very risky situation, the tool could spit out a lower risk score, which could prompt the investigator to re-evaluate his or her hunch.
While there is a fear that predictive analytics could bring more children into the foster care system, this scenario paints a picture of how it could actually keep them out.
The field of child welfare is still in the very early stages of experimenting with predictive analytics, so we will have to wait to see how well, or how poorly, this application of technology works.
We would love to know your thoughts on the issue. If you want to submit a commentary of 600 words, we would be happy to consider it for publication.
Submissions can be sent to: [email protected]