In response to the death of an 11-year-old boy, Los Angeles County’s Board of Supervisors will discuss a motion Tuesday that calls for a re-evaluation of how the county’s child protection system measures risk.
As The Los Angeles Times reported, Yonatan Aguilar was found dead in his Echo Park home last month. From 2009 to 2012, the Department of Children and Family Services (DCFS) conducted a handful of investigations into suspected abuse. To determine risk, workers use a tool called Structured Decision Making, a questionnaire that, much like the formula an insurer uses to set rates, spits out a risk rating. Four times during that span, the department deemed the boy at “high risk,” according to the Times’ reporting.
“The Department of Children and Family Services last interacted with the family in 2012,” the motion drafted by Supervisors Mark Ridley-Thomas and Michael Antonovich reads. “In following existing practice and policy, workers involved in the case may have utilized the ‘Structured Decision Making’ (SDM) tool to determine the level of risk for the child. Evidence suggests that there may be potential shortcomings inherent in the SDM tool which may provide unclear guidance and the ability to override results.”
Ridley-Thomas and Antonovich want the Office of Child Protection to work with DCFS and “report back to the Board in 60 days on the current use and identified weaknesses of the SDM tool; and explore various alternatives, including an examination of the success of Project Aura and the use of predictive analytics for child safety and welfare.”
DCFS Public Affairs Director Armand Montiel said that the department welcomed the motion.
Montiel said that SDM and predictive analytics could help discern which families need more services, but could never tell a worker whether or when a child would become the victim of severe or fatal abuse.
“These tools might be helpful, but only in a limited sense,” he said. “The one thing they cannot do is replace a good investigation.”
In 2013, in an effort to help investigators make better decisions, DCFS contracted with SAS, the world’s largest privately held software company, to test predictive analytics.
The experiment, dubbed AURA, or Approach to Understanding Risk Assessment, tracked child deaths, near fatalities and “critical incidents” in 2011 and 2012. Using a mix of data including, but not limited to, prior child abuse referrals, involvement with law enforcement, mental health records, and alcohol and substance abuse history, SAS statisticians created a risk score from one to 1,000, with higher numbers denoting higher risk.
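SAS has not published AURA’s internal workings. As a purely hypothetical sketch of the general approach, the Python snippet below shows one way case-history features like those described above could be combined into a score on a one-to-1,000 scale; the feature names, weights and logistic model are illustrative assumptions, not SAS’s actual method.

```python
# Purely illustrative: a hypothetical way to turn case-history features into
# a risk score on a 1-to-1,000 scale. The features, weights and logistic
# model below are invented for illustration; they are not SAS's method.
import math

# Hypothetical weights for the kinds of data the AURA report describes.
WEIGHTS = {
    "prior_abuse_referrals": 0.9,     # count of earlier referrals
    "law_enforcement_contacts": 0.6,  # count of prior police involvement
    "mental_health_history": 1.2,     # 1 if present, else 0
    "substance_abuse_history": 1.1,   # 1 if present, else 0
}
BIAS = -4.0  # hypothetical intercept

def risk_score(features: dict) -> int:
    """Map features to a probability, then scale it to the 1..1000 range."""
    z = BIAS + sum(WEIGHTS[name] * features.get(name, 0) for name in WEIGHTS)
    probability = 1.0 / (1.0 + math.exp(-z))  # logistic link
    return max(1, round(probability * 1000))

print(risk_score({
    "prior_abuse_referrals": 4,
    "law_enforcement_contacts": 2,
    "mental_health_history": 1,
    "substance_abuse_history": 1,
}))  # prints a score near the top of the 1-1,000 range
```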
The next phase involved applying those risk scores to DCFS referrals from 2013 to gauge whether AURA was any good at identifying which children were most likely to be victims of severe or even deadly abuse.
The Project AURA Final Report, a PowerPoint presentation created by SAS in 2014, stated that if the department had used the tool in 2013 it would have “enabled a significant reduction in the number of tragic outcomes.”
The report went on to explain that if AURA had been applied to cases with a score of 900 or more, it would have flagged “about 11 referrals per day” for “special treatment.” Among the roughly 4,000 flagged reports in 2013 were 171 in which a child was either killed or nearly killed within six months.
In a 2015 story in this publication, Montiel pointed out that there were some clear costs and benefits to using a tool like AURA.
“This may be true,” Montiel said of SAS’ claim that it identified 171 critical incidents. “But it [AURA] also identified 3,829 ‘false positives’ meaning – if our math is right – 95.6 percent of the time, there was no subsequent critical event. So, we must be very careful not to, in any way, indicate that AURA is a predictive tool.”
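Montiel’s percentage follows from simple arithmetic on the report’s figures. A minimal back-of-the-envelope check, assuming a round total of 4,000 flagged referrals (the report’s “about 11 referrals per day” works out to roughly that over a year):

```python
# Back-of-the-envelope check of the AURA figures, assuming a round total
# of 4,000 flagged referrals; Montiel's 95.6 percent likely reflects a
# slightly different count.
flagged = 4000   # referrals scoring 900+ in 2013
critical = 171   # flagged cases with a death or near-death within six months
false_positives = flagged - critical

print(false_positives)                            # 3829
print(round(100 * false_positives / flagged, 1))  # 95.7 -- close to Montiel's 95.6
print(round(100 * critical / flagged, 1))         # 4.3 -- roughly 1 in 23 flags
```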
The department has yet to implement AURA or any other predictive analytics tool in its abuse investigations.
In Los Angeles and throughout the field, the use of big data in child abuse prediction has been met with serious concerns, including fears that it would lead to racial profiling and other unintended negative consequences.
At a White House event this summer, Gladys Carrión, New York’s child protection chief, said that predictive analytics “scares the hell out of” her.
“I think about how we are impacting and infringing on people’s civil liberties,” Carrión said during a panel on big data and child welfare. She added that she runs a system “that exclusively serves black and brown children and families … I am concerned about widening the net under the guise that we are going to help them.”
Even so, other jurisdictions, most notably Allegheny County in Pennsylvania, have forged ahead. As of this summer, Allegheny was on the cusp of launching its own predictive analytics tool.
Structured Decision Making has been hotly debated as well. In L.A. County, its misuse figured prominently in a 2012 report that has been influential in driving the county’s reform efforts ever since.
When 8-year-old Gabriel Fernandez died at his caregivers’ hands in May of 2013, the quality of DCFS investigations again came under heavy scrutiny. Despite six investigations into alleged abuse at the Palmdale boy’s home, the department did not remove him. As with all child abuse inquiries, SDM was used to determine risk in the Fernandez case.
Given the high-stakes context of child deaths, and the long-running debate around both Structured Decision Making and predictive analytics, the Supervisors’ move strikes at one of the thorniest areas in child protection today. How the Office of Child Protection and DCFS respond will be closely watched.