Los Angeles County’s recently formed Office of Child Protection facilitated, in the words of its interim director, an “uncomfortable” but “courageous discussion” on the uses and implications of data, analytics and risk modeling for child welfare during a community forum on Wednesday in Los Angeles.
Leaders from Los Angeles County administrative departments and community-based agencies learned more about how “big data” could become an instrumental part of assessing the risk and safety of children in the county. They learned how data can be analyzed to determine how likely a given family is to enter the child protective services system, perhaps as early as a child’s birth. These kinds of risk-modeling tools are being developed in Allegheny County in Pennsylvania and in New Zealand, and public officials are looking into such a tool in L.A. County.
Among the panelists were Emily Putnam-Hornstein, an associate professor at the University of Southern California’s School of Social Work who is consulting on Allegheny County’s risk-modeling project; Rhema Vaithianathan, a professor at the Auckland University of Technology and the lead researcher developing New Zealand’s predictive risk-modeling tool; and Jennie Feria, an executive assistant to Director Phillip Browning at the L.A. County Department of Children and Family Services. Feria is leading the analysis of a risk-modeling tool, Approach to Understanding Risk Assessment (AURA), at DCFS.
“We want to have what could be at times an uncomfortable conversation because there are implications about race and about the potential for analytics to exacerbate disproportionality,” said Fesia Davenport, interim director of the Office of Child Protection.
Attendees of the forum, which was held at the University of Southern California, acknowledged that technological advancements are necessary, but at the same time, many people asserted that any new technological tool for risk assessment needs to be planned and implemented with great care to avoid “profiling” and using a parent’s history to predict their future actions.
Putnam-Hornstein’s research with the Children’s Data Network at USC in 2014 identified four factors correlated with reported and substantiated child abuse and neglect: a child born without a father listed on the birth certificate, a mother on public health insurance, a mother who had not completed high school, and a teenage mother.
When these factors were identified during the forum, the question arose: Do all of these risk factors amount to a description of poverty? If a tool were developed with such risk factors in mind, would it direct more attention and scrutiny toward poor families of color and lead to more children in this population being removed from their homes?
African American children are already overrepresented in the foster care system in California and most other states across the nation.
Putnam-Hornstein noted that biases already come into play when social workers make subjective decisions about families. A risk modeling tool should aim to decrease bias, not increase it, she said.
“We don’t want to develop a model that tells us nothing more than how the system is operating today, which may be biased in how the system decides to open cases or who gets placed in foster care,” Putnam-Hornstein said. “Because then we’re developing a model where we are predicting an outcome that was already driven by the system.”
Dr. Matt Harris, executive director of Project Impact and “co-convener” of the Los Angeles County Community Child Welfare Coalition, a 50-member alliance of youth service providers, asked how a family’s strengths would be included in a risk-modeling tool.
“How will strengths in a family play into the predictive factors conversation?” Harris asked. “For example, if there’s a history of substance abuse and the person is now in recovery, will the strengths of recovery weigh equally with the risk of history?”
Feria said that a sophisticated model should be able to consider a family’s current situation and its engagement with services that are protective. However, administrative data’s effectiveness will always be limited, she said.
“Ultimately, it’s going to have to come down to clinical judgment,” Feria said. “It’s going to have to be that engagement of a worker with a family where they can do a deep dive as to what types of social supports are available, what else is going on in the family, what assets are present?”
Feria said the tool could help social workers prioritize cases.
“At a very early stage, it may help us sort through some of the noise, so that we can be more intentional in directing some of our initial efforts and resources,” Feria said.
Harris, in an interview, expressed concern that if a new tool is created to describe the African-American community without the participation of people from that community, the effort would be reminiscent of “colonialism” and would cause consternation.
“You’re going to get tremendous pushback especially when the people who come in aren’t in relationships with the community, haven’t struggled with the community,” Harris said. “That sort of parachuting-in mentality doesn’t work and it puts us on guard.”
Elvia Torres, executive director of SPIRITT Family Services and a member of the county-wide coalition, said that DCFS’ history contributes to fear about a new tool.
“The fear is profiling and the track record that DCFS has in our community,” Torres said. “That’s where the fear comes from. It’s the fear of the unknown.”
DCFS is still in the early stages of the AURA project. Feria said the request for proposals for an analytics tool will be a two-year process.
Davenport said that as DCFS progresses in the process, the Office of Child Protection hopes to schedule another community meeting to further discuss analytics.