Crunching numbers to help predict human behavior is common practice in insurance, banking, and public policy. We are always looking for the perfect algorithm to help improve decision-making. But when those decisions involve the fate of families, and the potential removal of kids from their parents, data-driven predictions become the subject of intense debate. That was on vivid display last Tuesday at a panel in New York City on the use of so-called predictive analytics for investigating claims of child abuse and neglect.
Andrew White, the Deputy Commissioner for the city's Administration for Children's Services, distinguished his agency's cautious embrace of predictive analytics from the approaches being deployed by other large cities nationwide, and indicated that his agency was interested in expanding its use of the tool.
“We are committed to using predictive analytics as carefully, ethically, and thoughtfully as we can,” said White, who began the session sympathizing with concerns from advocates for parents and children over how children are removed from black families at a vastly higher rate than from white families. “We need to make sure that whatever we are doing [with predictive analytics] does not worsen that disproportionality, and ideally improve it.”
Prominent opponents of predictive analytics shared the stage with White at the New School’s Center for New York City Affairs, arguing that such tools could produce false positives, reduce transparency and accountability, or worsen racial and economic biases in child welfare.
“If a child is taken out of family based on one of these scores, is there recourse? Is there a way you can understand how that tool worked? If it makes decisions that have poor impacts on your families, is there a remedy for that?” asked Virginia Eubanks, a political science professor at the University at Albany who just released a pessimistic book on the use of predictive analytics, called Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.
Joyce McMillan, executive director of the Child Welfare Organizing Project (CWOP), went further, hammering ACS specifically and comparing its predictive analytics work to New York City’s controversial stop-and-frisk policing policy, which was found to be an unconstitutional form of racial profiling in a 2013 court ruling.
“I don’t believe ethics can be created with predictive analytics … What I know of predictive analytics makes me afraid,” she said, criticizing White’s earlier assurances about the agency’s elaborate ethics guidelines.
In general, child welfare agencies use predictive analytics either to evaluate their programs or to make decisions about how to handle individual families’ cases. The predictive models house a large, often-undisclosed array of data about past cases, against which specific information about families with open investigations gets compared. The model then spits out a number indicating the potential for future harm to a child.
According to a list of variable categories used in one of ACS’ models, the agency’s data tool could consider upwards of 280 variables, including things like prior allegations of child abuse or neglect in the household, as well as the age of the mother, number of siblings, and location. ACS has not indicated how it weights these variables.
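In broad terms, tools of this kind work like a scoring function: each case variable is assigned a weight learned from historical outcomes, and the weighted sum is converted into a risk score. The sketch below is only a minimal illustration of that general idea; the variable names, weights, and logistic form are assumptions made for demonstration, not ACS’ undisclosed model.

```python
import math

# Hypothetical feature weights for a toy risk model. These names and values
# are illustrative only; ACS has not disclosed its actual variables or weights.
WEIGHTS = {
    "prior_allegations": 0.9,   # count of prior abuse/neglect allegations
    "mother_under_21": 0.4,     # 1 if the mother is under 21, else 0
    "num_siblings": 0.2,        # number of siblings in the household
}
BIAS = -3.0  # baseline log-odds before any case information is added


def risk_score(case: dict) -> float:
    """Return a 0-1 score meant to indicate relative risk of future harm."""
    log_odds = BIAS + sum(WEIGHTS[k] * case.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-log_odds))  # logistic squashing to a 0-1 score


# Example: a case with two prior allegations and one sibling in the household
print(round(risk_score({"prior_allegations": 2, "num_siblings": 1}), 3))
```

The critics’ objections map directly onto this structure: if the weights or the historical outcomes they are trained on reflect biased past decisions, the score reproduces that bias, and without disclosure of the variables and weights there is little a family can contest.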
The event last week at the New School was co-hosted by CWOP, a Harlem-based nonprofit parent advocacy group that is one of ACS’ most vocal critics, and moderated by Martha Raimon of the Center for the Study of Social Policy.
Richard Wexler, another panelist and the executive director of the National Coalition for Child Protection Reform, went furthest in his criticism of ACS, calling predictive analytics the “nuclear weapon of child welfare.”
“When we talk about nuclear weapons, we don’t talk about ‘well, let’s find a way to use them ethically, to properly use them,’” Wexler said. “We talk about disarmament. So the response from proponents boils down to endless promises that they can control their nukes.”
ACS’ White pointed out that his agency had so far only used analytics to evaluate caseworker and foster care agency performance. But, he suggested, the tool could be applied to evaluate families accused of child abuse, which White said account for about six or seven percent of the 60,000 annual investigations ACS conducts. Said White:
“Our analytic team created a predictive tool that does a pretty excellent job predicting the likelihood that a child will experience severe physical harm or sexual abuse at some point in the two years following the start of an investigation. … we find that the model steers us to a very high percentage of these children.”
He stressed that frontline caseworkers will not have access to the tool, and said it would mainly be used to help point families towards the right support services—not to make decisions about removing children.
That did not persuade his co-panelists.
“Proponents say the model is successful if it flags cases that are more likely to result in substantiation or more likely to result in placement in foster care. But if a family is poor, and the poverty is confused with neglect, and a year later the family is still poor, of course there’s a good chance the poverty will be confused with neglect again,” said Wexler, whose organization releases frequent polemics in opposition to policies or programs that might increase the number of children placed in foster care.
“This isn’t prediction, this is self-fulfilling prophecy,” he added.
Eubanks did compare New York’s analytics program favorably to the child welfare agency in Allegheny County, Pennsylvania, where she embedded for over nine months in 2015 and 2016 for her book. For example, unlike New York, Allegheny County put the tool in the hands of the screeners who field calls to the child maltreatment hotline, to help them decide whether to escalate investigations. Still, her book and her remarks amounted to a sweeping indictment of the use of analytics in general, highlighting concerns she’s heard from activists.
An independent audit of ACS released by the corporate investigative firm Kroll Associates in December of last year—ordered by Governor Cuomo’s Administration in response to a high-profile child death in New York City—praised ACS for its use of predictive analytics. The influential child welfare advocacy and philanthropy group, Casey Family Programs, has also praised ACS for its data practices.
But most of the panelists agreed that their biggest concern might be these tools falling into the wrong hands.
“Even if you think [current ACS commissioner David Hansell] would stand up to that pressure,” in the case of a high-profile child death, “what about the next commissioner? And even if you think [Mayor Bill de Blasio] wouldn’t cave—and I think that’s a bit of a stretch—would you want [New York City Comptroller] Scott Stringer to have the nuclear codes? What about a mayor like Rudolph Giuliani?” said Wexler.
Stringer has aggressively criticized ACS and de Blasio for their performance related to child deaths, and is considered a top contender in the city’s next mayoral race.
Days before the CWOP event, a bill was introduced in the United States Senate to set aside dedicated funding for state child welfare departments to experiment with predictive analytics, underscoring the national stakes of the debate in New York, which is home to one of the largest foster care systems in the country.