Oregon child welfare officials are replacing an algorithm that helped determine which families child protective services should visit, opting instead for a new screening process.
The decision comes after the Associated Press reviewed a separate algorithmic tool in Pennsylvania, the inspiration for Oregon’s algorithm, and found that Pennsylvania’s model flagged a disproportionate number of African American families for “mandatory” child abuse or neglect investigations when it was first implemented.
Oregon Department of Human Services staff were told in an email last month that, after “extensive analysis,” hotline workers would stop using the algorithm by the end of June to reduce disparities in which families are investigated by child protective services for abuse and neglect.
Department spokesman Jake Sunderland told the AP that the current algorithm cannot be used with the state’s new screening process and would therefore “no longer be necessary.” He did not offer further details about why Oregon is replacing the algorithm or whether any related disparities influenced the decision. According to the AP, the algorithm will be replaced by a different model that other child welfare agencies use nationwide.
The state’s Safety at Screening Tool, implemented in 2018, was designed to predict the risk that a child would face a future CPS investigation or placement in foster care. Hotline workers see a numeric risk score, and officials then decide whether a different social worker should investigate the family’s situation. Oregon officials attempted to address racial bias in the tool’s design with a “fairness correction,” adjusting the original model to calculate risk exclusively from the child welfare system’s internal data.
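For readers curious what a threshold-based risk screen with a group-level “fairness correction” can look like in outline, here is a minimal, hypothetical sketch in Python. The feature names, weights, offsets and threshold are all invented for illustration; nothing here reflects Oregon’s actual model, data or correction method.

```python
# Hypothetical sketch of a threshold-based risk screen with a group-level
# "fairness correction." All features, weights, and offsets are invented
# for illustration; this is not Oregon's actual model.

from dataclasses import dataclass


@dataclass
class Referral:
    prior_referrals: int   # hypothetical feature drawn from internal CPS records
    prior_placements: int  # hypothetical feature drawn from internal CPS records
    group: str             # demographic group, used only by the correction


def raw_risk_score(r: Referral) -> float:
    """Toy weighted sum standing in for a statistical risk model."""
    return 0.3 * r.prior_referrals + 0.5 * r.prior_placements


# Hypothetical per-group offsets meant to equalize historical flag rates;
# a real correction would be estimated from validation data.
GROUP_OFFSET = {"a": 0.0, "b": -0.4}

THRESHOLD = 1.0  # scores at or above this trigger a closer look


def should_escalate(r: Referral) -> bool:
    """Apply the group offset, then compare against the screening threshold."""
    adjusted = raw_risk_score(r) + GROUP_OFFSET.get(r.group, 0.0)
    return adjusted >= THRESHOLD


if __name__ == "__main__":
    sample = Referral(prior_referrals=3, prior_placements=1, group="b")
    # 0.3*3 + 0.5*1 = 1.4; offset -0.4 gives 1.0, which meets the threshold.
    print(should_escalate(sample))  # True
```

The limitation critics point to is visible even in this toy version: because the score is computed entirely from the system’s own historical records, any bias in who was referred or investigated in the past feeds directly into future scores, and a post-hoc offset can only partially compensate.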
Child welfare agencies nationwide are using or considering similar algorithms, even as critics raise concerns about reliability, transparency and racial disparities.
“A future predicted by today’s algorithms is predetermined to correspond to past inequalities,” said Dorothy Roberts, author and University of Pennsylvania professor, during a lecture last fall.
U.S. Sen. Ron Wyden, an Oregon Democrat, said he contacted the state’s Department of Human Services after the AP story was published to raise questions about racial bias in the algorithm, a primary concern in the growing use of artificial intelligence in child welfare services. Wyden is the chief sponsor of the Algorithmic Accountability Act of 2022, a bill that would establish transparency and national oversight of automated systems, including algorithms and software.
“Making decisions about what should happen to children and families is far too important a task to give untested algorithms,” Wyden said in a statement. “I’m glad the Oregon Department of Human Services is taking the concerns I raised about racial bias seriously and is pausing the use of its screening tool.”