by Rob Taylor Jr., Courier Staff Writer
A new task force has been created in Pittsburgh, and it’s going after possible bias in something that’s seldom seen or heard—computer algorithms.
In Allegheny County, algorithms are used to determine risk assessments for child welfare services and where to deploy police officers, according to a release from the University of Pittsburgh.
That’s all fine and dandy, the university notes, provided the algorithms aren’t spewing out biased results that could negatively affect African Americans, other minorities, or those classified as low-income.
Enter the Pittsburgh Task Force on Public Algorithms, sponsored by Pitt’s Institute for Cyber Law, Policy and Security, commonly known as Pitt Cyber. On Jan. 22, the university announced the creation of the 22-member task force, which is composed of researchers, educators and community advocates. African Americans on the task force include Urban League of Greater Pittsburgh President and CEO Esther L. Bush; A Second Chance Inc. Founder, President and CEO Sharon McDaniel; LaTrenda Sherrill, former Deputy Chief (Education) for the City of Pittsburgh mayor’s office; and John Wallace, Pitt professor and senior pastor of Bible Center Church.
The task force is also served by a Government Advisory Panel, featuring, among others, County Chief Public Defender Matthew Dugan, city police crime analysis manager Heath Johnson, and Pittsburgh community affairs assistant director Shatara Murphy.
Algorithms are created to make swift and “neutral” decisions that are more informed and consistent than those of humans. But if an algorithm’s code reflects the biases of its human creators, or if it is built on data generated by biased practices, it can, in effect, produce biased results, the Pitt release said.
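How biased data produces biased results can be shown with a small, purely illustrative sketch. The neighborhoods, numbers and rates below are invented for explanation and have nothing to do with any real system: two neighborhoods are assumed to have identical underlying behavior, but one was monitored twice as heavily, so it generated twice as many records.

```python
# Synthetic, invented data: neighborhoods A and B are assumed to behave
# identically, but B was monitored twice as heavily, so twice as many
# incidents were RECORDED there.
records = (
    [("A", 1)] * 10 + [("A", 0)] * 90 +   # neighborhood A: 10% recorded rate
    [("B", 1)] * 20 + [("B", 0)] * 80     # neighborhood B: 20% recorded rate
)

def predicted_rate(records, group):
    """A 'neutral' model that simply learns each group's recorded base rate."""
    outcomes = [y for g, y in records if g == group]
    return sum(outcomes) / len(outcomes)

# The model faithfully reproduces the skew in the records -- flagging
# neighborhood B at twice the rate of A, even though the underlying
# behavior was assumed identical.
print(predicted_rate(records, "A"))  # 0.1
print(predicted_rate(records, "B"))  # 0.2
```

The model here does nothing wrong mathematically; it accurately learns the data it was given. The bias enters upstream, in which incidents got recorded in the first place, which is exactly the concern the Pitt release describes.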
Allegheny County has been both praised and denounced for its use of an algorithm in the “Allegheny Family Screening Tool,” which aims to help county child protection staff make better decisions about which children are most at risk. According to the Pittsburgh Post-Gazette, the computer tool uses more than 100 variables, such as previous child welfare involvement and previous criminal justice system involvement, to generate a score from one to 20, with 20 indicating the highest predicted risk that a child will be re-referred or removed from the home.
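The general shape of such a tool can be sketched in a few lines. To be clear, this is not the Allegheny Family Screening Tool: the variable names, weights and scaling below are invented for illustration, and the real tool combines more than 100 variables with a statistical model rather than a hand-written formula.

```python
# Illustrative only: invented weights and variables, NOT the actual
# Allegheny Family Screening Tool, which uses 100+ variables.

def risk_score(features, weights):
    """Combine risk indicators into a weighted sum, mapped to a 1-20 scale."""
    raw = sum(weights.get(name, 0.0) * value
              for name, value in features.items())
    # Clamp the raw weighted sum to [0, 1], then stretch it onto the
    # tool's reported scale of one to 20.
    scaled = 1 + round(19 * min(max(raw, 0.0), 1.0))
    return min(max(scaled, 1), 20)

# Hypothetical case: prior welfare involvement, no criminal justice history.
weights = {"prior_welfare_referrals": 0.4, "prior_cj_involvement": 0.3}
features = {"prior_welfare_referrals": 1, "prior_cj_involvement": 0}
print(risk_score(features, weights))  # 9
```

The design question the task force is raising lives in the inputs: if variables like “previous criminal justice system involvement” carry the imprint of uneven enforcement, the neutral-looking arithmetic passes that imprint straight through to the final score.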
And nationally, more computer-based risk assessment tools are being used in the criminal justice system to predict recidivism and bail violations, as reported in The Seattle Times. “These predictions are used by judges to inform sentencing. But these tools are often based on fundamentally flawed assumptions, such as equating arrest patterns to crime patterns, that bail violations indicate flight risk rather than access to transportation or other socioeconomic challenges. Moreover, these tools have been seen to be deployed without independent external review, without community involvement and sometimes even without rigorous evidence that their predictions are accurate,” a Feb. 2019 Seattle Times article reported.
The task force’s primary goal is to establish best practices and practical guidelines for the use of municipal algorithms.
“Increasingly, algorithms are being used to facilitate efficient government. We need to ensure that historical discrimination and existing inequities are not reinforced,” said Pitt Cyber Founding Director and Task Force Chair David Hickton, in a statement to the Courier. “Pittsburgh should lead the way in effective and fair oversight of these systems. We can be a national model, ensuring algorithmic accountability and equity for all residents.”
Pitt said the task force will seek area residents’ thoughts on the use of municipal algorithms via public meetings to be held March 10 from 5:30 to 7:30 p.m. at the Carnegie Library’s Homewood branch, and March 19 from 5:30 to 8 p.m. in collaboration with the Beltzhoover Consensus Group’s meeting, at 900 Delmont Ave.
The task force will produce a full report, including research findings and recommendations from the community, in the summer of 2021.
“Particularly as governments expand the use of these powerful tools, with so much at stake for our liberty, the fight for algorithmic fairness and accountability is another frontier in the struggle for civil rights,” Bush, a task force member, said in a statement provided to the Courier.
Sherrill, another task force member, said in a statement that it’s “critical for the task force to hear from residents whose lives are being impacted by decision-making algorithms. That feedback will help us to draft recommendations for oversight that directly address issues that have been deemed priorities.”
The Heinz Endowments is supporting the work of the task force. Its Chief Equity Officer, Carmen Anderson, said in a statement that with Pittsburgh developing into a leader of research and computer technology, “it is imperative that we simultaneously develop a set of ethics, policies and procedures informed by people who will be impacted by these technologies. It’s particularly crucial as algorithms are used by complex systems with histories of racism and bias such as the criminal justice system.”