The AP has a cautionary tale about the biases that can be baked into artificial intelligence tools.
The story comes out of Pennsylvania, where parents with developmental disabilities had their baby taken from them and placed in foster care. “They wonder if an artificial intelligence tool that the Allegheny County Department of Human Services uses to predict which children could be at risk of harm singled them out because of their disabilities,” the AP writes.
The U.S. Justice Department is now asking the same question. The AP did a deep dive into the case, laying out the issues with the algorithm and explaining the family’s story. Read it here.
Links to More Stories on Disability Bias in AI
Disability Bias Should Be Addressed in AI Rules, Advocates Say (Bloomberg Law)
Common AI language models show bias against people with disabilities: study (The Hill)
Photo: Pixabay