Virginia Eubanks discussed Indiana's failed automation experiment in her book Automating Inequality, which examines how technology affects citizens' human rights and economic equity. Eubanks explains that algorithms let machines make decisions for us on difficult social issues rather than our making those choices ourselves, which creates an "emotional distance".
"We can not use algorithms to avoid making difficult decisions or shirking our responsibility to care about others. In these cases, the algorithm is not the answer. Mathematics alone can not solve deep-rooted social problems, and attempts to rely on it will only exacerbate inequalities already existing in the system. "
Before deciding whether to automate, we need to look carefully at the people the system may affect, determine what that impact may be, and identify the inequities that already exist in the current system.
Can the available data really bring good results?
An algorithm depends on its input data, and it needs the right data to work properly. Before implementing a decision system that relies on an algorithm, an organization needs to study the problem it is trying to solve and think honestly about whether it actually has the data needed to solve it.
Eubanks's book discusses another case. The Office of Children, Youth and Families (CYF) in Allegheny County, Pennsylvania, implemented an algorithm that assigns each report of potential child abuse received by the agency a "threat score" for the child involved, helping caseworkers decide which reports should be investigated. The algorithm's goal is a common one: to help social-service agencies make the most effective use of limited resources in serving the community.
To achieve this goal, the county tried to predict which children are likely to become victims of abuse, that is, the "target variable". However, the county did not have enough data on actual child abuse or near fatalities to build a statistically significant model, so it used two variables for which it did have sufficient data as substitutes for child abuse: community re-referral of a child to the CYF hotline, and placement of a child in foster care within two years. This means the county's algorithm can only predict the likelihood of re-referral and foster placement, yet those predictions are used to score a child's maltreatment threat.
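The substitution described above can be sketched in a few lines of code. This is a hypothetical illustration, not the county's actual model: the names, weights, and scale are invented. The point is that a score computed from the two proxy variables can only ever measure re-referral and foster placement, never abuse itself.

```python
# Hypothetical sketch of a proxy-based risk score (NOT Allegheny
# County's actual model; weights and 0-20 scale are invented).
from dataclasses import dataclass

@dataclass
class CaseHistory:
    hotline_rereferrals: int   # community re-referrals to the CYF hotline
    foster_placements: int     # foster-care placements within two years

def proxy_risk_score(case: CaseHistory) -> float:
    """Toy score: a weighted sum of the two proxies, capped at 20.
    Nothing here observes actual maltreatment."""
    raw = 2.0 * case.hotline_rereferrals + 3.0 * case.foster_placements
    return min(raw, 20.0)

# Two families with identical (unobserved) true risk can receive very
# different scores if one is simply reported more often.
often_reported = CaseHistory(hotline_rereferrals=4, foster_placements=1)
rarely_reported = CaseHistory(hotline_rereferrals=1, foster_placements=0)
print(proxy_risk_score(often_reported))   # 11.0
print(proxy_risk_score(rarely_reported))  # 2.0
```

Because the inputs are records of past contact with the system rather than of harm, the score rises with surveillance exposure, not with danger to the child.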
The problem is obvious: these surrogate variables are not a valid substitute for child abuse data.
First of all, they are subjective. The re-referral variable contains a hidden bias: "anonymous and mandated reporters report black and biracial families for abuse and neglect more than three and a half times as often as they report white families."
Sometimes malicious neighbors, landlords, or family members even file deliberately false reports to punish or retaliate against a family. As Eubanks writes in Automating Inequality, a prediction model "needs clear and unambiguous measures, and a great deal of relevant data, to get it right." Neither condition had been fully met in Allegheny County. Nevertheless, CYF promoted and deployed the algorithm.
What results did this algorithm of limited accuracy produce?
There were 15,139 reports of child abuse in 2016. Of these, the algorithm made incorrect predictions in 3,633 cases. The result was unwarranted intrusion into, and surveillance of, the lives of thousands of poor and minority families.
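To put the two figures quoted above in proportion, the arithmetic works out to roughly a quarter of all reports:

```python
# Arithmetic on the figures cited in the text: 3,633 incorrect
# predictions out of 15,139 reports in 2016.
total_reports = 15_139
wrong_predictions = 3_633
error_fraction = wrong_predictions / total_reports
print(f"{error_fraction:.1%}")  # → 24.0%
```

In other words, about one in four reports was mis-scored by the tool.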