Measures of Algorithmic Fairness Move Beyond Predictive Parity to Focus on Disparate Error Rates

There’s good news from the scholarly community working on the assessment of fairness in algorithms. Computer scientists and statisticians are developing a host of new measures of fairness aimed at providing companies, policymakers, and advocates with new tools to assess fairness in different contexts. The essential insight of the movement is that the field needs many different measures of fairness to capture the variety of normative concepts used in different business and legal contexts. Alexandra Chouldechova, Assistant Professor of Statistics and Public Policy at Heinz College, says, “There is no single notion of fairness that will work for every decision context or for every goal.” To find the right measure for the job at hand, she advises, “Start with the context in which you’re going to apply [your decision], and work backwards from there.” This issue came to a head in the controversy surrounding the COMPAS score. This s ...
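To make the distinction concrete, here is a minimal sketch (with invented toy data, not COMPAS data) of how two of these measures can pull apart: a classifier can satisfy predictive parity, meaning equal positive predictive value (PPV) across groups, while still producing disparate false positive and false negative rates when the groups have different base rates. The `rates` helper and the group labels below are illustrative assumptions, not part of any particular fairness library.

```python
# Illustrative sketch: predictive parity vs. disparate error rates.
# All data here is invented for demonstration purposes.

def rates(y_true, y_pred):
    """Return (PPV, FPR, FNR) for binary outcomes and predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    ppv = tp / (tp + fp) if tp + fp else float("nan")  # predictive parity compares this
    fpr = fp / (fp + tn) if fp + tn else float("nan")  # false positive rate
    fnr = fn / (fn + tp) if fn + tp else float("nan")  # false negative rate
    return ppv, fpr, fnr

# Two hypothetical groups with different base rates of the true outcome
# (3/8 for group A, 5/8 for group B).
group_a = ([1, 1, 1, 0, 0, 0, 0, 0], [1, 1, 0, 1, 0, 0, 0, 0])
group_b = ([1, 1, 1, 1, 1, 0, 0, 0], [1, 1, 0, 0, 0, 1, 0, 0])

for name, (y, yhat) in [("A", group_a), ("B", group_b)]:
    ppv, fpr, fnr = rates(y, yhat)
    print(f"group {name}: PPV={ppv:.2f}  FPR={fpr:.2f}  FNR={fnr:.2f}")
```

Running this prints identical PPV (0.67) for both groups, yet group B faces a noticeably higher false positive rate and false negative rate than group A. Which of these disparities matters most depends on the decision context, which is exactly Chouldechova’s point about working backwards from the application.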