Abstract
Predictive algorithms are now commonly used to distribute society’s resources and sanctions. But these algorithms can entrench and exacerbate inequities. To guard against this possibility, many have suggested that algorithms be subject to formal fairness constraints. Here we argue, however, that popular constraints—while intuitively appealing—often worsen outcomes for individuals in marginalized groups, and can even leave all groups worse off. We outline a more holistic path forward for improving the equity of algorithmically guided decisions.
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 601-610 |
| Number of pages | 10 |
| Journal | Nature Computational Science |
| Volume | 3 |
| Issue number | 7 |
| State | Published - Jul 2023 |
ASJC Scopus subject areas
- Computer Science (miscellaneous)
- Computer Science Applications
- Computer Networks and Communications