TY - JOUR
T1 - Bias in smart city governance
T2 - How socio-spatial disparities in 311 complaint behavior impact the fairness of data-driven decisions
AU - Kontokosta, Constantine E.
AU - Hong, Boyeong
N1 - Funding Information:
This material is based upon work supported by the National Science Foundation under award No. 1926470 and by a grant from the John D. and Catherine T. MacArthur Foundation. Any opinions, findings, and conclusions expressed in this paper are those of the authors and do not necessarily reflect the views of any supporting institution. All errors remain the authors'.
Publisher Copyright:
© 2020 Elsevier Ltd
PY - 2021/1
Y1 - 2021/1
AB - Governance and decision-making in “smart” cities increasingly rely on resident-reported data and data-driven methods to improve the efficiency of city operations and planning. However, the issue of bias in these data and the fairness of outcomes in smart cities has received relatively limited attention. This is a troubling and significant omission, as social equity should be a critical aspect of smart cities and needs to be addressed and accounted for in the use of new technologies and data tools. This paper examines bias in resident-reported data by analyzing socio-spatial disparities in ‘311’ complaint behavior in Kansas City, Missouri. We utilize data from detailed 311 reports and a comprehensive resident satisfaction survey, and spatially join these data with code enforcement violations, neighborhood characteristics, and street condition assessments. We introduce a model to identify disparities in resident-government interactions and classify under- and over-reporting neighborhoods based on complaint behavior. Despite greater objective and subjective need, low-income and minority neighborhoods are less likely to report street condition or “nuisance” issues, while prioritizing more serious problems. Our findings form the basis for acknowledging and accounting for data bias in self-reported data, and contribute to the more equitable delivery of city services through bias-aware data-driven processes.
KW - 311
KW - Complaint reporting
KW - Data bias
KW - Machine learning
KW - Smart city governance
KW - Social equity
UR - http://www.scopus.com/inward/record.url?scp=85094180557&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85094180557&partnerID=8YFLogxK
U2 - 10.1016/j.scs.2020.102503
DO - 10.1016/j.scs.2020.102503
M3 - Article
AN - SCOPUS:85094180557
SN - 2210-6707
VL - 64
JO - Sustainable Cities and Society
JF - Sustainable Cities and Society
M1 - 102503
ER -