Fair Decision Making Using Privacy-Protected Data
Data collected about individuals is regularly used to make decisions that impact those same individuals. We consider settings where sensitive personal data is used to decide who will receive resources or benefits. While it is well known that there is a trade-off between protecting privacy and the accuracy of decisions, we initiate a first-of-its-kind study into the impact of formally private mechanisms (based on differential privacy) on fair and equitable decision-making. We empirically investigate novel trade-offs on two real-world decisions made using U.S. Census data (allocation of federal funds and assignment of voting rights benefits) as well as a classic apportionment problem. Our results show that if decisions are made using an ε-differentially private version of the data, under strict privacy constraints (smaller ε), the noise added to achieve privacy may disproportionately impact some groups over others. We propose novel measures of fairness in the context of randomized differentially private algorithms and identify a range of causes of outcome disparities. We also explore improved algorithms to remedy the unfairness observed.
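The disparity the abstract describes can be illustrated with a minimal simulation. The sketch below is not the paper's method: it applies the standard Laplace mechanism to a population count and then makes a hypothetical threshold-based benefit decision (loosely modeled on population-threshold rules such as voting-rights determinations). The threshold value and group sizes are illustrative assumptions; the point is that groups near the cutoff suffer far more decision errors than large groups, and stricter privacy (smaller ε) widens that gap.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_count(true_count, epsilon):
    """Release a count with epsilon-DP via the Laplace mechanism
    (sensitivity 1 for a single counting query)."""
    return true_count + rng.laplace(scale=1.0 / epsilon)

# Hypothetical cutoff: a group receives the benefit only if its
# (noisy) count meets this threshold. The value is illustrative.
THRESHOLD = 10_000

def flip_rate(true_count, epsilon, trials=20_000):
    """Fraction of trials in which the decision on the noisy count
    differs from the decision on the true count."""
    true_decision = true_count >= THRESHOLD
    flips = sum(
        (laplace_count(true_count, epsilon) >= THRESHOLD) != true_decision
        for _ in range(trials)
    )
    return flips / trials

# A group just above the cutoff vs. a comfortably large group,
# under strict (eps = 0.01) and loose (eps = 1.0) privacy.
for count in (10_050, 50_000):
    for eps in (0.01, 1.0):
        print(f"count={count:>6} eps={eps:<5} flip rate={flip_rate(count, eps):.3f}")
```

Under ε = 0.01 the Laplace noise has scale 100, so the group 50 people above the cutoff loses its benefit in roughly 30% of releases, while the large group is essentially never affected; at ε = 1.0 both groups are decided correctly almost always. This is one concrete mechanism by which uniform noise produces non-uniform harm.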