Who Is Responsible for Biased and Intrusive Algorithms?

Algorithms have become part of our everyday lives. Whether in jobs, loans, health care, traffic, or news feeds, algorithms make many decisions on our behalf. While they often make our lives more efficient, the same algorithms frequently violate our privacy and produce biased, discriminatory outcomes. In their book The Ethical Algorithm: The Science of Socially Aware Algorithm Design, Michael Kearns and Aaron Roth, professors at Penn Engineering, suggest that the solution is to embed precise definitions of fairness, accuracy, transparency, and ethics at the algorithm's design stage. They argue that algorithms have no moral character of their own; it is we who must learn to specify precisely what we want from them. In a conversation with Knowledge@Wharton, Kearns and Roth, who are, respectively, the founding co-director and a faculty affiliate of the Warren Center for Network and Data Sciences, discuss developments in the field.

Focus: AI Ethics/Policy
Source: Knowledge@Wharton
Readability: Intermediate
Type: Podcast
Open Source: No
Keywords: N/A
Learn Tags: Bias, Data Collection/Data Set, Design/Methods, Ethics, Fairness
Summary: While algorithms often make our lives more efficient, the same algorithms frequently violate our privacy and are biased and discriminatory. In the book _The Ethical Algorithm_, the authors suggest that the solution is to embed precise definitions of fairness, accuracy, transparency, and ethics at the algorithm's design stage.