Article Summary

Facebook civil rights audit urges ‘mandatory’ algorithmic bias detection

July 8, 2020

An independent audit of Facebook's use of artificial intelligence found a dangerous lack of controls and limited reach across the teams that rely on AI. Civil rights lawyers Laura Murphy and Megan Cacace, working with the firm Relman Colfax, determined that the social media giant's attention to algorithmic bias is laudable but far too nascent for an organization with so much influence over people's lives and livelihoods.

In the Auditor Observations section of the published report, the authors criticized the limited reach of Facebook's Fairness Flow and Responsible AI initiatives, arguing that participation should be mandatory rather than voluntary, as it is today. Perhaps more importantly, they noted the difficulty they faced in assessing the effectiveness of these programs, since "the Auditors have not had full access to the full details of these programs." Providing visibility into these systems for non-technical users is a prerequisite for a proper audit function for ML and AI systems.

Ethics & Responsibility