Rethinking Explainable Machines: The GDPR’s “Right to Explanation” Debate and the Rise of Algorithmic Audits in Enterprise
Saved in:
Published in: Berkeley Technology Law Journal, 2019-04, Vol. 34 (1), pp. 143-188
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: The public debate surrounding the General Data Protection Regulation's (GDPR) "right to explanation" has sparked a global conversation of profound social and economic significance. But from a practical perspective, the debate's participants have gotten ahead of themselves. In their search for a revolutionary new data protection within the provisions of a single chapter of the GDPR, many prominent contributors to the debate have lost sight of the most revolutionary change ushered in by the Regulation: the sweeping new enforcement powers given to European data protection authorities (DPAs) by Chapters 6 and 8 of the Regulation. Unlike the 1995 Data Protection Directive that it replaced, the GDPR grants DPAs potent new investigatory, advisory, corrective, and punitive powers that render them de facto interpretive authorities of the Regulation's controversial "right to explanation." Now that the DPAs responsible for enforcing the right have officially weighed in, this Article argues that at least one matter of fierce public debate can be laid to rest. The GDPR provides a muscular "right to explanation" with sweeping legal implications for the design, prototyping, field testing, and deployment of automated data processing systems. The protections enshrined within the right may not mandate transparency in the form of a complete individualized explanation. But a holistic understanding of the DPAs' interpretation reveals that the right's true power derives from its synergistic effects when combined with the algorithmic auditing and "data protection by design" methodologies codified by the Regulation's subsequent chapters. Accordingly, this Article predicts that algorithmic auditing and "data protection by design" practices will likely become the new gold standard for enterprises deploying machine learning systems both inside and outside of the European Union.
ISSN: 1086-3818, 2380-4742
DOI: 10.15779/Z38M32N986