Procedural and Institutional Backing of Transparency in Algorithmic Processing of Rights

Vol. 13, No. 2 (2019)

Abstract

Efficient enforcement of legal substance requires proper procedures and capable institutions. In that respect, law is now challenged by the emergence of automated systems that autonomously decide matters concerning rights. The neuralgic point in enforcing the legal compliance of such systems, particularly with regard to possible discrimination, is transparency. Currently, at least in the EU, there exists a particular individual right to know the logic of the respective algorithms. This comment narrows down the issue of the actual enforceability of that right by examining its basic procedural and institutional aspects.


Keywords:
Algorithmic State; Automated Decisions; Logic of Algorithms; Transparency of Algorithms

Pages:
pp. 401–414
