
Selected Papers of #AoIR2020:

The 21st Annual Conference of the Association of Internet Researchers Virtual Event / 27-31 October 2020

Suggested Citation (APA): Burk, D. (2020, October). Algorithmic Legal Metrics. Paper presented at AoIR 2020: The 21st Annual Conference of the Association of Internet Researchers. Virtual Event: AoIR.

Retrieved from http://spir.aoir.org.

ALGORITHMIC LEGAL METRICS

Dan L. Burk

University of California, Irvine

Introduction

Automated pattern analysis and decision-making, colloquially designated as artificial intelligence or AI, is increasingly being deployed to mediate or to assist in social determinations across a range of domains including governance and regulatory decisions. (1) Predictive algorithms have been deployed to identify families at risk of abusive behavior, in order to mobilize social services intervention before actual harm occurs. Predictive algorithms have been relied upon to assess the threat of criminal recidivism, and so determine the allowance for bail or for prisoner parole. Predictive algorithms are being incorporated into policing strategies, allowing law enforcement resources to be positioned where criminal activity is anticipated to occur. And algorithmic predictions are becoming progressively arrayed across a broad swath of other legal and social decision-making: to allocate public assistance, to pre-empt customs and border violations, to determine immigration status, to forecast threats to national security. (2, 3)

Emerging proposals suggest an even greater role for algorithmically determined legal metrics. Specifically, recent scholarship has suggested that the collection of detailed information on consumers, together with algorithmic processing of such data, will allow for customized tailoring of legal imperatives to the capacity or characteristics of individual actors. (4, 5) This body of work argues that legal directives could be matched to detailed consumer profiles so as to create metrics that are personalized for the profiled individual, rather than uniform for the general populace. Proposals of this sort have been circulated for a variety of legal regimes, including contract, tort, trusts and estates, criminal law, and copyright.


Relying as they do on mechanisms of consumer surveillance, these proposals are effectively intended to translate the mass personalization of market services and institutions to the provision of legal services and institutions. (6) Although such proposals for personalized legal metrics carry a degree of superficial plausibility, on closer inspection it becomes clear that they entail a breathtaking degree of naivete regarding the social infrastructure on which such classifications depend. An increasingly robust sociological literature demonstrates that algorithmic scoring re-creates and re-enforces existing social orders, accelerating some of the most problematic mechanisms for exploitation and inequality. (7, 8) Such metrics not only amplify and reinforce existing social biases, but tend to produce detrimental self-surveillance. (9) Due to such effects, the quantified assessments supplied by algorithmic scoring are not neutral, but take on normative and moral connotations. (10) Legal determinations such as tort liability or criminal culpability that carry their own moral weight are likely to produce unintended consequences when associated with morally charged algorithmic metrics. A close examination of these mechanisms quickly illuminates disjunctions at the intersection among jurisprudence, automated technologies, and socially reflexive practices, and alerts us to areas of concern as legal institutions are increasingly amalgamated into the growing algorithmic assemblage.

Consequently, in this paper, I begin to map out the intersection between the social effects of quantification and the social construction of algorithms in the context of legal decision making. In previous work, I have explored the implications of attempting to incorporate legal standards into algorithms, arguing that the social action typical of algorithmic systems promises to shape and eventually become the legal standard it seeks to implement. Here I essentially consider the inverse proposition: I explore the effects of incorporating algorithms, which is to say algorithmic metrics, into legal standards. In particular, I examine the anticipated use of algorithmically processed Big Data in attempting to align legal incentives with social expectations.

Several previous commentators have been properly concerned about the biases endemic to data profiling, but I argue that algorithmic bias extends well beyond problems of prejudice or inaccuracy, to shape and define the social relationships and behavior experienced by its targets. The source of such distortions lies in reflexive social practices associated with algorithmic measurements, with which algorithmic processes interact in a broader structural context. (3, 11) These effects are accelerated and amplified by the speed and scale of automated data analysis and processing.

Algorithmic metrics are therefore performative, in the sense that they create their own social facts. (11, 12) Such effects are, perhaps paradoxically, heightened by transparency of algorithmic inputs and processes, leaving in doubt the advisability of some recent scholarly calls for greater transparency in algorithmic profiling.

I link these concepts to the normative functions of law, showing how legal judgments will be distorted by the introduction of algorithmic scoring regimes, particularly those being imported from datafied business models in the private sector. I describe how the social processes on which algorithmic metrics rest lead ultimately to the characterization of such metrics as moral character judgments. When inserted into legal determinations that intrinsically require moral character judgments, we may expect the value biases embedded in algorithmic legal metrics to effectively become legal judgments. (10) The precipitation of algorithmic metrics into legal culpability poses a particular problem for American legal discourse, due to the American legal system's long fascination with the utilitarian economic analysis of law.

In tracing the characteristic arc of algorithmic metrics from profiling through legal application, this paper makes several novel contributions to the literature on law and algorithmic governance. First, it details corrosive social effects of algorithmic legal metrics that extend far beyond the concerns about accuracy that have thus far dominated critiques of such metrics. Second, it demonstrates that traditional corrective governance mechanisms such as due process or transparency are inadequate to remedy such corrosive effects, and that some such remedies, such as transparency, may actually serve to exacerbate the worst effects of algorithmic governmentality.

Third, the paper shows that the application of algorithmic metrics to legal decisions aggravates the latent tension between equity and autonomy that is endemic in liberal institutions, undermining democratic values on a scale not previously experienced.

These findings make imperative the identification of the areas most perniciously affected by such systems, so as to curtail or entirely exclude automated decision-making from such decisions.

References

(1) Monika Zalnieriute, Lyria Bennett Moses, & George Williams, The Rule of Law and Automation of Government Decision-Making, 82 MODERN L. REV. 425 (2019).

(2) Lyria Bennett Moses, Artificial Intelligence in the Courts, Legal Academia, and Legal Practice, 91 AUSTRALIAN L.J. 561 (2017).

(3) Karen Yeung, Five Fears About Mass Predictive Personalization in an Age of Surveillance Capitalism, 8 INT'L DATA PRIV. L. 258 (2018).

(4) Anthony Casey & Anthony Niblett, Self-Driving Laws, 66 U. TORONTO L.J. 429 (2016).

(5) Ariel Porat & Lior J. Strahilevitz, Personalizing Default Rules and Disclosure with Big Data, 112 MICH. L. REV. 1417 (2014).

(6) Julie E. Cohen, The Biopolitical Public Domain: The Legal Construction of the Surveillance Economy, 31 PHIL. & TECH. 213 (2018).

(7) Sonia Katyal, Private Accountability in the Age of Artificial Intelligence, 66 UCLA L. REV. 54 (2019).

(8) Solon Barocas & Andrew D. Selbst, Big Data's Disparate Impact, 104 CAL. L. REV. 671 (2016).


(9) John Cheney-Lippold, A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control, 28 THEORY, CULT. & SOC'Y 164 (2011).

(10) Marion Fourcade, Ordinalization, 34 SOC. THEORY 175, 176-78 (2016).

(11) Lucas D. Introna, Algorithms, Governance, and Governmentality: On Governing Academic Writing, 41 SCI. TECH. & HUM. VALUES 17, 29-30 (2016).

(12) Adrian MacKenzie, The Production of Prediction: What Does Machine Learning Want?, 18 EURO. J. CULT. STUD. 429 (2015).
