When algorithms discriminate


The US criminal justice system prohibits discrimination. Yet the very same system discriminates on a grand scale through its use of algorithms in sentencing. In Denmark, we also see examples of statistical discrimination, says Professor Kasper Lippert-Rasmussen, who heads a new basic research centre at Aarhus BSS, Aarhus University, devoted to the subject of discrimination.

21.08.2018 | INGRID FOSSUM

In May, the video of a white American police officer brutally pinning the African American George Floyd to the ground, pressing a knee into his neck as Floyd gasped “I can’t breathe”, spread like wildfire on social media. The incident put the human rights movement Black Lives Matter and its struggle for black people’s rights on the international map once and for all.

The US has a long history of racism and discrimination going back to the time of slavery. For years, cases of police violence and brutality with seemingly racist motives have regularly appeared in the media. However, another kind of discrimination also exists: one inherent in the structures of society and in people’s unconscious values, habits and behavioural patterns.

“This doesn’t necessarily mean that people have a negative attitude towards African Americans, but simply that our institutions and practices are structured in a way that favours white people at the expense of blacks.” These are the words of Kasper Lippert-Rasmussen, who heads the new basic research centre exploring the topic of discrimination based at Aarhus BSS, Aarhus University.

For years, he has studied discrimination and has published a number of scientific articles on statistical discrimination in recognised journals. His latest publication is a contribution to the book “Principled Sentencing and Artificial Intelligence”, edited by Jesper Ryberg and Julian Roberts and soon to be published by Oxford University Press.

“Statistical discrimination may also undermine our social status as equals and our right to be treated as individuals.”

Kasper Lippert-Rasmussen, Professor, Department of Political Science, Aarhus BSS

Algorithms determine sentences

The American criminal justice system is an example of how this systemic statistical discrimination occurs. Typically, sentences are determined based on how dangerous the defendant is perceived to be and how likely he or she is to reoffend. Previously, however, judges tended to discriminate unconsciously and would thus impose tougher sentences on black people than on whites. To prevent this kind of bias, computer algorithms were introduced and are now used to assess a defendant’s likelihood of reoffending. The reasoning is that algorithms are more neutral than human assessment.

The algorithms are fed with data on the defendant’s place of residence, level of education, level of income, parental criminal history, employment situation and previous convictions, if any. Feeding the algorithms data on race is not allowed. However, as the other variables are already strongly associated with race, race will still affect the sentencing. The reason is that African Americans are statistically more likely than white Americans to be unemployed and to have previous convictions, and that these two factors are key when assessing the risk of recidivism (i.e. the tendency of a convicted criminal to reoffend).
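To make the proxy effect concrete, here is a minimal, hypothetical sketch. The function, weights and defendants are invented for illustration and do not describe any real sentencing tool, whose features and weights are not public. It shows how a model that never receives race as an input can still produce race-correlated risk scores through correlated proxy variables.

```python
# Hypothetical illustration only: invented weights and defendants, not any
# real sentencing tool. A model with no "race" input can still produce
# race-correlated risk scores via correlated proxy variables.

def recidivism_risk(prior_convictions: int, unemployed: bool,
                    years_of_education: int) -> float:
    """Toy linear risk score clamped to [0, 1]; all weights are made up."""
    score = (0.15 * prior_convictions
             + 0.25 * (1 if unemployed else 0)
             - 0.03 * years_of_education
             + 0.40)
    return max(0.0, min(1.0, score))

# Two defendants who committed the same crime. Race is never passed to the
# model, but because the proxy variables are statistically associated with
# race in the US, group-level score gaps reappear.
defendant_a = recidivism_risk(prior_convictions=1, unemployed=False,
                              years_of_education=14)
defendant_b = recidivism_risk(prior_convictions=2, unemployed=True,
                              years_of_education=11)

print(f"Defendant A risk: {defendant_a:.2f}")  # 0.13, lower score
print(f"Defendant B risk: {defendant_b:.2f}")  # 0.62, higher score, same crime
```

Because the inputs differ in ways that track group membership, the two scores diverge even though race was never an input. This is the mechanism described above: the discrimination enters through the data, not through any explicit racial variable.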

“The algorithms are very accurate when assessing the risk of recidivism. However, according to the algorithms, blacks are still more likely to reoffend. This means that if a white and a black person have committed the same crime, the algorithms will assess the black person as more dangerous, and the court will thus impose a tougher sentence on that person. Is this fair? This has been a much-debated question, and it is also one of the many points of criticism raised by the Black Lives Matter movement on behalf of the black population,” says Kasper Lippert-Rasmussen.

Unfair consequences for blacks

For years, American civil rights movements such as Black Lives Matter have wanted to break with what they see as white supremacy, racist violence and discrimination against blacks. Discrimination against black people in the American criminal justice system is unintentional and is thus defined as indirect discrimination. However, if you are aware that the use of algorithms has unfair consequences for black defendants and you still support its use, we might be dealing with direct discrimination, says Lippert-Rasmussen.

“Algorithms are associated with stigmatisation and inferiority. And they do not do anything good for African Americans, who already see themselves as profiled by the police and as second-rate citizens. You are unlucky if you are placed in the social group to which the algorithms believe you belong, even though you as an individual might not be dangerous and are unlikely to reoffend. In this way, you are treated as a statistic rather than as an individual,” says Lippert-Rasmussen.

On the other hand, algorithms are more precise than the psychological interviews and tests previously used to assess criminals. Algorithms are also a cheaper solution, and one that actually protects citizens against crime.

“In the US, a lot of crime takes place between people belonging to the same group. In other words, blacks will commit crimes against other blacks and whites against other whites. In this way, imposing long sentences on black criminals might actually benefit other black people in society, as it keeps offenders away from the community. This incapacitation effect matters all the more because there are many indications that punishment actually increases crime rather than reduces it,” says Lippert-Rasmussen.

Undermining the individual

In the short term, the use of algorithms will thus have a positive effect on crime rates, but in the long term it will have a detrimental effect on black/white relations. Many also believe it is unfair that the risk of recidivism weighs so heavily relative to the crime itself.

"It's very difficult to imagine a world without statistical discrimination. Regardless of whether or not it is objectionable,” says Lippert-Rasmussen.

For example, it is probably not a problem that the police spend more resources on preventing violent crime among young men than among elderly women. However, it is more problematic if employers prefer male applicants for the sole reason that, statistically, women take longer parental leave. It would also be unthinkable for insurance businesses to refrain from developing risk profiles for their customers. These profiles are likewise based on statistics and result in younger men paying higher premiums on their car insurance and elderly people on their life insurance.
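The insurance case follows the same statistical logic. The sketch below uses invented figures and group labels, not any real insurer’s pricing model, to show how a premium set purely from group accident statistics prices a person as a group member rather than as an individual.

```python
# Hypothetical sketch with invented numbers: how an insurer might price car
# insurance from group statistics rather than individual driving behaviour.

BASE_PREMIUM = 500.0  # annual premium for the reference group, in euros

# Relative accident rates by group (made-up figures for illustration).
RISK_FACTOR = {
    ("male", "18-25"): 1.8,   # young men: statistically more accidents
    ("female", "18-25"): 1.4,
    ("male", "26-65"): 1.0,
    ("female", "26-65"): 0.9,
}

def car_premium(sex: str, age_band: str) -> float:
    """Premium determined purely by group membership, not individual record."""
    return BASE_PREMIUM * RISK_FACTOR[(sex, age_band)]

# A careful 20-year-old man pays the young-male rate regardless of how he
# actually drives: he is priced as a statistic, not as an individual.
print(car_premium("male", "18-25"))    # 900.0
print(car_premium("female", "26-65"))  # 450.0
```

This is precisely the tension the article describes: the group-level statistics may be accurate, yet the individual is judged by the group’s average rather than by his or her own behaviour.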

“However, statistical discrimination may also undermine our social status as equals and our right to be treated as individuals,” Lippert-Rasmussen points out.
