Chara Bakalis - December 2017

  • Regulating Hate in the Digital Age


The concept of ‘hate crime’ will be familiar to many, particularly since the Brexit referendum, when the spike in hate crimes was widely reported in the press. However, it is becoming increasingly obvious that hate crimes are also being perpetrated over the internet. Precise figures are difficult to come by, not least because it is believed that only about 9% of online hate crimes are investigated. One survey suggests that as many as 68% of all hate crime incidents occur online. Similarly, TellMAMA (an organisation which measures anti-Muslim hate) found that two thirds of the Islamophobic incidents reported to it occurred online. The police certainly believe the scale of abuse to be incredibly high, and they have voiced concerns about their ability to control this behaviour. These statistics and developments have not gone unnoticed, and increasing pressure is being brought to bear on the government to do something about cyberhate.

    But why should any of this be of interest to those concerned with equality and diversity?

It is becoming increasingly clear that, whilst anyone can become a victim of cyberhate, certain groups attract more online hate than others. For instance, the cross-party think-tank Demos published a report showing that in a three-week period last year, 6,500 unique users were targeted by 10,000 explicitly aggressive and misogynistic tweets. A UN report found that 73% of women and girls have suffered some form of cyberviolence. The Guardian newspaper undertook its own research and discovered that of its ten most abused writers, eight were women, and the remaining two were black men, one of whom was gay. A survey by Galop suggests that 84% of LGBT+ people have experienced at least one incident of online abuse, and the UK Safer Internet Centre published a report finding that one in four children has suffered abuse online because of disability, race, sexual orientation, transgender identity or religion. It appears, therefore, that online hate is not distributed equally in society, and so this should be of concern to anyone interested in fostering equality and inclusion.

My own interest in this area stems from my perspective as a hate crime lawyer, and my contention is that all hate crime legislation is, at heart, about promoting equality and diversity in a multicultural society. I see hate crime legislation as running in parallel to civil equality laws such as the Equality Act 2010: if we are to ensure that all citizens are able to enjoy civil life equally, then the criminal law also needs to play its part through hate crime legislation.

    Viewing hate crime in this way has two important implications for those with a broad interest in equality and diversity.

Firstly, this view matters when it comes to deciding which characteristics should be included in hate crime legislation. Currently, only race, religion, sexual orientation, disability and transgender identity are covered (to varying degrees) by hate legislation. I argue, however, that all the other characteristics covered by equality legislation should be considered for inclusion.

This is particularly relevant to cyberhate, given that the figures cited above show the prevalence of misogynistic hate online. And yet gender is not currently one of the protected characteristics under hate crime legislation. This means that attacks on women are not recognised as ‘hate crimes’ by the law. As a result, they are less likely to be prosecuted, since the Crown Prosecution Service's guidelines prioritise the prosecution of hate crimes on social media and misogynistic abuse falls outside that category. Perpetrators are also likely to receive a more lenient sentence, as they do not come under the harsher penalties stipulated by hate crime laws.

Secondly, this should be of interest to those committed to equality and diversity because the harm caused by online hate against particular groups can affect their ability to enjoy the same civil rights as the rest of society. In her book Hate Crimes in Cyberspace, Danielle Keats Citron has shown how online hate can destroy relationships and have a devastating impact on careers. It can also have an adverse impact on individuals’ ability to exercise their free speech, as they are likely to encounter a greater level of hate when they express their opinions online or maintain an internet presence. In addition, Jeremy Waldron, a well-known legal philosopher, has argued that the harm in hate speech lies in the way it poisons the atmosphere and makes it harder for certain individuals to enjoy their civil rights. Waldron’s book focuses very much on offline hate speech, but the prevalence of hate speech online makes his arguments even more pertinent to the internet.

Our commitment to diversity needs to extend to the internet, given its increasing presence in our lives. We must treat it as we would any other area of life. Danielle Keats Citron likens it to sexism in the workplace: whilst there is obviously still much to be done in relation to workplace sexual harassment, we have come a long way in the last fifty years. This required a change in people’s perceptions of which behaviours would be tolerated in the workplace. We need a similar shift in our cultural understanding of what counts as acceptable online behaviour.

    A vital part of this will be the existence of laws that enable the police and prosecutors to target perpetrators of online hate. My own work has focussed on trying to create an appropriate legislative framework that properly captures the harm caused by online hate, but which is also workable from a policing perspective. My next project is to consider the role that third party intermediaries such as Facebook and Twitter should play in the regulation of online hate. There is still a long way to go, and the challenges are great, but this is an important issue that goes to the heart of our attempt to ensure that the internet, and particularly social media, functions to support rather than undermine the values of society.

    Chara Bakalis is a Principal Lecturer at Oxford Brookes University. Her research interests lie in the area of hate crime and cyberhate.