Posted on February 14, 2014

Study Finds 10,000 Slurs a Day Posted on Twitter

Mark Prigg, Daily Mail (London), February 13, 2014

10,000 racist tweets are sent every day, according to a major new study of racism on social networks.

Research by the think tank Demos found the most commonly used slur was ‘white boy’.

Researchers analysed 126,975 English-language tweets from across the globe over a 9-day period in the biggest ever study of Twitter racism.

Researchers revealed the most common racial slurs used on the micro-blogging site included ‘whitey’ and ‘pikey’.

However, as many as 70% of tweets using such language were deemed to be using slurs in non-derogatory fashion.

‘This sparks the debate about the extent to which Twitter truly is a platform for racism and abusive language,’ the report claims.

Jamie Bartlett, Director of CASM at Demos and author of the report, said: ‘Twitter provides us with a remarkable window into how people talk, argue, debate, and discuss issues of the day.

‘While there are a lot of racial slurs being used on Twitter, the overwhelming majority of them are not used in an obviously prejudicial or hateful way.

‘This study shows just how difficult it is to know what people really mean on the basis of a tweet.

‘Context is king, and often it’s more or less lost on Twitter.’

Of the 126,975 English-language tweets from across the globe, further analysis suggests only 1% of tweets used a racial slur in an ideological context within a political statement or in a call to action in the real world.

Analysis found that as few as 500 tweets a day were directed at an individual and appeared on first sight to be abusive.

‘There were very few cases that presented an imminent threat of violence, or where individuals directly or indirectly incited offline violent action,’ the report’s authors said.

‘We estimate that, at the very most, fewer than 100 tweets are sent each day which might be interpreted as threatening any kind of violence or offline action.’

A Twitter spokesperson told MailOnline: ‘It’s important to know that Twitter does not screen content or remove potentially offensive content.

‘Our Twitter Rules outline content boundaries on the platform.

‘Targeted abuse and specific threats of violence are violations of our rules, and users can report this type of content from within the Twitter application or at this link on our website.

‘Twitter is a social broadcast network that enables people and organizations to publicly share brief messages instantly around the world. This brings a variety of users with different voices, ideas and perspectives.

‘Users are allowed to post content, including potentially inflammatory content, as long as they’re not violating the Twitter Rules.’

The Anti-Social Media report estimates that between 50 and 70 per cent of tweets used ‘re-claimed’ slurs to express in-group solidarity within ethnic groups.

It cites ‘P**i’ as one term being appropriated by users identifying themselves as of Pakistani descent.

Last year also saw much debate over use of the term ‘Yid army’ by supporters of Tottenham Hotspur, a British football club with a strong historical connection to London’s Jewish community, to describe themselves.

In December, UK Labour MP Jack Dromey also caused uproar by referring to his postman as ‘Pikey’ in a tweet.

Dromey responded to criticism by explaining that the nickname derived from Corporal Pike, a character in the TV show Dad’s Army, an example of how a term widely deemed offensive can be intended non-offensively.