Predicting Online Radicalisation

By Dr Laura G. E. Smith (University of Bath), Dr Timothy Cribbin (Brunel University London) and Prof Julie Barnett (University of Bath) with Defence Science & Technology Laboratory (dstl)

Supporters of so-called Islamic State (IS) use social media to communicate radical Islamist messages and recruit people to ‘the Caliphate’. Currently, there are an estimated 46,000 active ‘unofficial’ Twitter accounts used by IS supporters. One in five of these accounts uses English as its primary language, and each account has an average of 1,000 followers. This use of English-language online communications by supporters of IS presents two unprecedented research opportunities: first, to predict the spread of radical Islamism in English-speaking populations; and second, to understand how social media communications shape people’s perceptions of IS, and of the conflict(s) in Syria and the Levant.

The central aim of this project is to develop a conceptually grounded algorithm that can be combined with social media analytics software to predict the radicalisation of mainstream users. Our central research question is: how and why do people develop identification with radical Islam online over time?

Our challenge in addressing these issues is that there are currently no big data analytics systems that can enable three necessary and significant sets of analyses:

  • Longitudinal analysis (meaningful change in an individual’s posts over time)
  • Qualitative analysis of the narrative content of a large volume of posts
  • Prediction of the emergence of new online psychological groups or the expansion of existing groups.

The absence of this functionality means that current systems cannot adequately analyse the evolving debates, conversations and opinions in big (social media) data, and so cannot make these data amenable to our purpose: understanding how and why radicalisation can occur online.
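To make the first of those analyses concrete, the sketch below shows one way a longitudinal, per-user signal might be derived from post text. The choice of first-person-plural pronouns as a crude proxy for group identification, the weekly aggregation, and the shift heuristic are all illustrative assumptions for this sketch, not the project's validated measures.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical linguistic marker: first-person-plural pronouns as a crude
# proxy for group identification (an illustrative assumption only).
WE_WORDS = {"we", "us", "our", "ours"}

def we_ratio(text):
    """Fraction of tokens that are first-person-plural pronouns."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    return sum(t.strip(".,!?") in WE_WORDS for t in tokens) / len(tokens)

def weekly_series(posts):
    """Aggregate (timestamp, text) posts into a per-week mean marker score."""
    buckets = defaultdict(list)
    for ts, text in posts:
        week = ts - timedelta(days=ts.weekday())  # Monday of that week
        buckets[week.date()].append(we_ratio(text))
    return {week: sum(v) / len(v) for week, v in sorted(buckets.items())}

def shift_detected(series, threshold=0.1):
    """Flag a sustained rise: the later half of the series exceeds the
    earlier half by `threshold` (a simple heuristic, not a statistical test)."""
    values = list(series.values())
    k = max(1, len(values) // 2)
    early, late = values[:k], values[k:]
    if not late:
        return False
    early_mean = sum(early) / len(early)
    late_mean = sum(late) / len(late)
    return late_mean - early_mean > threshold
```

In the project itself, such a placeholder marker would be replaced by conceptually grounded variables from the psychological framework, and the heuristic by a proper statistical test of change over time.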

To address this challenge, we will develop a software tool, with help from stakeholders at the Defence Science and Technology Laboratory (dstl), that is technically compatible with the IT systems of government agencies and can ethically harvest and rigorously analyse large volumes of publicly available social media data to provide conceptually informed warning of security threats.

To do this, we will build on the capabilities of an existing software tool, Chorus, and integrate it with a new conceptual framework from psychology to explore the usefulness of a novel analysis of longitudinal qualitative big data. By doing so, we aim to identify novel variables derived from online language that can explain a significant amount of variance in the development of extremism. The project will thus provide initial tests of the new software capabilities, and proof of concept for a ‘radicalisation algorithm’.
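As a toy illustration of what "explaining variance" means in this proof-of-concept setting, the sketch below fits an ordinary least squares line predicting an outcome score from a single language-derived variable and reports R², the proportion of variance explained. The variable, scores, and model form are placeholders; the project's actual measures and models are to be determined by the research.

```python
def fit_r_squared(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b, R^2).
    Assumes the x values are not all identical."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                       # slope
    a = my - b * mx                     # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot    # R^2 = 1 - residual / total variance
```

A single-predictor linear model is of course far simpler than what a radicalisation algorithm would require; it serves only to make the variance-explained criterion concrete.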

The research will impact on UK efforts to counter extremism by developing the Chorus software and radicalisation algorithm in collaboration with dstl, and by engaging government agencies in discussions about the tools. It will also benefit strategic communicators by demonstrating practical ways to communicate with and target audiences so as to maximise the effectiveness of risk-mitigating messages. These contributions can in turn be expected to benefit the general public in the longer term through greater and more effective public engagement with the risks associated with online socialisation. Improved understanding of online socialisation and radicalisation will help social media users navigate the psychological online environment more safely. Finally, we expect our project to have significant capacity-building benefits in the new and emerging field of social media / big data analytics.

‘Predicting Online Radicalisation’ is a project by Laura Smith (PI), Timothy Cribbin (Co-I), and Julie Barnett (Co-I).

Dr Laura G. E. Smith is a social and organisational psychologist, and a Lecturer in Social Psychology at the University of Bath. She has particular interest and expertise in socialisation and radicalisation processes, including the development of identification and trust, and the implications for behaviour. Her work involves social media data analysis, longitudinal fieldwork with industry partners and laboratory experiments. She has received funding for her research from the Economic and Social Research Council (ESRC), Arts & Humanities Research Council (AHRC), Engineering & Physical Sciences Research Council (EPSRC), the Leverhulme Trust, and the Australian Research Council (ARC).

Dr Timothy Cribbin is an information scientist with interests and expertise in information visualisation, text analytics and search user interfaces, with a particular interest in the application of visual analytics to social (and more broadly academic) research. He has been a lecturer in the Department of Information Systems and Computing since 2001 and has taught broadly across subjects including information visualisation, human-computer interaction and programming. He has published 26 peer-reviewed articles in the areas of information visualisation, visual analytics and human-computer interaction, and regularly reviews for journals such as Information Visualization, JASIST and Information Processing and Management.

Prof Julie Barnett is a social and health psychologist with particular interest and expertise in risk appreciation, the development of health technologies, the maintenance and change of behaviour, new forms of data, public engagement processes and policy development. Over the last ten years, she has been part of a range of largely interdisciplinary projects funded by the EPSRC, ESRC, the European Union, the Department of Health, the Health and Safety Executive, the Ministry of Defence, the Environment Agency, the Food Standards Agency, and the Wellcome Trust.