
A conversation with Daragh Murray

Daragh Murray is a new Institute of Humanities and Social Sciences Fellow at Queen Mary. He talks to us about himself and his research.

Could you tell us a bit about yourself and your academic background?

I have a bit of a complicated academic background. I started as a computer scientist and did a Master's in Computer Security and Forensics before eventually transitioning to human rights law, completing an LLM at the Irish Centre for Human Rights and a PhD at the University of Essex Human Rights Centre. I have a particular interest in understanding how human rights law applies in the real world, and how it can be used to effect change. My initial research focused on the relationship between human rights law and the law of armed conflict, and was aimed at understanding how human rights apply on the battlefield and at developing human rights standards applicable to non-State armed groups. Since then I have returned – tentatively – to computer science, and I now work on understanding the human rights impacts of new technologies and AI, and how human rights law can be used to inform, or regulate, the design, development, and deployment of new technologies. A few years ago I co-authored an independent report on the Metropolitan Police’s use of live facial recognition technology.

I also have an interest in using new technologies to investigate potential human rights violations. In 2016 I set up a Digital Verification Unit at Essex, where we train students to conduct investigations using open source information, such as user-generated videos and photos, satellite imagery, and posts on social media. The Unit has partnered with Amnesty International and numerous UN Commissions of Inquiry, and in 2019 we won the Times Higher Education Award for Best International Collaboration for our work with Amnesty and other universities investigating the coalition attacks on Raqqa.

What are you planning to work on in the next few years? How does this relate to your past work?

I was incredibly lucky to have been awarded a UKRI Future Leaders Fellowship to research ‘What does Artificial Intelligence Mean for the Future of Democratic Society?’. This interdisciplinary project runs until 2026 and is centred around two key questions. First, how can human rights law effectively inform the regulation of new technologies, particularly in the pre-deployment phase? Second, and maybe most interestingly, what will the impact of AI be on individuals and society? Specifically, if we are subject to pervasive surveillance, and if the products of this surveillance are used to make consequential decisions about us (whether we are stopped by the police, whether we are suitable for a mortgage, whether we qualify for medical treatment…), will we change our behaviour? This change in behaviour is referred to as a ‘chilling effect’, and it can affect both the process by which individuals develop their identity and personality, and democratic participation. The really interesting – and worrying – element is that previously only small groups of people were subject to surveillance; this is the first time in human history that all of society can be surveilled, all the time. These two questions are interconnected: we cannot effectively regulate a technology if we don’t understand its potential utility and its potential harm.

The project combines human rights law, sociology and philosophy. I will be recruiting some post-doctoral researchers soon, and have just completed over 150 interviews with individuals subject to traditional surveillance – politicians, journalists, activists, sex workers – to understand what impact surveillance has had on their lives and their ability to organise. I hope to develop these interviews and to use them as a base for understanding the potential impacts of pervasive digital surveillance. At the moment I focus primarily on law enforcement, military and intelligence agency uses of new technologies, but I hope to begin working on corporate uses in the next few years.
