Professor of Privacy and Information Law
Most of us use cloud computing without thinking twice. When you check your messages in Gmail or Outlook, attend a meeting on Teams or Zoom, or use social media like Instagram or Twitter, all of these activities are possible because cloud services are there in the background. Widespread dependence on computing ‘as-a-service’ raises questions about who can access our data, and what happens if a service goes down.
Queen Mary’s Cloud Legal Project has been influential in demystifying cloud service arrangements, and in ensuring that legal frameworks provide appropriate safeguards.
The Cloud Legal Project (CLP), launched in 2009 by Queen Mary Professors Millard, Walden, and Reed, was the first legal research project to focus specifically on cloud computing. Over the years, the CLP has received generous financial support from Microsoft and the European Commission, among others.
The COVID-19 pandemic sparked an acceleration in the adoption of cloud computing as billions of workers, students, and consumers switched to working, studying, shopping, and socialising online. The shift appears permanent, and it was only possible because of cloud technologies and services.
Unlike our own devices, such as smartphones, tablets, and laptops, the computer servers that make up ‘the cloud’ are usually located at remote data centres operated by major cloud service providers. Companies like Amazon, Microsoft, and Google offer cloud computing as a utility service. This is a major shift in how computer systems are used.
Cloud supply chains can be complex, with providers of important cloud services often relying on other cloud providers to deliver the services we use. For example, your Zoom meetings and recordings might be hosted by Amazon Web Services (AWS) or Oracle. Such ‘layered arrangements’ are extremely common, and there is often a lack of transparency as to where, and by whom, data are being processed. Indeed, most people who use Apple’s iCloud to store their photos probably don’t know that their data might actually be stored in facilities operated by other cloud providers such as AWS and Google Cloud.
In 2010 the CLP was the first research team to publish a large-scale analysis of the standard form cloud contracts which people typically ‘click to accept’ without reading. The research showed how one-sided such contracts could be, and that many contained provisions that appeared to be unfair, or even unlawful.
The team then conducted interviews with service providers, major cloud customers (such as banks and government agencies), and others involved in cloud deals to determine whether cloud providers were prepared to negotiate their contract terms. In 2012 the team revealed for the first time the extent to which major technology companies were in fact willing to work with customers to address concerns about safe and lawful use of cloud services.
The CLP’s comparative research into cloud contracts shaped and informed the EU Digital Content and Services Directive, a law that protects consumers from unfair cloud terms across the EU.
At a global level, the United Nations Commission on International Trade Law (UNCITRAL) asked the CLP to write a ‘scoping paper’ on cloud computing contracts which identified key issues for businesses and governments. Professor Walden was then invited to join an expert group drafting a checklist for cloud contracts. Both the scoping paper and the checklist were based on CLP research.
Artificial intelligence (AI) technologies are increasingly being used in automated decision-making systems, for example, to filter job applications and prioritise access to healthcare.
CLP research into AI and the cloud focused on the problem of AI decision-making substituting for human decisions, and how responsibility (and therefore liability) and transparency could be promoted. The team concluded that a general regulatory framework would be undesirable as it would stifle innovation. While existing liability laws can adequately govern low-risk AI decision-making, high-risk AI applications might need technology-specific regulation. Professor Reed presented this research to the UK House of Lords Select Committee on Artificial Intelligence, which followed the CLP recommendations in its report.
Law enforcement agencies face challenges in investigating and prosecuting cases that involve cloud data. For example, the material might be stored by a foreign cloud service provider and be out of reach of national law enforcement agencies.
Professor Walden was invited by the European Commission’s Directorate-General for Migration and Home Affairs to help draft legislative instruments to improve criminal investigations in cyberspace. His draft, which was based on key research findings by the CLP, was praised for providing “the first coherent picture of what a legislative instrument in this area could look like.”
Digital assets, such as social media accounts, emails, crypto-assets, and photographs, are an increasingly important part of people’s lives. But what happens to these assets when someone dies or is incapacitated? Are families able to access them? Professor Millard and CLP Researcher David Michels worked with the Society of Trust and Estate Practitioners to collect survey data from over 500 estate practitioners worldwide. The survey asked practitioners about their experiences with clients’ digital assets. Many had encountered difficulties in obtaining access to digital assets on death or incapacity. Challenges included a lack of clarity regarding property rights and a lack of cooperation from cloud service providers.
The Law Commission of England and Wales is currently reviewing the legal status of digital assets and the CLP has responded to a call for evidence with recommendations on key areas of law which require clarification.