Understanding of forensic science is poor in UK criminal justice system

The level of understanding of forensic science among lawyers, judges, and juries is poor, according to evidence submitted to parliament by a group of researchers from Queen Mary University of London.

The researchers suggest that forensic science is contributing to injustices because of misunderstandings about matching trace evidence to a particular person.

The group has submitted evidence to a House of Lords Science and Technology Committee inquiry into forensic science.

The inquiry was set up to explore the role of forensic science within the UK Criminal Justice System in light of concerns over the weaknesses of current forensic methods in the delivery of justice. 

The researchers involved include Professor Norman Fenton (School of Electronic Engineering and Computer Science), Dr Primoz Skraba (School of Mathematical Sciences), Amber Marks (School of Law), and Dr Ian Walden (Centre for Commercial Law Studies).

Errors can and do occur

When asked about the level of understanding of forensic science within the criminal justice system amongst lawyers, judges and juries, Professor Fenton believes that there needs to be much greater awareness that all evidence is subject to potential errors.

He noted: “Errors can and do occur at every level of evidence evaluation: sampling, measurement, interpretation of results, and presentation of findings. Forensic scientists should articulate, and attempt to quantify, all such possible sources of error. And legal professionals should understand and expect this information, and probe for possible sources of uncertainty when it is not presented by the experts.”

A match is not an identification

Professor Fenton also believes that injustices are occurring widely because of misunderstandings about the probative value of forensic match evidence.

He advised: “Because many forensic traces from crime scenes are only ‘partial’ and may be subject to various types of contamination, the resulting ‘profile’ is not sufficient to ‘identify’ the person; many people would have a partial profile that matches.

“I have been involved in cases where such assertions have a dramatic impact on the judge and the jury, while even defence lawyers assume their case is impossible to defend. But to interpret this as ‘proof’ that the defendant must have been at the crime scene may be to grossly exaggerate the probative value of the evidence in favour of the prosecution case.”

Furthermore, Fenton argues that the meaning of the word “match” in the context of forensic evidence needs to be re-evaluated. Although “a match” between two pieces of evidence is commonly understood to mean that they come from the same source, in practice two pieces of evidence are branded “a match” whenever their measured characteristics are the same (within an agreed tolerance).
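The point that “many people would have a partial profile that matches” can be made concrete with a small calculation. The figures below are purely hypothetical, chosen for illustration, not taken from any case or from the submission:

```python
# Illustrative sketch (hypothetical numbers): why a "match" is not an
# "identification". If a partial profile has a random match probability
# of 1 in 10,000, many unrelated people in a large population will match.

population = 1_000_000           # hypothetical pool of possible sources
random_match_prob = 1 / 10_000   # hypothetical chance that an unrelated
                                 # person matches the partial profile

# Expected number of people in the pool who match by chance alone
expected_matches = population * random_match_prob
print(expected_matches)  # 100.0

# With no other evidence (a uniform prior over the pool), the chance
# that one particular matching person is the true source is roughly
# 1 in 100, not a near-certainty.
prob_source_given_match = 1 / (1 + (population - 1) * random_match_prob)
print(round(prob_source_given_match, 4))  # 0.0099
```

On these assumed numbers, around 100 people would match the partial profile, so the match alone makes any one of them only about 1% likely to be the source; this is why the probative value of match evidence depends on the match probability and the size of the candidate pool, not on the word “match” itself.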

Lawyers and judiciary need training

The committee was also advised that lawyers and the judiciary should receive basic training in probability and statistics because the current training available is ‘suboptimal’.

This would enable them to understand the statistical analyses presented, to identify any weaknesses in the analyses presented, and to avoid common fallacies such as the prosecutor’s fallacy. In forensic investigations “there is virtually always some degree of uncertainty,” Professor Fenton added.
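The prosecutor’s fallacy mentioned above is the error of reading the probability of the evidence given innocence as if it were the probability of innocence given the evidence. A minimal sketch with Bayes’ theorem shows how far apart the two can be; the numbers are hypothetical, chosen only to illustrate the inversion:

```python
# Illustrative sketch (hypothetical numbers) of the prosecutor's fallacy:
# confusing P(match | innocent) with P(innocent | match).

p_match_given_innocent = 1 / 10_000  # hypothetical random match probability
p_match_given_guilty = 1.0           # the true source certainly matches
population = 1_000_000               # hypothetical pool of possible sources

# Prior: before the match evidence, each person in the pool is equally
# likely to be the source.
p_guilty = 1 / population
p_innocent = 1 - p_guilty

# Bayes' theorem: P(guilty | match)
p_match = (p_match_given_guilty * p_guilty
           + p_match_given_innocent * p_innocent)
p_guilty_given_match = p_match_given_guilty * p_guilty / p_match

print(f"P(match | innocent) = {p_match_given_innocent:.4f}")    # 0.0001
print(f"P(innocent | match) = {1 - p_guilty_given_match:.2f}")  # 0.99
```

Here the chance of an innocent person matching is 1 in 10,000, yet the chance that a matching person is innocent is about 99%; presenting the first figure as if it were the second grossly exaggerates the strength of the evidence, which is exactly the confusion basic training in probability is meant to prevent.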

Worrying use of data

Elsewhere in the submission, Dr Primoz Skraba highlights an emerging concern: the increasing use of demographic and personal data by companies to identify individuals, techniques that are likely to be used in forensic science in the future.

According to Skraba: “While a company’s misidentification may result in a misplaced advertisement, the consequences in forensic science may be more severe.”

Nor is this limited to private companies; forensic technologies are already in use by agencies such as the Metropolitan Police, whose Gangs Matrix has raised concerns about the legitimacy of using predictive tools in criminal justice.

Amber Marks notes: “Risk scores generated by police algorithms are shared with multiple agencies and this results in often stigmatic and punitive repercussions for the individual involved, including in policing, educational and medical settings, decisions on benefits and housing entitlements and deportation proceedings, while obviating the procedural safeguards of the criminal trial.”

Dr Skraba also highlighted the gaps that currently exist in the understanding of forensic statistics within the justice system, where conclusions must be drawn from individual measurements rather than repeated experiments.

More information:

  • The full submission can be read on the House of Lords website
  • Professor Fenton and Dr Skraba were both recently appointed Turing Fellows at the Alan Turing Institute. The Alan Turing Institute is the UK’s national institute for data science while Turing Fellows are senior academics who will spend a portion of their time at the Institute pursuing research in data science and artificial intelligence.
  • Find out more about studying Law or Mathematics and Statistics at Queen Mary University of London