The Edmonton Police Service issued a press release declaring that, following a lengthy three-year sexual assault investigation “where no witnesses, CCTV, public tips or DNA matches were found,” it had enlisted the services of the Virginia-based Parabon NanoLabs.
According to its website, Parabon is a nanotech company specializing in the development of a “new class” of “advanced forensic tools” for the “DNA Age.” Parabon provided the Edmonton Police Service with its Snapshot® DNA Phenotyping Service, a process that uses unidentified DNA evidence to predict a person’s ancestry and physical appearance.
Serious doubts have been raised about the scientific validity of Parabon’s Snapshot® and other services, which have been virtually impossible to fully evaluate due to a lack of transparency. One leading geneticist has gone as far as to describe Parabon’s DNA phenotyping as “dangerous snake oil,” and both Nature and the MIT Technology Review have raised major concerns about the ethics and efficacy of Parabon’s technologies.
Facing a wave of public backlash to its decision to hire Parabon, the Edmonton Police Service not only deleted its “ratioed” viral tweet (archived here), but also took down the image of Parabon’s genetic profile, acknowledged that the harm the profile could have caused to Edmonton’s Black community was not “adequately considered,” apologized, and promised to review its internal processes to ensure better tools are used.
We applaud how the Edmonton Police Service handled the situation. Apologies, let alone self-critical reflection and reform, are rare in policing and should be commended. However, what happened is a symptom of a much larger problem. In an attempt to keep up with high-tech innovation, police worldwide are increasingly buying, with little oversight, controversial technologies that they are ill-equipped to evaluate.
In our view, even if police services could properly evaluate sophisticated tools such as DNA phenotyping, there are two key reasons why they should not be the ones authorized to do so.
The first is that this is new technology, the impacts of which are poorly understood and frequently harmful. Parabon’s system uses novel techniques that go beyond the much more established and tested science of DNA matching, and involves a significant predictive machine learning component.
Similar technologies, frequently collected under the umbrella term of ‘artificial intelligence’ or ‘AI,’ are at the heart of a broader, ongoing debate about AI, biometrics, and DNA in the criminal justice system and society more broadly.
Big claims are regularly made about how technology can strengthen scientific decision-making and objectivity, as one finds on Parabon’s own website. Yet the evidence is now undeniable that these kinds of AI-driven technologies are rarely neutral or objective as promised, are regularly misused, and often lead to highly discriminatory outcomes. And once implemented, the damage they can do can be deep and lasting, particularly to the most vulnerable in our society.
In the case of artificial intelligence tools, the need for governance is slowly being recognized. A number of different governance tools are being explored, including AI policies to promote better oversight, and AI registers to clearly list all such technologies that have been implemented.
These steps are just the beginning of a much longer process towards developing rigorous democratic governance over these technologies. In Canada, for example, Bill C-27 was introduced with a significant section on AI regulation, yet it explicitly exempts law enforcement from these crucial requirements.
Far greater democratic governance of policing technologies is needed, and until such oversight processes exist for DNA as well as AI technologies, no police service should be using them. We cannot let a situation like Clearview AI happen with DNA data.
The second reason is that, much like with facial recognition technology, we should also be asking questions about the kind of future society we want to live in. Do we want to live in a society that invests in controversial high-tech police surveillance, especially when police services stand accused of not doing enough to tackle systemic racism? Or would the money be better spent elsewhere, such as in alternatives to policing altogether?
Whether or not to hire controversial companies like Parabon is not a decision that should be made internally by the Edmonton Police Service. This is a collective, democratic decision that needs to be made concerning not only the validity of the technology in question, but whether it belongs in the kind of society we want to create.
Canadian police services frequently lay claim to the Peelian principles as their moral foundation. The second principle insists that consent is necessary for police actions to be accepted by the public as legitimate. As with secretly employed facial recognition surveillance, so with controversial DNA technology: how can we consent to something we are not given the power or the information to understand and meaningfully debate?
Decades of scandal affirm that the police cannot be trusted to govern themselves. Professor Kent Roach, in his recent book Canadian Policing: How and Why It Must Change, diagnoses “systemic under-governance and a lack of democratic direction” (p. 80). Now is the time for greater public debate not only about specific technologies like DNA phenotyping, but about the very structure of policing in the 21st century.