3D images

Police use DNA to generate 3D images of suspects they’ve never seen

On Tuesday, the Edmonton Police Service (EPS) shared a computer-generated image of a suspect created with DNA phenotyping, a technique the service used for the first time in hopes of identifying a suspect in a 2019 sexual assault case. Using DNA evidence from the case, a company called Parabon NanoLabs created an image of a young black man. The composite image did not take into account the suspect’s age, BMI or environmental factors, such as facial hair, tattoos and scars. EPS released the image to the public on its website and on social media platforms, including its Twitter account, calling it “a last resort after all avenues of investigation have been exhausted”.

Privacy experts say EPS’s decision to produce and share this image is extremely damaging, raising questions about racial bias in the use of DNA phenotyping for forensic investigations and about the privacy of the DNA databases that investigators can search.

In response to the EPS tweet about the image, many privacy and criminal justice experts reacted with outrage at the police department’s irresponsibility. Callie Schroeder, Global Privacy Counsel at the Electronic Privacy Information Center, retweeted the tweet, questioning the usefulness of the image: “Even if this is new information, what are you going to do with it? Question every black man about 5ft 4in you see? …that’s not a suggestion, absolutely don’t do it.”

“Wide dissemination of what is essentially a computer-generated guess may lead to mass surveillance of any black male approximately 5’4″, both by their community and by law enforcement,” Schroeder told Motherboard. “This pool of potential suspects is far too large, and the increased surveillance and suspicion could fall on thousands of innocent people.”

The victim in the case could give only a limited description of the suspect, “describing him as 5’4″, with a black toque, pants and a sweater or hoodie” and “as having an accent” — a vague and indistinguishable profile.

“Releasing one of these Parabon images to the public like the Edmonton police recently did is dangerous and irresponsible, especially when that image involves a black person and an immigrant,” Jennifer Lynch, surveillance litigation director at the Electronic Frontier Foundation, told Motherboard. “People of color are already disproportionately targeted for criminal investigations, and this will not only exacerbate that problem, it could lead to vigilantism and real harm to misidentified people.”

The criminal justice and policing system is fraught with racial bias. A black person is five times more likely to be stopped by police without cause than a white person, and black people, Latinx people and people of color are more likely to be stopped, searched and suspected of a crime even when no crime has been committed.

Viewing the composite image without context or knowledge of DNA phenotyping may mislead people into thinking the suspect looks exactly like the DNA profile. “Many members of the public who see this generated image will not know that it is a digital approximation, that age, weight, hairstyle and face shape can be very different, and that the accuracy of the skin/hair/eye color is approximate,” Schroeder said.

In response to criticism of the image’s release and of the use of DNA phenotyping, the Edmonton Police Service shared a press release on Thursday morning announcing that it had removed the composite image from its website and its social media.

“While the tension I felt about this was very real, I prioritized the investigation — which in this case involved pursuing justice for the victim, herself a member of a racialized community — over the potential harm to the black community. This was not an acceptable trade-off and I apologize,” wrote EPS COO Enyinnah Okere.

Parabon NanoLabs sent Motherboard a number of case studies in which DNA phenotyping helped solve murder and assault cases. However, the case studies do not address broader concerns that are much harder to measure, such as the number of innocent people interviewed before the final suspect was arrested, or how the suspect’s image may have reinforced public racial prejudice.

According to Parabon, the company has worked on hundreds of law enforcement investigations. Its site features a number of case studies, many of which show a comparison between the DNA-generated profile and an actual photo of the suspect. There are some similarities between the two images, in that they reflect the same race, gender, and eye and hair color. Often, however, the resemblance between the generated image and the suspect ends there.


“We make predictions only from DNA, so we only have a limited amount of information. And so when we make those predictions, that’s a description, and that’s what we provide. If the police had a witness, they wouldn’t need us,” Dr. Ellen Greytak, director of bioinformatics and technical lead for the Snapshot division of Parabon NanoLabs, told Motherboard. “We provide facts, like a genetic witness, providing information that detectives can’t get otherwise.”

“It’s like the police got a description from someone who, you know, didn’t see him close enough to tell whether he had any tattoos or scars, but can still describe the person. What we find is that it can be extremely useful, especially in determining who it might be and weeding out people who really don’t fit that prediction,” Greytak said. “In these cases, by definition, they still have DNA, and so we don’t have to worry about the wrong person being picked up, because they would still have to match the DNA.”

According to Greytak, the technology creates the composite image by running the suspect’s DNA through machine learning models that are built on thousands of people’s DNA and their corresponding appearances.
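Parabon’s actual Snapshot models are proprietary and far more sophisticated, but the general shape of the approach Greytak describes — training on the genotypes of people with known appearances, then predicting traits for an unknown sample — can be sketched with a toy example. Everything here (the SNP count, the simulated data, the nearest-neighbour stand-in for the real machine learning models) is an illustrative assumption, not Parabon’s method.

```python
# Toy sketch of trait prediction from DNA, loosely following the approach
# described above. All data here is simulated; the nearest-neighbour model
# is a stand-in for the proprietary machine learning models Parabon uses.
import random

random.seed(0)

TRAITS = ["blue", "brown", "green"]  # hypothetical eye-color labels

def random_genotype(n_snps=20):
    """A genotype represented as allele counts (0, 1 or 2) at each SNP."""
    return [random.randint(0, 2) for _ in range(n_snps)]

# Simulated training set: genotypes paired with known trait labels,
# analogous to the "people with known appearances" Greytak mentions.
training = [(random_genotype(), random.choice(TRAITS)) for _ in range(200)]

def predict_trait(sample, training_data, k=5):
    """Find the k most genotypically similar known individuals and
    return the most common trait among them."""
    def distance(g1, g2):
        return sum(abs(a - b) for a, b in zip(g1, g2))
    neighbours = sorted(training_data, key=lambda t: distance(sample, t[0]))[:k]
    labels = [label for _, label in neighbours]
    return max(set(labels), key=labels.count)

unknown = random_genotype()
print(predict_trait(unknown, training))
```

The key point the sketch makes concrete is that the output is a statistical guess over a reference population, not an observation of the individual — which is exactly why critics object to presenting the result as a suspect’s likeness.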

“The data we have on people with known appearances comes from a variety of sources; some of it is publicly available or available by request, and some of it comes from studies we’ve conducted, where we collected this information,” Greytak said.

The DNA datasets used to create these composites raise more red flags about the privacy of DNA profiles. The “variety of sources” includes GEDmatch and FamilyTree DNA, free, open-access genealogy websites that give users access to millions of DNA profiles.

“People should be aware that if they send their DNA to a consumer-facing company, their genetic information may fall into the hands of law enforcement for use in criminal investigations against them or their genetic relatives. None of that data is covered by federal health privacy rules in the United States,” Lynch said. “While 23andMe and Ancestry generally require warrants and limit disclosure of their users’ data to law enforcement, other genetic genealogy companies such as GEDmatch and FamilyTree DNA provide law enforcement almost wholesale access to their databases.”

Parabon NanoLabs claims that the images it generates are based not on race but on genetic ancestry. “When we talk about a person’s genetic ancestry, or their biogeographic ancestry, [which] is the term we use for that, it’s a continuous measurement, as opposed to race, which is categorical,” Greytak said.

However, researchers say that taking ancestry into account when doing DNA profiling, as Parabon NanoLabs does, is not an objective measure, because it can cause entire populations to be viewed as more criminal than others.

“While the conventional use of DNA profiling has been aimed primarily at the individual suspect, more recently a shift in interest in forensic genetics has taken place, in which the population and family to which an unknown suspect may belong, has occupied center stage,” researchers led by anthropologist Amade M’charek wrote in a study titled “The Problem with Race in Forensic Identification.” “Making inferences about the phenotype or familial relationships of this unknown suspect produces suspicious populations and families.”

After a 2019 BuzzFeed investigation found that GEDmatch allowed police to upload a DNA profile to investigate a serious assault, the site changed its policies so that users had to opt in to law enforcement searches. Still, investigators can use a number of similar databases to upload a suspect’s DNA and map out the suspect’s family tree until they can pin down the suspect’s true identity.

A notorious case in which this tactic proved successful was the hunt for the Golden State Killer, a serial killer named Joseph James DeAngelo. After uploading his DNA to GEDmatch, investigators were able to find one of his family members who was already in the system and trace their way to DeAngelo decades after he committed his crimes.
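The relative-matching step at the heart of this tactic works by totaling the DNA two profiles share, measured in centimorgans (cM), and mapping that total to a likely relationship degree. The sketch below is an assumption-laden illustration — the thresholds are rough published averages for each relationship class, not any site’s actual algorithm.

```python
# Hedged sketch of how genetic genealogy sites map shared DNA to a likely
# relationship. Threshold values are rough population averages, not the
# actual matching logic used by GEDmatch or similar databases.
def estimate_relationship(shared_cm):
    """Map total shared DNA (in centimorgans) to a rough relationship label."""
    if shared_cm > 3550:
        return "identical/self"
    if shared_cm > 2400:
        return "parent/child or full sibling"
    if shared_cm > 1300:
        return "grandparent, aunt/uncle, or half sibling"
    if shared_cm > 500:
        return "first cousin range"
    if shared_cm > 90:
        return "second to third cousin range"
    return "distant or unrelated"

print(estimate_relationship(1700))  # grandparent, aunt/uncle, or half sibling
```

Even a distant match in the "second to third cousin range" can be enough: investigators build out that relative’s family tree from public records and narrow the branches until one individual fits the case details, which is how DeAngelo was identified.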

Many police departments have collected DNA from innocent people and from people accused of petty crimes. Orange County, for example, maintains a database of more than 182,000 DNA profiles, almost all from people charged with misdemeanors such as petty theft or driving with a suspended license. Several attorneys have filed a lawsuit against the county, claiming the database violates California law. The lawsuit calls handing over DNA a “coercive bargain”, because those who provide a DNA sample receive lighter sentences or even a dismissed case.

A similar lawsuit has been filed in New York by the Legal Aid Society, which accuses the city of operating a DNA database that violates state law and constitutional protections against unreasonable searches. These DNA databases further perpetuate the pervasive racial biases of the criminal justice system: since people of color, especially black and Latinx people, make up 75% of those arrested in New York City over the past decade, the database concentrates suspicion on marginalized demographics.

Although race is not directly measured by DNA phenotyping, race is produced semiotically by the visual nature of composite DNA profiles and by the already biased DNA datasets from which these profiles are derived. The use of DNA phenotyping may have opened up a few cold cases, but we have to ask ourselves: at what cost?

This article is part of State of Surveillance, made possible by a grant from Columbia University’s Ira A. Lipman Center for Journalism and Civil and Human Rights in conjunction with Arnold Ventures. The series will explore the development, deployment and effects of surveillance and its intersection with race and civil rights.