Sometimes you gotta go where everybody knows your face.
Police are being criticized for using a photo of Woody Harrelson to catch a beer thief.
A report published by Georgetown Law's Center on Privacy & Technology revealed that in 2017, the NYPD used the "Cheers" star's likeness in a facial recognition program to hunt down a doppelganger who stole a six-pack from a CVS.
Stills from the security camera footage were partially obscured and highly pixelated, and the system returned no matches. But when a detective noticed the suspect's resemblance to the three-time Oscar nominee, they fed high-res photos of the actor into the system instead.
"This celebrity 'match' was sent back to the investigating officers, and someone who was not Woody Harrelson was eventually arrested for petit larceny," the report reads.
While the creative trick appeared to have worked in this case, the report's author, Senior Associate Clare Garvie, warned that it set a dangerous precedent and threatened civil liberties.
"The stakes are too high in criminal investigations to rely on unreliable — or wrong — inputs," she wrote. "It is one thing for a company to build a face recognition system designed to help individuals find their celebrity doppelgänger or painting lookalike for entertainment purposes. It's quite another to use these techniques to identify criminal suspects, who may be deprived of their liberty and ultimately prosecuted based on the match."
"Unfortunately, police departments' reliance on questionable probe photos appears all too common."
She said the NYPD also used a photo of a New York Knicks player to search its face recognition database for a man wanted for an assault in Brooklyn; cops would not tell her which player.
Garvie summarized her thoughts on the process in the title of the report: "Garbage In, Garbage Out."
"'Garbage in, garbage out' is a phrase used to express the idea that inputting low-quality or nonsensical data into a system will produce low-quality or nonsensical results," she explained. "It doesn't matter how powerful or cleverly-designed a system is, it can only operate on the information it is provided — if data is missing, the system cannot operate on it. Any attempt to reconstruct or approximate missing data will necessarily be a 'guess' as to what information that data contained.
"Worse, if data is wrong — like a photo of someone other than the suspect — the system has no way to correct it. It has literally no information about the suspect, and can't make it up."
She acknowledged that face recognition technology has improved immensely in the past two years and will continue to do so. But without standards governing which photos are put into the system to search for matches, she argued, it is at best useless and at worst dangerous.
"It doesn't matter how good the machine is if it is still being fed the wrong figures — the wrong answers are still likely to come out," she wrote.
While police currently use such techniques only to generate leads where there are none, Garvie pointed out that law enforcement is moving toward making actual arrests based on facial recognition. She quoted FBI Section Chief for Biometric Services Bill McKinsey, who claimed the Bureau was "pretty confident we're going to have face [recognition] at positive ID in two to three years."
She raised similar concerns about forensic sketches, pointing to several examples of facial recognition fingering the wrong person based on an artist's interpretation of a victim's potentially hazy recollection.
In her conclusion, she urged police to stop feeding forensic sketches and poor-quality or incomplete photos into facial recognition systems — and definitely no more celeb pics.
"Stop using celebrity look-alike probe images. Face recognition is generally considered to be a biometric, albeit an imperfect one," she wrote. "Police cannot substitute one person's biometrics for another's, regardless of whatever passing resemblance they may have."