Dutchess, Putnam among police agencies that tested photo recognition app
The Dutchess and Putnam County sheriff’s offices are among the more than 1,800 public agencies that have in recent years tested or used a massive facial-recognition database that civil rights activists say violates privacy and disproportionately targets people of color.
Clearview AI, a 4-year-old New York City startup, has amassed an archive of 3 billion images and promotes its search software to police, immigration officials, the military and schools as nearly 99 percent accurate in identifying criminal suspects. Police and other taxpayer-funded agencies in every state except Vermont have used the software.
“Clearview is like Google Search for faces,” the firm’s promotional materials boast. “Just upload a photo to the app and instantly get results” of possible matches culled from the “public internet, including news media, mugshot websites, public social media and many other open websites.” Agencies should “expect to receive high-quality leads with fewer resources expended.”
Facial-matching technology applied to social media accounts — not all of it developed by Clearview — has helped identify suspects in the Jan. 6 Capitol riot, although those charged so far from the Hudson Valley have been identified the old-fashioned way, by tips to the FBI from co-workers and former classmates and friends.
In a series of investigative reports based on internal Clearview documents provided by an anonymous source and public records, BuzzFeed News detailed earlier this year how the firm aggressively marketed its product to police beginning in 2018. The news outlet created a database of law enforcement agencies that have subscribed to or tested Clearview and noted that, as of February 2020, the company had begun requiring a supervisor’s approval before allowing individual officers to create trial accounts.

Based on data obtained by BuzzFeed, there are at least 61 law enforcement agencies in New York that have used Clearview technology, including 23 in the Hudson Valley. (NYCLU)
According to the database, the Dutchess County Sheriff’s Office used Clearview between 101 and 500 times before February 2020, while the Putnam County Sheriff’s Office used it between 501 and 1,000 times. Neither the Beacon nor Cold Spring police show up as having used the software.
In Dutchess, Michael Cominsky, a High Intensity Drug Trafficking Areas analyst, began a 30-day trial of the Clearview software in July 2019, according to emails obtained by The Current through a Freedom of Information Law request.
When the officer asked what to expect when running a search on a facial image, a sales representative responded that “you are free to run yourself, friends, co-workers, George Clooney, Donald Trump, Abraham Lincoln, or any other face you have as many times as you want! You can test what happens if you wear glasses, make faces, or whatever, as well. In fact, I recommend it so that you get a feel for what the software can do and cannot do.
“I can tell you another officer tried it on Mickey Mouse and it did not work, however,” the sales rep wrote.
Another email said the company usually charges $2,500 annually for an individual account, but because “we understand you guys are a smaller department,” a representative said Dutchess could pay $2,000 per year for two accounts.
According to the emails, Clearview extended Cominsky’s trial several times — sometimes at his request, but also after the officer indicated there was not money in the budget for a subscription. In May 2020, Clearview activated a trial for Cominsky’s supervisor, Lt. Stephen Reverri, while extending Cominsky’s.
A spokesperson for the Sheriff’s Office, Lt. Shawn Castano, said on Wednesday (July 28) that the county has not used Clearview since Cominsky’s trial ended. No arrests were made because of the searches, and the agency does not plan to budget funds for a subscription. “We haven’t used it, and we don’t intend to use it,” Castano said.
In Putnam County, spokesperson Capt. Lisa Ortolano confirmed in an email this week that the Sheriff’s Office has used the service, and “according to the crime analyst who works in Putnam County, it has been very helpful.” Ortolano could not immediately confirm whether the agency made any arrests using the technology, or if Putnam had paid for the service.
The effectiveness, and ethics, of facial-recognition technology have been the subject of great debate.
Dutchess Legislator Nick Page, who represents three wards in Beacon, said this week that Clearview’s marketing of its software “under the radar” could present problems, especially when the company allowed individual officers to test the product without a supervisor’s approval.
“The most important thing is that there’s public awareness of what’s being used and why,” Page said.
Det. Sgt. Jason Johnson, a spokesperson for the Beacon Police Department, said he sees facial-recognition software as “another useful tool for my toolbox. I certainly would not make an arrest based off facial-recognition results, but it may put you one step closer to solving a crime or give you a lead where you have none.”
Several other vendors also market facial-recognition software to law enforcement, but Clearview has risen to prominence because it has “scraped” billions of photos from social media and secured large contracts, including with U.S. Immigration and Customs Enforcement.
The company’s technology “allows anyone who operates it to track where we go and who we associate with,” said Daniel Schwarz, a privacy and technology strategist with the New York Civil Liberties Union.
According to the BuzzFeed data, there are at least 61 law enforcement agencies in New York state that have used the Clearview platform, including 23 in the Hudson Valley [see map]. The New York City Police Department had registered 11,000 searches, the most of any agency, as of early 2020.
That’s problematic because numerous studies have shown facial-recognition software to have a much higher error rate than Clearview publicizes, especially among people of color, Schwarz said. In 2019, the National Institute of Standards and Technology, an agency of the U.S. Department of Commerce, concluded that Asian and African American people are as much as 100 times more likely than white men to be misidentified by the technology.
Native Americans and African American women are particularly vulnerable to misidentification, the study found.
The NYCLU is supporting a bill introduced in the state Legislature this year that would prohibit the use of biometric surveillance, including facial recognition, by police and create a task force to study the effects of the technology on minority populations.
Many agencies are evaluating the effectiveness of facial-recognition programs, said Castano, the Dutchess County lieutenant. “But right now, with so much video being out there, we still have pretty good success with posting a photo of someone on social media and having the public provide identification,” he said. “We’re continuing to utilize that.”