Dallas Police Might Use Controversial Company for Facial Recognition

The AI technology is being used in other North Texas cities and will likely soon begin to recognize faces in Dallas.
The department already has a grant to pay for the facial recognition technology. Brian Sevald
Dallas likes to lead the way when it comes to a lot of things. But Dallas Police Chief Eddie García said at a Public Safety Committee meeting yesterday that he didn’t mind being a little late to the party in the area of facial recognition technology. 

“I’m glad that we’ve actually taken time to see what has been out there, how we could better our policies and things of that nature when it comes to facial recognition,” García told the committee.

Dallas Police Maj. Stephen Williams told the committee more about how the department could use facial recognition technology in its investigations.

Williams said investigative facial recognition technology is a means of ascertaining and confirming an individual’s identity. It uses open-source photos as comparison images, employing artificial intelligence to search and compare millions of them. The tech is used by police agencies across the country.

The Texas Department of Public Safety uses it, for example. Facial recognition technology is also used in Arlington, Fort Worth and McKinney, according to DPD. “The solution is going to scrape the internet for all of these publicly posted photos that everybody posts on social media, the news media, everything, and use this for comparative analysis,” Williams added.

The point is to provide investigative leads in violent offenses and imminent threats to public safety, and to help identify deceased or incapacitated individuals. If the tech is implemented in Dallas, the police department says it would ensure that its use complies with all applicable data security laws and that residents’ privacy rights and civil liberties are held in the highest regard. DPD said it would also require high standards of training and peer review for the facial recognition technology and hopes to ensure transparent and thorough oversight of its use.

How Does Police Facial Recognition Work?

There are multiple steps involved in the process. An investigating detective would first get a supervisor’s approval to submit an investigative facial recognition request. That request would then go to a trained analyst, who reviews the image the detective wants to search to make sure it meets all the criteria for the use of investigative facial recognition. It’s worth noting that, according to the police, these analysts undergo 32 hours of FBI training, which includes components on implicit bias and how to avoid misidentification.

If the image meets all the criteria, the analyst would process it through the facial recognition software. The results of that search would be reviewed by a separate trained analyst before being sent back to the investigating detective.

“It’s really putting two sets of eyes on the results to ensure that we’re getting a positive or negative, that there’s no discrepancies in the identification, really to mitigate any misidentification that may be occurring,” Williams said. “If there is a difference between the two analysts, it will go to a supervisor for final review. The supervisor then will determine if it's a negative or positive result from that reveal.”
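To make that review chain concrete, here is a minimal, purely illustrative sketch in Python of the workflow Williams described: supervisor approval, a criteria check, two independent analyst reviews and a supervisor tie-break. Every name in it is a hypothetical stand-in for description purposes only; it is not DPD or Clearview AI software.

# Illustrative sketch of the dual-analyst review workflow described at the meeting.
# All class, field and function names are hypothetical.
from dataclasses import dataclass
from enum import Enum


class Result(Enum):
    POSITIVE = "positive"
    NEGATIVE = "negative"


@dataclass
class FacialRecognitionRequest:
    case_number: str
    image_path: str
    supervisor_approved: bool   # detective's supervisor signed off on the request
    criteria_met: bool          # a trained analyst confirmed the image qualifies


def review_search_results(request, analyst_one, analyst_two, supervisor):
    """Return a final result only if the request clears every checkpoint."""
    if not request.supervisor_approved:
        return None  # request never reaches an analyst without supervisor approval
    if not request.criteria_met:
        return None  # image fails the usage criteria, so no search is run
    if analyst_one == analyst_two:
        return analyst_one  # two independent reviews agree
    return supervisor  # analysts disagree, so a supervisor makes the final call


# Example: the analysts disagree, so the supervisor's determination stands.
request = FacialRecognitionRequest("24-0001", "lead_image.jpg", True, True)
print(review_search_results(request, Result.POSITIVE, Result.NEGATIVE, Result.NEGATIVE))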


Why Is Facial Recognition Controversial?

There are concerns about the software’s implications for privacy and its potential to misidentify people. The department says, however, that the technology would not be used as positive identification or as probable cause for arrest without additional corroboration and investigation. “It is merely another lead that a detective will take to follow in the criminal investigation process,” Williams said.

The police department also recognizes that there are some concerns regarding free speech, freedom of assembly and freedom of religion when it comes to the use of this technology. But according to the department, general orders would forbid the use of facial recognition for First Amendment activities. A criminal predicate or public safety threat would be required for all uses. The department also says it won’t be used to identify people on live feeds or during live-streamed events. “We have to have a criminal offense before we start doing things,” Williams said.

There is also the worry that the use of this technology will lead to an over-reliance on video evidence. But DPD says this shouldn’t be a problem because general orders make it clear that the technology is only one element of an investigation that must be used in conjunction with additional corroborating evidence. “This is just a piece of the puzzle,” Williams said.

The use of the technology will be logged and reported, allowing data collection that could facilitate further research. Data collection is governed by federal regulations for operating federally funded, multi-jurisdictional criminal intelligence systems. During the meeting, García chimed in to quell some of the privacy concerns.

“This is not license plate readers for humans,” he said. “This is not what this is. This is strictly based on a criminal offense having occurred.”

Who Will Provide Facial Recognition Technology to the Dallas Police?

The city is looking at Clearview AI, an industry leader in the field of facial recognition, to provide the technology. A quick search of the company turns up a few notable news stories about it. Some may sound flattering, others not so much.

One TIME article called Clearview AI Ukraine's secret weapon against Russia. But last year, the nonprofit Consumer Watchdog urged California’s attorney general to investigate the company for allegedly selling images to police departments without consent, saying the company's facial recognition software "represents a clear and present danger to our societal norms and our privacy."

Also last year, New York Times tech reporter Kashmir Hill told NPR the company could spell the end of privacy. Another story points out that police in Miami used Clearview AI to identify a homeless man who refused to give his name. The man was arrested but charges were dropped because the officer lacked probable cause to make the arrest.

Nevertheless, the company has achieved top ranking for accuracy in testing by the National Institute of Standards and Technology. The company retains records of all searches.

The Public Safety Committee approved the department’s use of the technology on Monday, clearing the way for the police department to soon implement a general order and a plan for its use by investigators, and to facilitate their training.

Chief García seemed convinced this development is a positive one, telling the committee: “I can tell you that it will be a game changer for our hard-working detectives to have this technology.”