CNN | 12/4/2020

Detroit Police chief calls Tlaib's comments 'racist' after she suggests facial recognition tech analysts should be African American

Updated 1:28 PM ET, Fri October 4, 2019

(CNN) - Rep. Rashida Tlaib suggested that the Detroit police department should only employ African Americans to identify black suspects from the city's facial recognition technology, according to video captured by The Detroit News -- a comment that the city's police chief called "racist" and "insulting."

The Michigan Democrat made the comment during her tour Monday of the Detroit Police Department's Real Time Crime Center, as she learned about the city's facial recognition technology and the trained analysts who identify suspects from the photos flagged by the software, according to the Detroit News.

"Analysts need to be African Americans, not people that are not," Tlaib said during the tour, according to an edited video from Detroit News, adding, "I think non-African Americans think African Americans all look the same."

Speaking with CNN on Thursday, Tlaib stood by her comments and expressed "worry" that the conversation has shifted away from her main concern -- that facial recognition technology is flawed, and that relying on analysts leaves room for human error.

In the video of the tour, Tlaib explained, "I've seen it even on the House floor. People calling Elijah Cummings 'John Lewis,' and John Lewis 'Elijah Cummings,' and they're totally different people," referring to the two African American congressmen. "I see it all the time," she said.

Detroit Police Chief James Craig, who was leading the tour, pushed back: "I trust people who are trained, regardless of race, regardless of gender. It's about the training."

"But it does make a huge difference with the analysts," Tlaib said.

Following the tour, Craig, an African American, took issue with Tlaib's comments.

"We have a diverse group of crime analysts, and what she said — that non-whites should not work in that capacity because they think all black people look alike — is a slap in the face to all the men and women in the crime center," Craig told the Detroit News.

Craig told the newspaper that the department's officers and civilian employees undergo mandatory implicit bias training.

"That's something we train for, and it's valuable training, but to say people should be barred from working somewhere because of their skin color? That's racist," he said to the Detroit News.

He told CNN affiliate WDIV-TV on Wednesday: "The fact that she made that statement, what does that say to the members of this department, who are analysts, who are trained, who are white? That they, in some way, can't do their job professionally? That's insulting."

CNN has reached out to the Detroit Police Department for comment.

Facial recognition technology identifies people from live or recorded video or still photos, typically by comparing their facial features with those in a database of faces, such as mugshots.
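The comparison step described above can be sketched in code. This is a minimal illustration, not Detroit's actual system: real pipelines use a neural network to convert each face image into a numeric "embedding" vector, then compare embeddings with a similarity metric such as cosine similarity. The vectors, names, and threshold below are invented for the example.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: 1.0 means identical direction, 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(probe, database, threshold=0.9):
    """Return (name, score) for the closest database face, or None if no
    entry clears the threshold. In practice a human analyst would still
    have to verify any candidate this step returns."""
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in database.items()]
    name, score = max(scored, key=lambda t: t[1])
    return (name, score) if score >= threshold else None

# Toy "mugshot" database of face embeddings (hypothetical values).
database = {
    "mugshot_A": [0.9, 0.1, 0.3],
    "mugshot_B": [0.2, 0.8, 0.5],
}
probe = [0.88, 0.15, 0.28]  # embedding of a face taken from video
print(best_match(probe, database))
```

The threshold is the critical design choice: set it too low and the system produces false positives of the kind critics cite; set it too high and it returns no candidates at all.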

The technology could help with tasks ranging from solving crime to checking student attendance at school, but critics are raising privacy issues. Artificial intelligence researchers and civil rights groups, such as the American Civil Liberties Union, are worried about accuracy and bias in facial-recognition systems.

"Our worry is that right now the dialogue is not about how really broken and inaccurate the system is," Tlaib told CNN in a phone interview.

Asked if the tour did anything to allay her concerns about facial recognition technology, Tlaib told CNN on Thursday flatly, "Absolutely not."

On Twitter Wednesday night, Tlaib, who represents a district that includes parts of Detroit, shared links to studies and media reports on the technology and its errors in identifying people of different races and genders.

"The science suggests that people have difficulty identifying faces of people from other races (and) we must take it seriously," Tlaib wrote on Twitter.

She shared a New York Times report that cited a 2018 study by Joy Buolamwini at the M.I.T. Media Lab, which found that facial analysis software has trouble identifying women of color -- with error rates of nearly 35% for women with darker skin tones.

"At the end of the day, I was elected to serve my residents and I cannot in good conscience sit by while (Craig) shows me inaccurate facial recognition technology that's been already deployed, that run the risk of false arrests and over policing, and I wanted to have a real dialogue with him," Tlaib told CNN, adding she hopes Detroit Police will read the studies.

Detroit is a majority-black city with a nearly 80% African American population.

Craig told WDIV-TV on Wednesday that if he had made a similar statement as Tlaib, "people would be calling for my resignation, right now."

The Michigan congresswoman had been invited by the police department to tour the facility and see how the software worked, after she criticized the technology on Twitter.

"You should probably rethink this whole facial recognition bull----," Tlaib said in an August 20 Twitter post.

"Before you criticize the software, come to our Real Time Crime Center to see how we @detroitpolice responsibly use it in efforts to identify criminals involved in violent crimes," the Detroit Police replied.

Speaking to reporters back in August, Craig acknowledged the concerns about facial recognition, according to video posted by the Detroit Police Department.

"I do understand the pause in using this technology .... This is a tool only. And this technology alone cannot assist us solely on identifying the suspect, without the person behind the technology," Craig said, stressing the importance of analysts.

He added that relying on the technology alone, "we would probably misidentify 98, 99% of the time."

Chad Marlow, the senior advocacy and policy counsel with the ACLU, told CNN on Thursday that "facial recognition technology is so faulty, it doesn't matter who you put in place to review it."

"The technology is going to create false positives, those false positives are going to be suggestive on whoever it is who is reviewing the matches," Marlow said.

He added, "It's really not a technology that can be responsibly used. With all due respect to the Police Chief, I think that is the problem in his analysis."

Marlow told CNN that the public has a misconception that the software uses high-quality, high-resolution photos, when the reality is otherwise.

"The problem is that when you use this technology to try to identify someone who you think may be involved with a crime, where the picture may not be good or may be good, a lot of the times you are going to have false identifications, especially when it involves a person of color," Marlow told CNN. "... And that is a heightened concern in a place like Detroit."

According to the Detroit News video, during the tour, the chief explained how the technology was used to help apprehend a violent offender whom witnesses had identified.

"We never talk about the victims. What about the victim's rights? What about the family of the victims? What about their justice?" he asked Tlaib, according to the Detroit News video.

"The warrant wasn't issued solely based on this?" she asked.

"No. It wasn't," said Craig.

Tlaib told CNN that she thinks it's disingenuous when Craig talks about victims.

"He's actually expanding the number of victims, because now we're going to misidentify people that are going to become victims of a broken, racist technology," she said Thursday.

In another portion of the hour-long tour, Craig also showed Tlaib a misidentification of a suspect by the facial recognition software -- a black woman had been matched to a photo of a black male suspect.

Craig then pointed out how his analyst knew that the two were not a match, according to the video. He assured Tlaib that the software is a "tool" and is "never" used on its own to prosecute a suspect, the video showed.

While Detroit has moved ahead with using the technology, San Francisco and Oakland, California, and Somerville, Massachusetts, have banned city departments — including police — from using the software. California lawmakers last month passed a bill that will temporarily ban state and local law enforcement from using facial-recognition software in body cameras. The bill is awaiting action from California Gov. Gavin Newsom, who must decide whether or not to sign it into law by October 13. Facial recognition in body cameras is banned in Oregon and New Hampshire.

Currently, there are no federal rules regulating artificial intelligence in general or facial-recognition technology in particular.

Several bills that would limit the use of the technology are being crafted in Congress, but none has been brought up for a vote.

Tlaib is a sponsor of two pieces of legislation — a House bill that would ban facial and biometric recognition in public housing and another bill that would prohibit federal funding from being used for purchasing or using the technology.

This story has been updated to include more information on studies about facial recognition.


© 2019 Cable News Network. Turner Broadcasting System, Inc. All Rights Reserved.
