Congresswoman Rashida Tlaib Calls Attention to the Urgent Issue of Racism in Facial Recognition Technology



How police departments use facial recognition has serious consequences for the communities they police.

Democratic Congresswoman Rashida Tlaib, who represents Michigan, recently got into a heated debate with Detroit Police Chief James Craig over the adequacy of the police department's facial recognition program. At issue is whether non-Black analysts can reliably identify individuals flagged by the department's facial recognition system. Detroit's population is 80 percent Black, so this is a cause for concern for the city's Black residents.

Tlaib, who is Palestinian American, recently visited the police department and stated, "analysts need to be African Americans, not people that are not." Craig, who is Black and has white analysts on his staff, did not agree with Tlaib's comments.

“If I had made a similar comment, people would be outraged,” Craig told local news station 7 Action News. “They would be calling for my resignation.”

Although Tlaib is herself a woman of color, her comments and concerns are grounded in studies showing that facial recognition software is biased against Black and brown people. MIT researcher Joy Buolamwini is one of the leading voices bringing this issue to light through her advocacy group, the Algorithmic Justice League. Buolamwini and fellow researchers have called on companies such as IBM, Amazon, and Microsoft to make their systems less biased against Black and brown people.

[Photo: Rep. Rashida Tlaib, D-Mich., speaks at a news conference on Capitol Hill in Washington, Jan. 17, 2019. (AP Photo/Andrew Harnik, File)]

In 2018, Buolamwini’s research found that facial recognition software misidentified darker-skinned women at error rates of up to 34.7 percent, while the error rate for lighter-skinned men was less than 1 percent. If the software's results are this faulty, a Black analyst may be better positioned to catch a misidentification than a non-Black one. Armed with this research, civil rights groups such as Color of Change have launched petitions against the FBI for using software, sold by big tech companies, that can cause more harm to people of color.

“This means that Black and Latinx people are much more likely to be misidentified and treated as a threat to law enforcement, biasing law enforcement agents before an encounter even begins,” the organization wrote in a petition in February. The petition also stated: “At a time when Black and immigrant communities are engaged in high-profile organizing, handing over this surveillance technology to government agencies and law enforcement threatens our freedom and lives.”

[Photo: MIT facial recognition researcher Joy Buolamwini, whose research has uncovered racial and gender bias in facial analysis tools sold by companies such as Amazon, holds a white mask she had to use so that software could detect her face. Cambridge, Mass., Feb. 13, 2019. (AP Photo/Steven Senne)]

Fight for the Future, a digital rights organization calling for a ban on this technology, created a map to track cities around the country that have already implemented facial recognition surveillance. In other metropolitan areas, such as New York City, the NYPD has been called out for using facial recognition to track children's faces.

According to the American Civil Liberties Union, facial recognition technology can be “used for general surveillance in combination with public video cameras, and it can be used in a passive way that doesn’t require the knowledge, consent, or participation of the subject.” How facial recognition technology is defined and deployed varies with the institution using the software and the rules governing its use.

Detroit implemented facial recognition software in 2017, and in September the Detroit police oversight board placed restrictions on how the police department may use it. The department can analyze only photo stills, not live or recorded video; it cannot use the software to determine someone's immigration status; and the software may be used only to identify individuals suspected of sexual assault, aggravated assault, robbery, or homicide. But these restrictions leave the issue of racial bias unresolved. Amanda Alexander, the executive director of the Detroit Justice Center, called facial recognition mass profiling that Black people don't need, according to a report from the Detroit Free Press.

“Rather than investing millions of dollars in facial recognition technology that instills fear and targets communities of color, we should be investing in services and resources so that people can prosper.”
