Facial Recognition Technology

Office of the Privacy Commissioner of Canada

Dr. Christian’s research on facial recognition technology is generously funded by the Office of the Privacy Commissioner of Canada, reflecting a commitment to understanding and addressing the multifaceted challenges posed by this emerging technology.

Mitigating Race, Gender and Privacy Issues in AI Facial Recognition Technology

Recent years have witnessed the increasing use of Artificial Intelligence Facial Recognition Technology (AI-FRT) in both the private and public sectors. The use of AI-FRT has been plagued by issues of privacy, as well as racial and gender bias, particularly relating to Black people and people of color.

The privacy issues arise, among other things, from the collection and use of big data in the development and deployment of AI-FRT. The racial and gender issues stem from the disproportionate representation of racial and gender demographics in training data, which has resulted in the technology misidentifying, or failing to identify, individuals from particular gender and racial groups. Research studies have shown that AI-FRT can be up to 99% accurate in identifying White male faces, while accuracy for people of color, especially women of color, can fall as low as 65%. The race, gender, and privacy impacts of AI-FRT could affect the Charter rights of Canadians, e.g., the right to personal liberty, as well as the right to freedom from discrimination on the basis of race or gender.

This research project focuses on identifying and examining the race, gender, and privacy issues primarily associated with the development of AI-FRT, and its utilization by both private and public sectors in Canada. Additionally, the research aims to develop a framework and guidelines to address the impacts on race, gender, and privacy resulting from the development and deployment of AI-FRT by private sector developers in Canada. Another key objective is to explore potential reforms to the Personal Information Protection and Electronic Documents Act (PIPEDA) to legislatively address the race, gender, and privacy impacts arising from the private sector’s development and deployment of AI-FRT.

The research is funded by a $50,000 grant from the Office of the Privacy Commissioner of Canada.

Research Papers

Gideon Christian, “The New Jim Crow: Unmasking Racial Bias in AI Facial Recognition Technology within the Canadian Immigration System” (Forthcoming in McGill Law Journal)

Facial Recognition Technology (FRT) is an AI-based biometric system that identifies individuals by analyzing unique facial features through advanced algorithms, essentially creating a “facial signature” for each person. Its adoption has surged across Canadian public and private sectors, from criminal and immigration law enforcement, to academic proctoring, fraud prevention, and more. However, this widespread application raises concerns about perpetuating historical racial biases, drawing parallels to the post-Civil War era’s racial inequalities, dubbed “the new Jim Crow.”

This research paper delves into the implications of FRT, especially within Canadian immigration enforcement, scrutinizing its role in refugee revocation proceedings and its potential to undermine fairness and integrity in immigration processes. By examining Federal Court of Canada litigation and integrating a historical methodology, the paper explores the racial bias inherent in FRT and its potential to exacerbate systemic racial injustices.

The research paper draws parallels between contemporary racial biases in FRT and historical systemic racism, emphasizing the need for a critical re-evaluation of technological advancements to ensure equity and fairness. Through a detailed analysis of current practices and judicial decisions, this research contributes to the broader discussion on technology, ethics, and racial justice, advocating for responsible development and deployment of AI technologies in areas critical to social equity.

Blogs

Racial bias safeguards missing from Bill C-27's Artificial Intelligence Data Act draft, says U of C professor

University of Calgary assistant professor of AI and law, Gideon Christian, has sent a letter to the House of Commons committee reviewing Bill C-27 to flag major flaws in AI regarding racial bias, especially affecting people of colour.

AI facial recognition technology: the black box hurting Black people

Ms. AB, a Black woman from Africa, made a successful refugee claim in Canada. Years later, she walked into the licensing office of the Ontario Ministry of Transportation (MTO) to have her photo taken for her driver’s licence.

AI Facial Recognition Technology in the Retail Industry

One summer day in 2023, I entered a Walmart store in Calgary, Alberta, and purchased three standing fans. Upon assembling the fans at home, I discovered that one was malfunctioning. I immediately decided to return it to the store.

AI Facial Recognition Technology in the Canadian Immigration System

Rapid adoption of Artificial Intelligence (AI) technology is permeating the Canadian public sector, from criminal to immigration law enforcement. AI is now increasingly being deployed and used in various aspects of the Canadian immigration system.

In the News

Law professor tackles racial bias in AI facial recognition technology

AI bias researcher

Law professor explores racial bias implications in facial recognition technology

Racial bias safeguards missing from Bill C-27's Artificial Intelligence Data Act draft, says U of C professor

Podcasts

Webinars

AI Facial Recognition Technology: The Black Box Hurting Black People

Facial Recognition Research