
Oral Submission to the Standing Committee on Access to Information, Privacy and Ethics (ETHI) 
 
44th Parliament, 1st Session, Meeting No. 12, March 24, 2022 
  
Brenda McPhail, Ph.D. 
Director, Privacy, Technology & Surveillance Program, Canadian Civil Liberties Association

Thank you for inviting the Canadian Civil Liberties Association to appear before you today.

Facial recognition, or as we often think of it at CCLA, facial fingerprinting—to draw a parallel to another sensitive biometric—is a controversial technology. You will hear submissions during this study that tout its potential benefits, and others that warn of dire consequences for society that may come with particular use cases, especially in the context of policing and public safety. Both sides of the debate are valid, which makes your job during this study difficult and profoundly important, and I am grateful that you have undertaken it.

The CCLA looks at this technology through a rights lens, a focus that reveals that it is not just individual and collective privacy rights that are at risk in the various public and private sector uses of face surveillance and analysis, but a wide range of other rights as well. I know you’ve heard in previous submissions about the serious risk to equality rights raised by this technology, which often works less well on faces that are Black, brown, Indigenous, Asian, female or young—that is, non-white, non-male faces. What I’d add to that discussion is the caution that if the technology is fixed, if it becomes more accurate on all faces across the spectrums of gender and race, it may become even more dangerous. Why? Because we know where the surveillant gaze disproportionately falls in law enforcement contexts—on those same people. We know who often suffers discrimination in private sector applications—those same people. And in both cases, perfect identification of groups who already experience systemic discrimination because of who they are and what they look like carries the potential to facilitate more perfectly targeted discriminatory action.

In addition to equality rights, tools that could allow ubiquitous identification would have negative impacts on rights to freedom of association and assembly, freedom of expression, the right to be free from unreasonable search and seizure by the state, the presumption of innocence if everyone’s face becomes a subject in a perpetual police line-up, and ultimately, rights to liberty and security of the person.

So, there’s a lot at stake. It’s also important to understand that this technology is creeping into daily life in ways that are becoming commonplace, and we must not allow that growing familiarity to breed a sense of inevitability. For example, many of us probably unlock our phones with our face. It’s convenient and, with appropriate built-in protections, may carry relatively little risk. A similar one-to-one matching facial recognition tool was recently used by the Liberal Party of Canada in its nomination voting process prior to the last federal election—in that case, a far riskier use of a potentially faulty, discriminatory technology, because it took place in a process that is at the heart of grassroots democracy. The same functionality in very different contexts raises different risks, highlighting the need for keen attention not just to technical privacy protections, which exist in both the phone and voting examples, but to contextually relevant protections for the full set of rights engaged by this technology.

So what is the path forward? I hope this study examines IF, not just when and how, FRT can be used in Canada, taking those contextual questions into consideration.

CCLA believes that regulation is required for those uses Canadians ultimately deem appropriate in a fair and free democratic state. Facial recognition for mass surveillance should be banned. For more targeted uses, we continue to call for a moratorium, particularly for policing purposes, in the absence of comprehensive and effective legislation that:

  • provides a clear legal framework for its use,
  • includes rigorous accountability and transparency provisions,
  • requires independent oversight, and
  • creates effective means of enforcement for failure to comply.

A cross-sector data protection law grounded broadly in a human rights framework is necessary, particularly in an environment where the private and public sectors are using the same technologies but are subject to different legal requirements. Targeted laws governing biometrics (or, ideally, all data-intensive, algorithmically driven technologies) could be an even better fit for purpose, and there are a number of examples globally where such legislation has recently been enacted or is under consideration. We should draw inspiration from them to create Canadian laws that put appropriate guardrails around potentially beneficial uses of FRT and protect people across Canada from its misuse or abuse.

Thank you, and I look forward to your questions.

About the Canadian Civil Liberties Association

The CCLA is an independent, non-profit organization with supporters from across the country. Founded in 1964, the CCLA is a national human rights organization committed to defending the rights, dignity, safety, and freedoms of all people in Canada.

For the Media

For further comments, please contact us at media@ccla.org.
