
Canada lacks an appropriate legal framework to regulate facial recognition technology, and without it, there is such high risk of individual and social harm that we need a moratorium on its use. CCLA made this argument in submissions to the recent study into facial recognition technology (FRT) by the Standing Committee on Access to Information, Privacy and Ethics (ETHI). We strongly agree with the Committee’s conclusions in their report released October 4, 2022: 

The Committee’s study confirmed that Canada’s current legislative framework does not adequately regulate FRT and AI. Without an appropriate framework, FRT and other AI tools could cause irreparable harm to some individuals. 

The Committee is therefore of the view that, when FRT or other AI technology is used, they must be used responsibly, within a robust legislative framework that protects Canadians’ privacy rights and civil liberties. Since such a legislative framework does not exist at the time, a national pause should be imposed on the use of FRT, particularly with respect to police services. 

The Committee makes 19 generally strong recommendations. The highlights include recommendations prohibiting the procurement of unlawfully obtained personal information and requiring public reporting when FRT or other algorithmic tools, including free trials, are acquired by government institutions. These recommendations would help prevent a future scandal like the one created when the RCMP secretly used Clearview AI software, subsequently determined by a joint investigation of Privacy Commissioners to be illegal under Canadian law. 

Other key recommendations are directed towards accountability and transparency, including creating a public AI registry of tools used by entities operating in Canada, increasing engagement of civil society stakeholders in existing AI impact assessment processes, and establishing “robust policy measures within the public sector for the use of facial recognition technology which could” [we would say should] “include immediate and advance public notice and public comment, consultation with marginalized groups and independent oversight mechanisms.” 

To address the well-known risks of bias and discriminatory impacts of FRT systems, there are recommendations calling for the government to invest in studying the impact of artificial intelligence on different groups, and to ensure “full and transparent disclosure of racial, age or other unconscious biases that may exist in facial recognition technology used by the government, as soon as the bias is found in the context of testing scenarios or live applications of the technology, subject to national security concerns.” Of course, the national security carve-out is problematic, given that risks of discrimination run particularly high in contexts like border security. 

There is a range of recommendations addressing the need to legislate protections, including: 

  • Defining acceptable uses of facial recognition technology or other algorithmic technologies and prohibiting other uses, including mass surveillance; 
  • Requiring that, before adopting or using FRT, agencies consult the Privacy Commissioner and file privacy impact assessments; 
  • Updating the Canadian Human Rights Act to ensure that it applies to discrimination caused by the use of facial recognition technology and other artificial intelligence technologies; 
  • Implementing a right to erasure of personal information, including images; 
  • Implementing an opt-in-only requirement for the collection of biometric information by private sector entities and not allowing those entities to require a biometric as a condition of service; 
  • Strengthening the ability of the Privacy Commissioner to levy meaningful penalties for breaking the law; 
  • Developing a regulatory framework concerning uses, prohibitions, oversight and privacy of facial recognition technology, including proactive oversight measures. 

And finally, one of the most important recommendations addresses the need to protect photos of people, whether posted online or captured in public spaces, from being scooped up non-consensually for use in AI systems. Recommendation 17 states: That the Government of Canada amend the Privacy Act and the Personal Information Protection and Electronic Documents Act to prohibit the practice of capturing images of Canadians from the internet or public spaces for the purpose of populating facial recognition technology databases or artificial intelligence algorithms. 

The ETHI Committee has done a good job of synthesizing and reflecting the concerns of the witnesses who appeared before them, and has put forward recommendations that would go a long way towards protecting people across Canada from rights-infringing uses of FRT and other AI-driven technologies. It is now up to the government to take up the recommendations. The report is timely, with the new private-sector privacy bill C-27 working its way toward Committee. Amendments to that bill could and should be made in response to some of these recommendations, and Privacy Act reforms are similarly indicated. While we wait for that legislation, the moratorium on federal policing services’ and Canadian industries’ use of FRT recommended by ETHI and supported by a wide range of witnesses, including CCLA, should be called immediately. As the Committee notes, harms are imminent without swift action. 

About the Canadian Civil Liberties Association

The CCLA is an independent, non-profit organization with supporters from across the country. Founded in 1964, the CCLA is a national human rights organization committed to defending the rights, dignity, safety, and freedoms of all people in Canada.

For the Media

For further comments, please contact us at media@ccla.org.

For Live Updates

Please keep referring to this page and to our social media platforms. We are on Instagram, Facebook, and Twitter.
