Some police departments in Canada are using facial recognition technology to solve crimes, while others say human rights and privacy concerns prevent them from using these powerful digital tools.
It is this uneven application of the technology and the lax rules governing its use that are leading artificial intelligence (AI) legal experts to call on the federal government to set national standards.
“Until we can better control the risks of using this technology, there should be a moratorium or a set of bans on how and where it can be used,” says Kristen Thomasen, a law professor at the University of British Columbia.
The patchwork of regulations governing new biometric technologies has led to situations where the data protection rights of some citizens are better protected than others.
“I think the fact that different police departments are taking different actions raises concerns about inequality in how people are treated across the country, but it also underscores the importance of some kind of federal action,” she stressed.
Facial recognition systems are a form of biometric technology that uses AI to identify people by comparing images or videos of their faces – often captured by security cameras – with existing images of them in databases. This technology in the hands of police is a controversial tool.
Practices vary across the country
In 2021, the Office of the Privacy Commissioner of Canada concluded that the Royal Canadian Mounted Police (RCMP) violated privacy laws by using the technology without public knowledge. That same year, Toronto police admitted that some of its officers had used facial recognition software without informing their chief.
In both cases, the technology was provided by the American company Clearview AI, whose database consisted of billions of images retrieved from the Internet without the consent of those whose images were used.
Last month, police in York and Peel, Ontario, announced they had begun implementing facial recognition technology from French multinational Idemia. In an interview, officer Kevin Nebrija said these tools “help speed up investigations and identify suspects sooner,” adding that in terms of privacy, “nothing has changed because the surveillance cameras are everywhere.”
But in Quebec, the director of the Service de Police de la Ville de Montréal (SPVM), Fady Dagher, says police will not introduce such biometric identification tools without a debate on issues ranging from human rights to privacy.
“This requires a lot of discussion before we even think about implementation,” Mr. Dagher said in a recent interview.
Kevin Nebrija noted that the force had consulted Ontario’s privacy commissioner on best practices, adding that any footage captured by police would be obtained lawfully.
Although York police insist officers will seek court approval, Kate Robertson, senior researcher at the University of Toronto’s Citizen Lab, says Canadian police forces have done just the opposite in the past.
Since the revelations about the Toronto police’s use of Clearview AI between 2019 and 2020, Ms. Robertson said she remains unaware of “any police force in Canada having previously received approval from a judge to use facial recognition technology in their investigations.”
In her view, obtaining a green light from the court, usually in the form of a warrant, represents “the gold standard for privacy protection in criminal investigations.” It ensures that the use of a facial recognition tool respects the Charter rights to freedom of expression, freedom of assembly and other rights.
A legal framework is required
Although the federal government does not have jurisdiction over provincial and municipal police forces, it can amend the Criminal Code to add legal requirements related to facial recognition software, just as it updated the law to account for voice recording technologies that could be used for surveillance purposes.
In 2022, Canada’s federal, provincial and territorial privacy commissioners called on lawmakers to create a legal framework for the appropriate use of facial recognition technology, including empowering independent regulators, prohibiting mass surveillance and limiting how long images can be kept in databases.
Meanwhile, the federal Department of Economic Development said Canadian law could potentially regulate the collection of personal information by companies under the Personal Information Protection and Electronic Documents Act.
“For example, if a police force, including the RCMP, were to outsource activities using personal information to a private company conducting commercial activities, then those activities could be regulated by the (Act), including services associated with facial recognition technologies,” the department said.
The Sûreté du Québec also has a contract with Idemia, but declined to comment on how it uses this technology.
In an emailed statement, the force said its “automatic facial recognition system is not used to verify the identity of individuals,” adding: “This tool is used for criminal investigations and is limited to the records of individuals who have been fingerprinted under the Identification of Criminals Act.”
AI governance expert Ana Brandusescu says Ottawa and the country’s police forces have ignored calls for better governance, transparency and accountability in the procurement of facial recognition technology.
“Law enforcement does not listen to academics, civil society experts, people with lived experience, people who are directly harmed,” she lamented.