AI policing: calls for regulation of facial recognition technology are growing

Some police services in Canada are using facial recognition technology to solve crimes, while other police services say human rights and privacy concerns prevent them from using these powerful digital tools.

It’s this uneven application of the technology — and the vague rules governing its use — that have led legal experts and AI experts to call on the federal government to establish national standards.

“Until there is better management of the risks associated with the use of this technology, there should be a moratorium or set of prohibitions on how and where it can be used,” says Kristen Thomasen, a law professor at the University of British Columbia.

Furthermore, the patchwork of regulations surrounding new biometric technologies has created situations in which the privacy rights of some citizens are better protected than others.

“I think the fact that different police forces are taking different actions raises concerns about the disparities in how people are being treated across the country, but it also underscores the continued importance of some kind of federal action,” she said.

Facial recognition systems are a form of biometric technology that uses artificial intelligence to identify people by comparing images or videos of their faces, often captured by security cameras, with existing images in databases. The technology is a controversial tool in the hands of police.
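Conceptually, such systems convert each face image into a numeric "embedding" vector and look for the closest vector in a database. The sketch below illustrates only that matching step, with toy three-dimensional vectors and a hypothetical threshold; real systems use neural networks that produce vectors of hundreds of dimensions, and calibrated decision thresholds.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.8):
    """Return the database entry most similar to the probe embedding,
    or None if no entry clears the threshold."""
    best_id, best_score = None, threshold
    for person_id, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy embeddings standing in for neural-network face vectors.
database = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.1, 0.8, 0.5],
}
probe = [0.85, 0.15, 0.35]  # embedding of a face captured on camera
print(best_match(probe, database))  # → person_a
```

The choice of threshold is where the civil-liberties debate bites: set it low and the system produces more false matches, which is precisely the risk critics cite when such tools are used on footage of the general public.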

In 2021, the Office of the Privacy Commissioner of Canada concluded that the RCMP violated privacy laws by using the technology without public knowledge. The same year, Toronto police admitted that some of their officers used facial recognition software without informing their commanding officer. In both cases, the technology was provided by US company Clearview AI, whose database consisted of billions of images taken from the internet without the consent of those whose images were used.

Last month, police in York and Peel, Ontario, said they had begun implementing facial recognition technology provided by French multinational Idemia. In an interview, York police officer Kevin Nebrija said the tools “help speed up investigations and identify suspects sooner,” adding that when it comes to privacy, “nothing has changed because security cameras are everywhere.”

But in neighbouring Quebec, Montreal Police Chief Fady Dagher says police won’t adopt such biometric identification tools without a debate on issues ranging from human rights to privacy.

“It will take a lot of discussion before we think about implementing it,” Dagher said in a recent interview.

Nebrija noted that the force has consulted with Ontario’s privacy commissioner on best practices, adding that the footage police capture will be “obtained legally,” either by working with security camera owners or by obtaining court orders for the footage.

And while York police insist officers will seek judicial authority, Kate Robertson, a senior researcher at the University of Toronto’s Citizen Lab, says Canadian police forces tend to do just the opposite.

Following revelations about Toronto police’s use of Clearview AI between 2019 and 2020, Robertson said she was not yet aware of any police service in Canada that has received pre-approval from a judge to use facial recognition technology in investigations.

According to Robertson, getting the green light from the court, usually in the form of a warrant, is “the gold standard for privacy in criminal investigations.” This ensures that a facial recognition tool, when used, is properly balanced against the right to freedom of expression, freedom of assembly and other rights enshrined in the Charter.

While the federal government does not have jurisdiction over provincial and municipal police forces, it can amend the Criminal Code to incorporate legal requirements for facial recognition software in the same way it has updated the law to address voice recording technologies that could be used for surveillance.

In 2022, the chairs of the federal, provincial and territorial privacy commissions called on legislators to establish a legal framework for the appropriate use of facial recognition technology, including strengthening independent oversight bodies, banning mass surveillance and limiting the retention period of images in databases.

Meanwhile, the federal department responsible for innovation, science and economic development said Canadian law could “potentially” regulate the collection of personal information by businesses under the Personal Information Protection and Electronic Documents Act, or PIPEDA.

“If, for example, a police force, including the Royal Canadian Mounted Police (RCMP), were to outsource activities using personal information to a private company conducting commercial activities, those activities could potentially be regulated by PIPEDA, including services related to facial recognition technologies,” the department said.

The Quebec provincial police also has a contract with Idemia, but has not explained exactly how it uses the company’s technology.

In an emailed statement, police said their “automated facial recognition system is not used to verify the identity of individuals.” The tool is used for criminal investigations and is limited to data cards of individuals whose fingerprints have been taken under the Criminal Identification Act.

Ana Brandusescu, an expert on AI governance, says Ottawa and the country’s police forces have failed to listen to calls for better governance, transparency and accountability in the purchase of facial recognition technology.

“Law enforcement is not listening to academics, to civil society experts, to people with lived experience, to people who have been directly harmed,” she said.


This report by The Canadian Press was first published June 30, 2024.

Chelsea Glisson
