In Edmonton, Canada, police are testing body cameras enhanced with artificial intelligence that can recognize the faces of roughly 7,000 individuals on a “high-risk” watch list. The pilot amounts to a live experiment in whether facial recognition technology, viewed by many as overly invasive, could be embraced by law enforcement in North America.
However, the pilot project, which began last week, has raised significant concerns, not just within Edmonton—Canada’s northernmost city with over a million residents—but beyond. Axon Enterprise, a leading body camera manufacturer, stepped back from the technology in 2019 after recognizing serious ethical issues with it. The former chair of Axon’s AI Ethics Committee has expressed unease about the company’s current direction, arguing that there has not been enough public discourse or expert evaluation of the potential social and privacy risks involved.
Barry Friedman, a former board chair and current professor at New York University School of Law, emphasized the necessity of evaluating the tangible benefits against the risks associated with these technologies.
Despite this concern, Axon’s founder and CEO, Rick Smith, characterized the Edmonton initiative as merely an “early field study” rather than a product launch. Its purpose, he said, is to understand how the technology functions and to determine appropriate safeguards for its future use.
Smith noted that testing the technology outside the U.S. could yield independent insights that would reinforce oversight measures for future assessments, including those conducted domestically.
The body camera pilot is intended to bolster officer safety by identifying individuals classified as “red flags”—those deemed potentially violent or dangerous, flight risks, or known high-risk offenders. The watch list currently contains 6,341 names, 724 of whom have serious outstanding warrants, Acting Superintendent Kurt Martin said at a December 2 news briefing.
Anne Lee Cook, Axon’s Director of Responsible AI, stated their focus is on identifying serious criminals through this technology.
If successful, the initiative could fundamentally change police operations worldwide. Axon, best known for its Taser devices, is also a key supplier of body cameras in the U.S. and is expanding its reach into Canada. Just last year, it secured a contract to provide cameras for the Royal Canadian Mounted Police, edging out its rival Motorola Solutions.
Motorola acknowledged that it has the ability to integrate facial recognition into its body cameras but refrains from doing so for pre-identification purposes, citing ethical principles. The company has not, however, ruled out future use.
This coming year, the Alberta government is set to mandate all provincial police services, including in Edmonton, to install body cameras, aiming to enhance transparency in police interactions and improve evidence collection.
While many U.S. communities have welcomed body cameras as accountability tools, the idea of using real-time facial recognition has encountered widespread political disapproval. Pressure from civil liberties advocates and ongoing debates about racial justice led major tech companies, including Axon, to step back from selling or developing facial recognition systems for law enforcement.
Concerns have been raised about the accuracy of the technology, particularly biases in results based on race, gender, and age. Real-time recognition has also proven less accurate than matching faces against ID cards or mugshots.
Several U.S. states and municipalities are looking to restrict the use of facial recognition by police, while the current administration under Donald Trump is attempting to prevent states from regulating AI technologies. In contrast, the European Union has outlawed real-time facial recognition in public spaces unless it’s linked to serious offenses like kidnapping or terrorism.
Meanwhile, in the UK—now outside the EU—authorities have been using the technology on London’s streets for a decade, with around 1,300 arrests made in the past two years, and officials are considering expanding its use nationwide.
Details about the Edmonton pilot remain scarce. Axon has not developed its own facial recognition model, and it has not disclosed which third party is providing the capability.
The Edmonton police plan to run the pilot through December, limiting tests to daylight hours. Superintendent Martin noted that considerations like winter darkness and other environmental factors are critical to evaluating the concept’s success.
Officers using the technology will not know in the field whether a facial recognition match has occurred; the data will be analyzed back at the station. Future versions, however, could alert police to nearby threats in real time. Officers will activate the technology only when responding to specific incidents, not while casually monitoring crowds.
Martin stressed the commitment to respecting individual rights and privacy throughout this process.
The Alberta Information and Privacy Commissioner confirmed receiving a privacy impact assessment from the Edmonton police on December 2. The agency is currently reviewing its requirements for projects involving sensitive personal data.
Temitope Oriola, a criminology professor at the University of Alberta, isn’t surprised by the live facial recognition experiments, citing the existing prevalence of this technology in security settings like airports.
“Edmonton is like a testing ground for this technology,” Oriola remarked, adding that while improvements are possible, its ultimate impact remains uncertain.
He pointed to the historically tense relationship between police and Indigenous and Black communities, particularly after a recent police shooting of a member of the South Sudanese community, and said it remains to be seen whether the new technology improves safety or community relations.
Axon has stumbled with technology rollouts before; in 2022, several ethics committee members resigned over the company’s plans for Taser-equipped drones.
Despite having distanced itself from facial recognition in the past, Axon has pursued controlled lab-based research that CEO Smith says has now matured and is ready for real-world evaluation. Yet the company acknowledges that accuracy is limited by factors such as lighting, distance, and angle, especially for individuals with darker skin tones.
Axon’s protocols mandate that all matches go through human review, and part of the pilot aims to determine the training and supervision those reviewers will need to minimize inherent risks.
Friedman urged Axon to share its findings, saying he wants further evidence that facial recognition technology has improved since the company deemed it unreliable for police use. He also raised concerns about law enforcement pushing for the technology without sufficient public or scientific debate.
“This isn’t solely a decision for law enforcement agencies, nor just businesses,” Friedman noted. “While pilot programs can be beneficial, transparency and accountability are key. Unfortunately, those aspects seem to be lacking. It feels like things are moving too quickly.”