
Jun 18 (IPS) – CIVICUS discusses the dangers of live facial recognition technology with Madeleine Stone, Senior Advocacy Officer at Big Brother Watch, a civil society organisation that campaigns against mass surveillance and for digital rights in the UK.

The rapid expansion of live facial recognition technology across the UK raises urgent questions about civil liberties and democratic freedoms. The Metropolitan Police have begun permanently installing live facial recognition cameras in South London, while the government has launched a £20 million (approx. US$27 million) tender to expand its deployment nationwide. Civil society warns that this technology presents serious risks, including privacy infringements, misidentification and function creep. As authorities increasingly use these systems at public gatherings and demonstrations, concerns grow about their potential to restrict civic freedoms.
How does facial recognition technology work?
Facial recognition technology analyses an image of a person’s face to create a biometric map by measuring distances between facial features, creating a pattern as distinctive as a fingerprint. This biometric data is converted into code for matching against other facial images.
It has two main applications. One-to-one matching compares someone’s face to a single image – like an ID photo – to confirm identity. More concerning is one-to-many matching, where facial data is scanned against larger databases. This form is commonly used by law enforcement, intelligence agencies and private companies for surveillance.
How is it used in the UK?
The technology operates in three distinct ways in the UK. Eight police forces in England and Wales currently deploy it, with many others considering adoption. In retail, shops use it to scan customers against internal watchlists.
The most controversial is live facial recognition – mass surveillance in real time. Police use CCTV cameras with facial recognition software to scan everyone passing by, mapping faces and instantly comparing them to watchlists of wanted people for immediate interception.
Retrospective facial recognition works differently, taking still images from crime scenes or social media and running them against existing police databases. This happens behind closed doors as part of broader investigations.
And there’s a third type: operator-initiated recognition, where officers use a phone app to take a photo of someone they’re speaking with on the street, which is checked against a police database of custody images in real time. While it doesn’t involve continuous surveillance like live facial recognition, it still takes place in the moment and raises significant concerns about the police’s power to carry out biometric identity checks at will.
What makes live facial recognition particularly dangerous?
It fundamentally violates democratic principles, because it conducts mass identity checks on everyone in real time, regardless of suspicion. This is the equivalent of police stopping every passerby to check DNA or fingerprints. It gives police extraordinary power to identify and track people without their knowledge or consent.
The principle at the heart of any free society is that suspicion should come before surveillance, but this technology completely reverses that logic. Instead of investigating after reasonable cause, it treats everyone as a potential suspect, undermining privacy and eroding the presumption of innocence.
The threat to civic freedoms is severe. Anonymity in crowds is central to protest, because it makes you part of a collective rather than an isolated dissenter. Live facial recognition destroys this anonymity and creates a chilling effect: people become less likely to protest, knowing they’ll be biometrically identified and tracked.
Despite the United Nations warning against the use of biometric surveillance at protests, UK police have deployed it at demonstrations against arms fairs, environmental protests at Formula One events and during King Charles’s coronation. Similar tactics are being introduced at Pride events in Hungary and were used to track people attending opposition leader Alexei Navalny’s funeral in Russia. That these authoritarian methods now appear in the UK, supposedly a rights-respecting democracy, is deeply concerning.
What about accuracy and bias?
The technology is fundamentally discriminatory. While algorithm details remain commercially confidential, independent studies show significantly lower accuracy for women and people of colour, as algorithms have largely been trained on white male faces. Despite improvements in recent years, the performance of facial recognition algorithms remains worse for women of colour.
This bias compounds existing police discrimination. Independent reports have found that UK policing already exhibits systemic racist, misogynistic and homophobic biases. Black communities face disproportionate criminalisation, and biased technology deepens these inequalities. Live facial recognition technology can produce discriminatory outcomes even with a hypothetically perfectly accurate algorithm. If police watchlists were to disproportionately feature people of colour, the system would repeatedly flag them, reinforcing over-policing patterns. This feedback loop validates bias through the constant surveillance of the same communities.
Deployment locations reveal targeting patterns. London police use mobile units in poorer areas with higher populations of people of colour. One of the earliest deployments was during Notting Hill Carnival, London’s largest celebration of Afro-Caribbean culture – a decision that raised serious targeting concerns.
Police claims of improving reliability ignore this systemic context. Without confronting discrimination in policing, facial recognition reinforces the very injustices it claims to address.
What legal oversight exists?
None. Without a written constitution, UK policing powers have developed through common law. Police therefore argue that vague common law powers to prevent crime govern their use of facial recognition, falsely claiming it enhances public safety.
Parliamentary committees have expressed serious concerns about this legal vacuum. Currently, each police force creates its own rules, deciding deployment locations, watchlist criteria and safeguards. They even use different algorithms with varying accuracy and bias levels. For such intrusive technology, this patchwork approach is unacceptable.
A decade after police trials began in 2015, successive governments have failed to introduce regulation. The new Labour government is considering rules, but we don’t know whether this means comprehensive legislation or mere codes of practice.
Our position is clear: this technology should not be used at all. However, if a government believes there is a case for using this technology in policing, there must be primary legislation in place that specifies usage parameters, safeguards and accountability mechanisms.
The contrast with Europe is stark. While imperfect, the European Union’s (EU) AI Act introduces robust safeguards on facial recognition and remote biometric identification. The EU is miles ahead of the UK. If the UK is going to legislate, it should take inspiration from the EU’s AI Act and ensure that prior judicial authorisation is required for the use of this technology, that only those suspected of serious crimes are placed on watchlists, and that it is never used as evidence in court.
How are you responding?
Our strategy combines parliamentary engagement, public advocacy and legal action.
Politically, we work across party lines. In 2023, we coordinated a cross-party statement signed by 65 members of parliament (MPs) and backed by dozens of human rights groups, calling for a halt due to racial bias, legal gaps and privacy threats.
On the ground, we attend deployments in Cardiff and London to monitor usage and offer legal support to people wrongly stopped. Reality differs sharply from police claims. Over half of those stopped aren’t wanted for arrest. We’ve documented shocking cases: a pregnant woman pushed against a shopfront and arrested for allegedly missing probation, and a schoolboy misidentified by the system. The most disturbing cases involve young Black people, demonstrating embedded racial bias and the dangers of trusting flawed technology.
We’re also supporting a legal challenge brought by Shaun Thompson, a volunteer youth worker wrongly flagged by this technology. Police officers surrounded him and, although he explained the error, held him for 30 minutes and tried to take his fingerprints when he couldn’t produce ID. Our director filmed the incident and is a co-claimant in a case against the Metropolitan Police, arguing that live facial recognition violates human rights law.
Public support is crucial. You can follow us online, join our supporters’ scheme or donate monthly. UK residents should write to their MPs and the Policing Minister. Politicians need to hear all of our voices, not just those of police forces advocating for more surveillance powers.
© Inter Press Service (2025) — All Rights Reserved. Original source: Inter Press Service