Police facial recognition in the UK: what you need to know

UK police have been using live facial recognition (LFR) technology for the best part of a decade, with the Met being the first force to deploy it at Notting Hill Carnival in 2016.

Since then, the Met's use of the biometric surveillance and recognition tool has ramped up considerably. While the first deployments were sparse, happening only every few months, they are now routine, with facial recognition-linked cameras regularly deployed to events and busy areas of London.

Similarly, while South Wales Police (SWP) – the only other force in England and Wales to have officially deployed the “live” version of facial recognition – used the technology much more extensively than the Met during its first roll-outs in 2017, it too is now deploying it with much greater frequency.

From the police’s perspective, the main operational benefits of facial recognition include the ability to find people they otherwise would not be able to (whether for safeguarding or apprehending offenders), and its use as a preventative measure to deter criminal conduct.

Almost immediately, however, the technology proved controversial. Out of the gate, police facial recognition was derided for having no firm legal basis, poor transparency and questionable accuracy (especially for women and people with darker skin tones), all while being rolled out with zero public or Parliamentary debate.

The Met’s choice to first deploy the technology at Carnival – the biggest Afro-Caribbean cultural event in Europe and the second-largest street carnival in the world outside of Brazil – also attracted accusations of institutional racism.

In the case of SWP, its use of live facial recognition against activists protesting an arms fair in Cardiff ultimately led to a legal challenge.

In August 2020, the Court of Appeal concluded that SWP’s use of the tech up until that point had been unlawful, because the force had failed to conduct an appropriate Data Protection Impact Assessment (DPIA) and to comply with its Public Sector Equality Duty (PSED) to consider how its policies and practices could be discriminatory.

Although the court also concluded that SWP had violated the claimant’s privacy rights, the judgment ultimately found the problem was with how the technology had been approached and deployed by police, rather than with the technology itself.

In this essential guide, learn how the police have been approaching the technology, the ongoing concerns around its proportionality, necessity and efficacy, and the direction of travel set for 2024 and beyond.

What is facial recognition?

While LFR has received the most public attention and scrutiny, other facial recognition techniques have also started gaining popularity among UK law enforcement.

With LFR, the technology essentially acts as a biometric police checkpoint, with a facial recognition-linked camera scanning public spaces and crowds to identify people in real time by matching their faces against a database of images compiled by police.

Otherwise known as a “watchlist”, these databases are primarily composed of custody photos and can run into thousands of images for any given LFR deployment, but are deleted after each operation along with any facial images captured during it.
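
At its core, the matching step can be illustrated as comparing a numerical “embedding” of each detected face against the embeddings of the watchlist images, flagging a match only above a similarity threshold. The minimal Python sketch below is purely illustrative – the commercial systems used by UK forces are proprietary, and the embedding model, function names and threshold here are all assumptions:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.6) -> str | None:
    """Return the watchlist ID of the closest match above the threshold, or None.

    `face` and the watchlist values stand in for embeddings produced by some
    face recognition model; the 0.6 threshold is an arbitrary placeholder.
    """
    best_id, best_score = None, threshold
    for person_id, enrolled in watchlist.items():
        score = cosine_similarity(face, enrolled)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Random stand-ins for real embeddings (zero-mean, so unrelated faces score
# near 0). Per the deployment rules described above, a real watchlist would
# be deleted once the operation ends.
rng = np.random.default_rng(0)
watchlist = {"custody-0001": rng.standard_normal(128),
             "custody-0002": rng.standard_normal(128)}
print(match_against_watchlist(rng.standard_normal(128), watchlist))  # likely None
```

In a real deployment, the threshold setting drives the trade-off between missed matches and false alerts – a point that resurfaces in the accuracy debate below.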

The second method is retrospective facial recognition (RFR). While it works in a similar fashion to LFR by scanning faces and matching them against a watchlist, RFR can be applied to any already-captured images retroactively.

Unlike LFR, which is used overtly with specially equipped cameras atop a visibly marked police van, RFR use is much more covert, and can be applied to footage or images behind closed doors without any public knowledge that the surveillance has taken place.

Critics are particularly concerned by the increasing use of this technology, because the sheer number of image and video-capturing devices in the modern world – from phones and social media to smart doorbell cameras and CCTV – is creating an abundance of material that can be fed into the software.

There is also concern about what its operation at scale means for human rights and privacy, as it smooths out the various points of friction that have traditionally been associated with conducting mass surveillance.

Looking at operator-initiated facial recognition (OIFR), the newest iteration of facial recognition being rolled out to UK police, the technology works via an app on officers’ phones that allows them to automatically compare photos they’ve taken out in the field with a predetermined watchlist.

While national plans to equip officers with OIFR tools were only announced by UK police chiefs in November 2023, South Wales, Gwent and Cheshire police are already conducting joint trials of the tech.

Why is facial recognition so controversial?

A major question hanging over the police’s use of facial recognition is whether it is actually necessary and proportionate in a democratic society, especially given the lack of public debate about its roll-out.

Before they can deploy any facial recognition technology, UK police forces must ensure their deployments are “authorised by law”, that the consequent interference with rights (such as the right to privacy) is undertaken for a legally “recognised” or “legitimate” aim, and that this interference is both necessary and proportionate. This must be assessed for each individual deployment of the tech.

For example, the Met’s legal mandate document – which sets out the complex patchwork of legislation that covers use of the technology – says the “authorising officers need to decide the use of LFR is necessary and not just desirable to enable the MPS to achieve its legitimate aim”.

Responding to questions about how the force decided each individual deployment was both necessary and proportionate, the Met has given the same answer to Computer Weekly on multiple occasions.

“The deployment was authorised on the basis of an intelligence case and operational necessity to deploy, in line with the Met’s LFR documents,” it said, adding in each case that “the proportionality of this deployment was assessed giving due regard to the intelligence case and operational necessity to deploy, whilst weighing up the impact on those added to the watchlist and those who could be expected to pass the LFR system”.

However, critics have questioned whether scanning tens of thousands of faces every time LFR is used is both a necessary and proportionate measure, particularly when other, less intrusive methods are already available to police.

While there are a number of legally recognised purposes (such as national security, prevention of disorder or public safety) that state authorities can use to intrude on people’s rights, proportionality and necessity tests are already well established in case law, and exist to ensure these authorities do not unduly interfere.

“In the case of police, they’re going to say ‘it’s prevention of disorder or crime, or public safety’, so they get past first base, but then one of the questions is, ‘is this necessary in a democratic society?’” said Karen Yeung, an interdisciplinary professorial fellow in law, ethics and informatics at Birmingham Law School.

“There’s a very rich case law about what that means, but the core test is you can’t use a hammer to crack a nut. Even though a machete might be perfectly good for achieving your task, if a pen knife will do, then you can only use the pen knife, and the use of a machete is unlawful because it’s disproportionate … the basic way of explaining it is that it has to go no further than necessary to achieve the specified goal.”

In the case of RFR, while it has its own separate legal mandate document, there are similarities in the need to establish the purpose and grounds of every search made with the software, as well as the proportionality and necessity of doing so in each case.

There is currently no legal mandate published for OIFR tools, but police chiefs have said this version of the tech won’t be rolled out to forces until sometime in 2024.

Is facial recognition biased or discriminatory?

Closely linked with necessity and proportionality, there is also the question of who the cameras are ultimately aimed at and why. This in turn brings up questions about bias and discrimination, which from the police and government perspective can be solved via improved algorithmic accuracy.

When LFR first began being deployed by UK police, one of the major concerns was its inability to accurately identify women and people with darker skin tones, which led to a number of people being wrongly identified over its first few years of deployment.
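
A rough worked example helps show why even small error rates caused so much concern at the scale LFR operates. The figures below are illustrative assumptions, not measured rates from any police deployment:

```python
# Illustrative back-of-the-envelope calculation; neither figure is an
# official statistic from any UK police deployment.
false_match_rate = 0.001  # assume 0.1% of scans wrongly flag someone
faces_scanned = 50_000    # assume one deployment in a busy area

expected_false_alerts = false_match_rate * faces_scanned
print(f"Expected false alerts: {expected_false_alerts:.0f}")  # -> 50
```

On these assumed numbers, a single deployment could wrongly flag around 50 people – which is part of why accuracy became such a focal point of the debate.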

However, as the accuracy of the algorithms in use by UK police has improved, the concerns have shifted away from questions of algorithmic bias towards deeper questions of structural bias in policing, and how that bias is reflected in its technology practices.

Civil society groups maintain, for example, that the technology is “discriminatory and oppressive” given repeated findings of institutional racism and sexism in the police, and that it will only further entrench pre-existing patterns of discrimination.

Others have argued the point further, saying that accuracy is a red herring. Yeung, for example, has argued that even if LFR technology gets to the point where it is able to identify faces with 100% accuracy 100% of the time, “it would still be a seriously dangerous tool in the hands of the state”, because “it’s almost inevitable” that it would continue to entrench existing power discrepancies and criminal justice outcomes within society.

How do facial recognition watchlists work?

Watchlists are essentially images of people’s faces that facial recognition software uses to determine whether someone passing the camera is a match. While images can come from a range of sources, most are drawn from custody images stored in the Police National Database (PND).

Given the well-documented disproportionality in policing outcomes across different social groups in the UK, the concern is that – in using historical arrest data and custody images to direct where facial recognition should be deployed and who it’s looking for, respectively – people from certain demographics or backgrounds end up populating the watchlists.

“If you think about the disproportionality in stop and search, the numbers of black and brown people, young people, that are being stopped, searched and arrested, then that starts to be really worrying because you start to get disproportionality built into your watchlists,” London Assembly member and chair of the police committee, Caroline Russell, previously told Computer Weekly.

Further, in their appearances before a Lords committee in December 2023, senior officers from the Met and SWP confirmed that both forces use generic “crime categories” to determine targets for their LFR deployments.

This means watchlists are selected based on the crime categories linked to images of people’s faces (which are mostly custody images), rather than based on intelligence about specific individuals deemed a threat.

Another issue with the watchlists is that millions of these custody images are held completely unlawfully, meaning people never convicted of a crime could potentially be included.

In 2012, a High Court ruling found that police retention of custody images was unlawful, because unconvicted people’s information was being kept in the same way as that of those who were ultimately convicted. It also deemed the minimum six-year retention period to be disproportionate.

While the Met’s LFR Data Protection Impact Assessment (DPIA) says that “all images submitted for inclusion on a watchlist must be lawfully held by the MPS”, millions of custody images are still being unlawfully retained.

Writing to other chief constables to outline some of the issues around custody image retention in February 2022, the National Police Chiefs’ Council (NPCC) lead for records management, Lee Freeman, said the potentially unlawful retention of an estimated 19 million images “poses a significant risk in terms of potential litigation, police legitimacy and wider support and challenge in our use of these images for technologies such as facial recognition”.

In November 2023, the NPCC confirmed to Computer Weekly that it has launched a programme that (while still not yet publicised) will seek to establish a management regime for custody images, alongside a review of all data currently held by police forces in the UK. This will be implemented over a two-year period.

Is facial recognition effective?

Beyond these issues, there are open questions about the effectiveness of facial recognition in policing.

Speaking with Computer Weekly, the former biometrics and surveillance camera commissioner, Fraser Sampson, questioned the ways facial recognition has been deployed by police, noting the thinness of the evidential basis around its effectiveness in tackling serious crimes.

He said that on the one hand, there are arguments from critics that UK police “never really seem to catch anyone significant using it, let alone very dangerous or high-harm offenders”, but on the other, those in policing will argue this is because it has been deployed so infrequently and on relatively few people that “we’re not going to have very spectacular results, so therefore, we’ve got to use it more to prove the case more”.

Given the Home Office’s repeated claim that LFR is a valuable crime prevention tool capable of stopping terrorists, rapists and other violent offenders, others have also questioned its effectiveness for this stated purpose, given that the majority of arrests made are for other offences, such as drug possession, not appearing in court or traffic violations.

Sampson has said the overt nature of the deployments – whereby police forces are required to publicly state when and where they are using the technology – can also hinder effectiveness, because it means wanted people will simply avoid the area.

He added that the argument then becomes about making the capability more covert to avoid this pitfall: “Then it becomes very sinister … you can’t just avoid one town, because it could be looking for you anywhere. The use case has made itself on that argument.”

Sampson further challenged the technology’s crime prevention capabilities on the basis that authorities are largely relying on its chilling effect, rather than its actual effectiveness in identifying wanted individuals. He said the logic here is that people “might behave” if they know the police have a certain capability and might be using it.

“It’s really challenging for the police then to find the evidence that it can work when used properly, without having to throw away all the safeguards to prove it, because once they’re gone, they’re gone,” said Sampson.

Is facial recognition legal?

There is no dedicated legislation in the UK to govern police use of facial recognition technologies.

According to the Met Police’s legal mandate for LFR, the tech is regulated by a patchwork of the Police and Criminal Evidence Act (PACE) 1984; the Data Protection Act 2018; the Protection of Freedoms Act 2012; the Equality Act 2010; the Investigatory Powers Act 2016; the Human Rights Act 1998; and common law powers to prevent and detect crime.

“These sources of law combine to provide a multi-layered legal structure to use, regulate and oversee the use of LFR by law enforcement bodies,” it says.

While the mandate also specifically references the Surveillance Camera Code of Practice as one of the “secondary legislative instruments” in place to regulate LFR use, the code is set to be abolished without replacement under the UK government’s data reforms.

Both Parliament and civil society have repeatedly called for new legal frameworks to govern law enforcement’s use of biometrics – including an official inquiry into police use of advanced algorithmic technologies by the Lords Justice and Home Affairs Committee (JHAC); two of the UK’s former biometrics commissioners, Paul Wiles and Fraser Sampson; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

During his time in office before resigning in October 2023, Sampson also highlighted a lack of clarity about the scale and extent of public space surveillance, as well as concerns around the general “culture of retention” in UK policing around biometric data.

Throughout the JHAC inquiry – which described the police use of algorithmic technologies as a “new wild West” characterised by a lack of strategy, accountability and transparency from the top down – Lords heard from expert witnesses that UK police are introducing new technologies with very little scrutiny or training, continuing to deploy them without clear evidence about their efficacy or impacts, and have conflicting interests with their own tech suppliers.

In a short follow-up inquiry, this time looking exclusively at facial recognition, the JHAC found that police are expanding their use of LFR without proper scrutiny or accountability, despite lacking a clear legal basis for their deployments. The committee also specifically called into question whether LFR is even legal.

The committee added that, looking to the future, there is a real possibility of networked facial recognition cameras capable of trawling entire regions of the UK being introduced, and that there is nothing in place to regulate this potential development.

Despite myriad calls for a new legislative framework from different quarters, government ministers have claimed on multiple occasions that there is a sound legal basis for LFR in the UK, and that “a comprehensive network of checks and balances” already exists.

What are police doing next with facial recognition?

Despite open questions about the legality of police facial recognition tools, the UK government has not been deterred from pushing for much wider adoption of the technology.

In November 2023, for example, NPCC chair Gavin Stephens noted facial recognition would play a “significant role” in helping UK policing become “an effective science-led service”.

In May 2023, an interim report into upcoming UK government data reforms revealed that policing minister Chris Philp was pushing for facial recognition technology to be rolled out by police forces across England and Wales, and would likely push to integrate the tech with police body-worn video cameras.

He later wrote to police chiefs in October 2023 setting out the importance of harnessing new technologies for policing, urging them to double the number of RFR searches they conduct and to deploy LFR much more widely.

At the start of the same month, speaking at a fringe event of the Conservative Party Conference, Philp outlined his plans to integrate data from the PND, the Passport Office and other national databases with facial recognition technology to help catch shoplifters and other criminals.

The plan was met with criticism from campaigners, academics and Scottish biometrics commissioner Brian Plastow, who said the “egregious proposal” to link the UK’s passport database with facial recognition systems was “unethical and potentially unlawful”.

Going forward, there are major concerns about what the UK government’s proposed data reforms mean for police technologies like facial recognition.

Some have argued, for example, that the forthcoming Data Protection and Digital Information Bill will weaken oversight of the police’s intrusive surveillance capabilities if enacted as is, because it would abolish the surveillance camera code of practice and collapse facial recognition into a mere data protection issue under the purview of the Information Commissioner’s Office (ICO).


