
A van being used by the Metropolitan Police as part of their facial recognition operation is pictured close to the route of the 'King's Procession', a two-kilometre stretch from Buckingham Palace to Westminster Abbey, in central London, on May 6, 2023. — AFP
Outside supermarkets or in festival crowds, millions of people in the UK now have their faces scanned by real-time facial recognition systems. It is the only European country to have deployed the technology on such a scale.
At London's Notting Hill Carnival, where two million people are expected to celebrate Afro-Caribbean culture on Sunday and Monday, facial recognition cameras are being deployed near the entry and exit routes.
Police say the aim is to identify and intercept wanted individuals by scanning faces in the crowds and comparing them against thousands of suspects in a police database.
The technology is "a valuable policing tool that has already been used successfully to locate offenders at crime hotspots, leading to more than a thousand arrests since the start of 2024," said Metropolitan Police chief Mark Rowley.
The technology was first trialled in 2016, and its use has expanded significantly in the UK over the past three years.
According to the NGO Liberty, some 4.7 million faces were scanned in 2024 alone.
British police have deployed live facial recognition systems 100 times since late January, compared with just 10 deployments between 2016 and 2019.
‘Nation of suspects’
Examples include deployments before Six Nations rugby matches and outside two Oasis concerts in Cardiff in July.
When someone on a police "watchlist" passes near the cameras, the AI-powered system, often mounted in a police van, triggers an alert.
Once an officer confirms the match, the suspect can be stopped immediately.
But harvesting biometric data on such a scale on London's streets, as also seen during the coronation of King Charles III in 2023, "treats us as a nation of suspects", said the campaign group Big Brother Watch.
"There is no legislative basis for this, so we have no safeguards to protect our rights, and police have been left to write their own rules," Rebecca Vincent told AFP.
She added that the technology's private use by supermarkets and clothing shops, to tackle a sharp rise in shoplifting, has also raised concerns, with "very little information" available about the data being collected.
Most use Facewatch, a service provider that compiles a list of suspected shoplifters, monitors store entrances and raises an alert if one of them enters the premises.
"It changes what it is to live in a city, because it eliminates the possibility of being anonymous," said Daragh Murray, a human rights law lecturer at Queen Mary University of London.
"That can have really significant implications for protest, but also for participating in political and cultural life," he added.
Often, shoppers in such stores do not know they are being profiled.
"They should inform people about it," Abigail Beaven, 26, told AFP, noticing a Facewatch sign at the entrance of a London store.
She said she was "very surprised" to learn how widely the technology was being used.
While acknowledging it could be useful to the police, she complained that its deployment by retailers was "unpleasant".
Banned in the European Union
Since February, the European Union's legislation governing artificial intelligence has banned the use of real-time facial recognition technology, with exceptions such as counter-terrorism.
Apart from some cases in the United States, "we do not see anything comparable in European countries or other democracies," Vincent stressed.
Interior minister Yvette Cooper recently promised that a "legal framework" governing its use would be drafted, focusing on "the most serious crimes".
But her ministry this month gave police forces the authority to use the technology in seven new areas.
The cameras, usually mounted in vans, will next month be installed permanently for the first time, in Croydon in south London.
Police insist there are "robust safeguards", such as switching the cameras off when officers are not present and deleting the biometric data of people who are not suspects.
However, the UK's human rights regulator said Wednesday that the Metropolitan Police's policy on the use of the technology was "unlawful" because it was "incompatible with human rights rules".
Eleven organizations, including Human Rights Watch, wrote a letter to the Metropolitan Police chief urging him not to use the technology during the Notting Hill Carnival, accusing the force of "unfairly targeting the Afro-Caribbean community" and highlighting the AI's racial bias.
Among them was Shaun Thompson, a 39-year-old Black Londoner who was stopped by police after a camera wrongly identified him as a suspect, and who has brought a legal challenge against the force.