Eleven civil rights and anti-racism organizations are urging Metropolitan Police Commissioner Mark Rowley to cancel plans for live facial recognition (LFR) at next weekend’s Notting Hill Carnival, citing racial bias in the technology and an ongoing legal challenge.
In a letter obtained by The Guardian, the groups warn that using instant face-scanning cameras at an event celebrating the African-Caribbean community “will only worsen concerns about police abuse of power and racial discrimination within your force.”
Signatories—including the Runnymede Trust, Liberty, Big Brother Watch, Race on the Agenda, and Human Rights Watch—argue that the technology is “less accurate for women and people of color.”
The demand follows recent government moves to expand facial recognition vans across nine police forces in England and Wales.
The Met previously announced it would place special cameras at entry and exit points of the two-day carnival in west London, which draws up to 2 million people annually—making it the world’s second-largest street festival.
The letter states: “Deploying LFR at Notting Hill Carnival unfairly targets the very community the event celebrates. The Met has already been found institutionally racist in Baroness Casey’s review, and discriminatory policing has severely damaged public trust. Using this technology here will only deepen concerns about racial bias and misuse of power.”
The letter also notes that since the Met’s announcement, anti-knife campaigner Shaun Thompson has launched a High Court challenge. Thompson, a Black British man, was wrongly flagged as a criminal by facial recognition, detained by police, and pressured to provide fingerprints.
“Mr. Thompson was returning from volunteering with Street Fathers, a youth anti-violence group, when officers surrounded him and held him for 30 minutes. He compares LFR’s discriminatory impact to ‘stop and search on steroids’,” the letter says.
Campaigners argue police have been allowed to “self-regulate” the technology, despite past settings that disproportionately misidentified Black individuals. A National Physical Laboratory report found the Met’s NeoFace system is less accurate for women and people of color at certain settings.
Dr. Tony Mansfield, the report’s author, acknowledged that at lower thresholds, the system “shows bias against Black males and females.” Police are not legally required to use higher, more accurate settings.
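The threshold trade-off described here can be illustrated with a toy sketch. This is not the Met's NeoFace pipeline; the embeddings, names, and threshold values below are invented purely for illustration. The point is mechanical: a live system compares each scanned face against a watchlist and declares a match when similarity clears a threshold, so lowering that threshold flags more people and raises the risk of false matches like the one Mr. Thompson experienced.

```python
# Illustrative sketch only -- NOT the Met's NeoFace system.
# All names, embeddings, and thresholds are made up for demonstration.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def matches(probe, watchlist, threshold):
    """Return watchlist entries whose similarity to the probe face
    meets or exceeds the decision threshold."""
    return [name for name, emb in watchlist
            if cosine_similarity(probe, emb) >= threshold]

# Fictional watchlist of (name, embedding) pairs.
watchlist = [
    ("suspect_a", [0.9, 0.1, 0.3]),
    ("suspect_b", [0.2, 0.8, 0.5]),
]
probe = [0.85, 0.2, 0.35]  # embedding of a face seen by the camera

# A strict (high) threshold flags no one here; a loose (low) threshold
# flags both entries -- including a much weaker, likelier-false match.
strict = matches(probe, watchlist, threshold=0.995)
loose = matches(probe, watchlist, threshold=0.50)
```

Because operators choose the threshold, the same system can behave very differently in the field, which is why campaigners object that police are not legally required to run at the higher, more accurate settings.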
In 2018, an MIT Media Lab study found facial recognition from three major companies misidentified darker-skinned women 21-35% of the time, compared to less than 1% for light-skinned men.
Other signatories include leaders from Privacy Watch, Privacy International, Race Equality First, Open Rights Group, Access Now, StopWatch, and Statewatch.
The Met has stated the cameras will be placed near carnival boundaries—not inside—to help officers “identify and intercept” individuals posing public safety risks, such as those wanted for serious crimes like knife offenses or sexual assaults.
However, civil liberties groups were alarmed to learn Welsh police previously used the technology to target ticket scalpers.
Around 7,000 police officers and staff will be on duty throughout this year's carnival, according to the Metropolitan Police. Live facial recognition (LFR) cameras near the event will scan for missing persons and individuals subject to sexual harm prevention orders. Screening arches will be set up at major entry points, where officers will use stop-and-search powers.
While the carnival remains community-led, senior officials have raised safety concerns, suggesting it should either move to Hyde Park or require tickets to prevent overcrowding in narrow streets.
Home Secretary Yvette Cooper recently announced plans for new regulations governing LFR use, stating: “Facial recognition will target sex offenders and those wanted for serious crimes who have evaded capture.”
The Met and South Wales Police have been trialing the technology. According to the Met, LFR has helped make 580 arrests for offenses including rape, domestic abuse, knife crime, assault, and robbery—52 of which involved registered sex offenders violating their restrictions.
Matt Ward, the Met’s Deputy Assistant Commissioner overseeing carnival policing, acknowledged lingering concerns about LFR in Black and minority ethnic communities:
“We’re committed to using technology responsibly to help officers work more effectively. That’s why we’ll deploy LFR cameras near—but not inside—the carnival area. This proven tool has facilitated over 1,000 arrests since January 2024.
“National Physical Laboratory tests confirm our system maintains accuracy across ethnicities and genders at current settings. However, we recognize misconceptions persist, particularly in communities of color, and we’re working to address these.”
### **FAQs About Facial Recognition at Notting Hill Carnival & Racial Bias Concerns**
#### **Basic Questions**
**1. Why are campaigners against facial recognition at Notting Hill Carnival?**
Campaigners argue that the technology is racially biased, misidentifying people of color more often, leading to unfair targeting and wrongful stops by police.
**2. What is facial recognition technology?**
Facial recognition uses AI to scan and match faces in crowds against databases, often for security or policing purposes.
**3. How does racial bias affect facial recognition?**
Studies show the technology has higher error rates for darker-skinned individuals, increasing the risk of false arrests or harassment.
**4. Has facial recognition been used at Notting Hill Carnival before?**
Yes, police have tested it in past years, but critics say it unfairly targets Black attendees due to inaccuracies.
**5. What do supporters say about using facial recognition at the carnival?**
Supporters claim it helps prevent crime and identify offenders, but opponents say the risks of bias outweigh benefits.
---
#### **Advanced Questions**
**6. What evidence shows facial recognition is racially biased?**
Research has found error rates of up to 34% for darker-skinned women, compared with under 1% for lighter-skinned men.
**7. Are there laws regulating facial recognition at public events in the UK?**
Currently, UK police can use live facial recognition in public, but critics demand stricter rules to prevent misuse.
**8. Have there been wrongful arrests due to facial recognition?**
Yes—several cases, like a Black man wrongly detained in London, show the risks of false matches.
**9. What alternatives exist for security at Notting Hill Carnival?**
Options include community policing, better crowd management, and non-biased surveillance methods.
**10. How can people protest against facial recognition at the carnival?**
Groups like Big Brother Watch campaign for bans, while attendees can voice concerns to local officials and join protests.
**11. Do other countries restrict facial recognition at public events?**
Yes—some US cities and the EU are considering bans or strict limits due to privacy and bias concerns.
**12. Can facial recognition be improved to reduce bias?**
Experts say better training data and transparency could help, but many argue the risks remain too high for public use.