The Metropolitan Police are using artificial intelligence tools from Palantir to identify potential officer misconduct.

The Guardian has learned that Scotland Yard is using artificial intelligence tools from the U.S. company Palantir to monitor officer behavior in an effort to identify problematic staff. The Metropolitan Police had previously refused to confirm or deny its use of technology from Palantir, which also works with the Israeli military and Donald Trump’s ICE operation. It has now acknowledged using Palantir’s AI to analyze internal data on sickness, absences, and overtime patterns to spot potential lapses in professional standards.

The Police Federation, representing rank-and-file officers, criticized the approach as “automated suspicion.” It warned that officers should not be subject to opaque or untested tools that might misinterpret heavy workloads, sickness, or overtime as signs of misconduct.

With 46,000 officers and staff, the Met is the UK’s largest police force and has faced numerous controversies, including failures in vetting—highlighted by Wayne Couzens’ murder of Sarah Everard—and tolerance of discriminatory and misogynistic behavior.

The force stated that evidence suggests a link between high levels of sickness, increased absences, or unusually high overtime and failures in standards, culture, and behavior. The time-limited pilot of Palantir’s technology aims to combine data from existing internal databases to help identify such patterns, as part of broader efforts to improve standards and culture. It emphasized that while Palantir’s systems help spot patterns, it is officers who investigate further and make any determinations on standards or performance.

A Police Federation spokesperson said any system profiling officers with algorithms must be treated with extreme caution, noting that policing already operates under intense scrutiny. The focus, they argued, should remain on proper supervision, fair processes, and human judgment, rather than automating suspicion.

Palantir has also been drawn into the controversy over Peter Mandelson’s former role as Keir Starmer’s ambassador to the U.S., before he was dismissed over his links to Jeffrey Epstein. A lobbying firm co-owned by Mandelson, Global Counsel, works for Palantir, which was co-founded by Trump-supporting billionaire Peter Thiel. Mandelson and Starmer visited Palantir’s Washington showroom last year and met its CEO, Alex Karp, shortly after Mandelson’s appointment.

MPs have called for greater transparency over Palantir’s UK public sector contracts, including a £330 million deal with the NHS in November 2023 for a data platform and a £240 million contract with the Ministry of Defence agreed in December 2025.

Responding to the Met’s pilot, Liberal Democrat MP Martin Wrigley, a member of the Commons science committee, expressed concern about officers’ rights as employees, noting that workplace surveillance has long been controversial. He questioned who is overseeing Palantir, given its expanding role in government.

Palantir’s AI is already used by several other police forces to assist investigations through regional units. Last month, Labour’s policing white paper committed to supporting the responsible adoption of AI at pace and scale, planning to invest over £115 million over three years to develop and roll out AI tools across all 43 police forces in England and Wales.

A Palantir spokesperson said: “We are proud that our software helps improve public services across the UK. This includes enhancing police operations, supporting the NHS in performing more surgeries, and enabling Royal Navy ships to spend more time at sea.”

Frequently Asked Questions
FAQs: The Met Police’s Use of Palantir AI to Identify Officer Misconduct

Beginner-Level Questions

1. What is this story about?
The Metropolitan Police is using artificial intelligence software developed by the data analytics company Palantir to analyze internal data and flag patterns that might indicate potential misconduct by officers.

2. What is Palantir?
Palantir is a US-based technology company that specializes in big data analytics. Its software is designed to find patterns, links, and insights within massive and complex datasets.

3. Why is the Met Police using AI for this?
The goal is to proactively spot concerning patterns, such as high sickness levels, increased absences, or unusually high overtime, that might otherwise go unnoticed across separate internal records. It’s part of a broader effort to improve standards and culture and to rebuild public trust.

4. What kind of data is the AI looking at?
The pilot combines data from existing internal databases, including records of sickness, absences, and overtime patterns.

5. Is this AI making decisions about officers?
No. The AI is a tool for flagging and analysis. It highlights potential risks or patterns for human reviewers to investigate further; a human makes the final decision on any determination about standards or performance.

Advanced and Concern-Based Questions

6. What are the main benefits of using this system?
Proactive detection: It can spot subtle patterns that develop over time, which manual reviews might miss.
Efficiency: It helps prioritize cases by analyzing vast amounts of data much faster than humans alone.
Consistency: It applies the same analytical criteria across the entire force, reducing subjective bias in initial flagging.
Deterrence: The knowledge that such monitoring exists may deter some misconduct.

7. What are the biggest concerns or risks?
Garbage in, garbage out: If the underlying data is biased or inaccurate, the AI’s flags will be too, potentially unfairly targeting officers.
Privacy and surveillance: There are concerns about the extent of data collection on officers and the creation of a pervasive surveillance workplace.