
Palantir is known to work closely with the US intelligence community and defense establishment. The company's flagship product, Palantir Gotham, is used by intelligence agencies and the military to analyze intelligence, prevent terrorist attacks, and fight crime.
Against this backdrop, the British regulator's interest in the company's products is not only noteworthy but also raises certain concerns within the business community.
The three-month pilot costs more than £30,000 a week and is aimed at analyzing the huge volumes of data held inside the regulator. More than 42,000 financial firms are in the spotlight, along with potential cases of money laundering, insider trading and fraud, according to Ai News.
Algorithms vs. data chaos
Today's markets generate volumes of information that traditional oversight methods can no longer keep up with. This is where AI platforms come into play: they can analyze unstructured data, from internal reports and customer complaints to phone calls, social media posts and emails, the publication notes.
The FCA is betting that machine learning will let it identify hidden patterns and target its investigative resources more precisely. According to industry experts, much of the valuable information held within regulators has simply never been used before, and that is what opens up room for a technological leap.
Notably, the regulator has opted against testing on synthetic data, choosing to work with real information instead. This approach increases the risks but gives a more accurate picture of how the algorithms perform "in combat conditions".
From finance to defense
The government’s interest in Palantir Technologies goes well beyond the financial sector. In 2025, London partnered with the company to apply AI to defense, from accelerating decision-making to improving the accuracy of military operations.
Palantir, in turn, plans to invest up to £1.5 billion in a European defense hub in the UK capital. The project is expected to create hundreds of jobs and strengthen the country's position as a hub for security technology.
However, embedding private AI solutions in government processes raises questions about data protection. The FCA stresses that Palantir acts solely as a data processor and cannot access the information outside of strict guidelines.
Encryption keys remain with the regulator, all data is stored internally, and commercial use of the information is prohibited. Once the pilot is completed, the company must destroy all data it received.
This arrangement reflects a new reality: states are willing to delegate data analysis to algorithms, but not control over the data itself. And if the FCA's experiment proves successful, it could mark a turning point for financial supervision across Europe.
