Perhaps ever since the Peasants’ War of 1525 – or at the very latest, since the Failed Revolution of 1918/19 – Germany’s ruling elite has made a point of keeping a watchful eye on the more rebellious sections of the population.
The rise of capitalism and new technologies has only furthered the drive toward total surveillance – what we now call surveillance capitalism.
The latest chapter in the saga of spying on citizens and identifying potential delinquents – as defined by the state – comes in the form of spy software from the U.S. company Palantir.
In 2008, the Palantir corporation introduced its tellingly named software: Gotham – a dark, corrupt, and crime-ridden metropolis familiar to Batman fans.
Palantir’s tool is a defense and intelligence platform. It links so-called “alerts” with geospatial analysis (read: we know where you are) and generates algorithmic predictions about your potential to violate laws – as interpreted by police, state institutions, and Palantir’s own algorithms.
Palantir Gotham has already been deployed for something euphemistically referred to as predictive policing. That means identifying you as a suspect, a wrongdoer, a delinquent, an “unruly element,” or simply a criminal before you even know that you’ve become one – at least according to the algorithm.
With Palantir’s assistance, and just to be sure, police can arrest you – or, on a good day, merely harass you – before you’ve done anything that would traditionally be considered illegal.
And all of this might be decided not by a judge, but by an authoritarian ruler like Orban, a religious demagogue like Modi, a neo-fascist like Meloni, or, worse still, a wannabe dictator like Trump. In the U.S., racism in police AI analytics is rarely far behind.
In other countries, additional – largely invented – discriminatory markers can be added to the mix. The idea is simple: blame the victim, and – this is crucial – don’t blame capitalism, mass poverty, unemployment, despair, homelessness, or any of the structural problems of society.
In other words, it’s not the structural violence of capitalism or its built-in brutalities that’s at fault. It’s the individual – to be monitored, profiled, and punished.
Because of its high potential for abuse, Germany’s Social Democratic Interior Minister halted federal use of Palantir in 2023. But in Germany’s federal system, this was largely symbolic. Policing is a state matter – not a federal one.
As of July 2025, police departments in three of Germany’s 16 states – Bavaria, Hesse, and the most populous, North Rhine-Westphalia – had already used Palantir, primarily for data mining.
As early as March 2025, the southern state of Baden-Württemberg signed a €25 million [$29 million] contract – despite lacking any legal framework for the software’s use.
In Baden-Württemberg, the so-called “democratic” state operates like a black box – a multi-million-euro surveillance deal signed behind closed doors. After all, perhaps those being spied on shouldn’t know they’re being spied on.
Most recently, Baden-Württemberg – home to Bosch, Porsche, Mercedes-Benz, Würth, Hugo Boss, Kärcher, etc. – purchased the infamous Gotham software from Palantir.
Perhaps this was also motivated by a desire to protect corporate assets from striking workers as the region shifts toward “restructuring” (read: layoffs).
Once this secretive deal became public, the state government – led since 2011 by the supposedly environmentally conscious Greens under Winfried Kretschmann – began to struggle with a loss of public trust. Meanwhile, control has been handed to a U.S. software company, and attacks on fundamental rights may follow.
Spy cameras may be intrusive – but at least they’re visible. Palantir’s software is not. It’s invisible, opaque, and operates behind closed doors, linking seemingly unrelated data points and building algorithmic profiles of citizens.
The deal with Palantir was marketed with the usual AI hyperbole. It was a classic case of “buy one, get one free”: five years of surveillance software for €25 million. The state took the bait without hesitation.
Baden-Württemberg’s conservative interior minister, Thomas Strobl (CDU), signed off – unilaterally, without a legal foundation, without democratic legitimacy, and without informing his coalition partners, the Greens.
That’s how Baden-Württemberg ended up with Palantir: a surveillance system usually reserved for the military and intelligence agencies, now installed in what’s supposed to be a democratic state.
The Greens, a party that claims to champion deliberative democracy, learned of the deal from a local newspaper. Green party state affairs expert Oliver Hildenbrand appropriately labeled it “The Palantir Disaster.”
So far, the police aren’t allowed to use Palantir – because a legal and democratic basis doesn’t exist. Yet the software is designed to process huge datasets and generate “actionable” insights for law enforcement.
It links data from various sources and visualizes relationships between people, locations, and events – changing how investigators retrieve and cross-compare information. In effect, the algorithm links what was never meant to be linked.
To legalize Palantir’s use, the state is now pushing a new, more invasive Police Act, framing it as a tool for “cross-procedural research and analysis,” code-named VeRA.
Pushing Palantir and VeRA, conservative hardliner Alexander Dobrindt received Germany’s 2025 Big Brother Award – the country’s “Oscar for Surveillance”.
The law is the product of a Green–CDU pact, struck to settle a coalition dispute. According to investigative reporting by SWR TV, the trade was straightforward: 1,500 hectares of additional national parkland would be protected in exchange for legalizing Palantir.
Without the new police law, the €25 million software remains dormant – which is why the conservatives needed this political trade-off.
The Greens, in turn, secured the formation of a parliamentary oversight commission and a “commitment” to limit Palantir’s use in time and scope.
The five-year contract is still running. The new police law will define how Palantir Gotham may be used – all in a state with no serious crime problem.
Measured in crimes per 100,000 residents, the 2024 national average was around 7,000. Baden-Württemberg recorded roughly 5,200 – compared with 14,700 in Berlin, 12,100 in Hamburg, 7,800 in North Rhine-Westphalia, and 8,600 in Saxony-Anhalt.
In short: Baden-Württemberg is one of the safest places in Germany.
Still, the state is moving forward with plans to pass the police law in early 2026. After that, officers will gain full access to Palantir. By mid-2026, after more than a year of secret deals, the system will be ready for use – according to the Ministry of the Interior.
Authorities say testing will take time – no animal testing, of course. Palantir spies on people, not animals. Meanwhile, the software has been in use for years in Hesse, Bavaria, and North Rhine-Westphalia.
A story out of Hesse claims a terrorist attack was prevented thanks to Palantir – a bit like the Y2K myth: millions were spent to stop something that may never have existed.
In this case, seven years ago, Hesse police reportedly analyzed 150 folders’ worth of data, leading to a conviction. Officials claimed it would’ve been impossible without Palantir. But this raises three questions:
- Is this tale actually true – or was it fabricated to justify expensive spy software?
- Would a German official ever lie? (See: former Chancellor Kohl or Minister Friedrich Zimmermann – nicknamed “Old Schwurhand.”)
- Even if true, does the prevention of one terrorist act justify mass surveillance?
Whatever the case, Palantir promises that what once took days can now be done in seconds. But AI has been overpromising and under-delivering for decades.
For the Ministry, the message is clear: Palantir saves lives. But would $29 million for a local hospital save more? Never mind. The CDU hails Palantir as a miracle.
Critics see it as the beginning of automated mass surveillance. Palantir is more than a tool. It is also an ideology.
An ideology that says: dump raw data into a black box until the “right” answer appears. Even with 99.9% accuracy, false positives are inevitable – and real people become collateral damage.
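To make that concrete, here is a back-of-envelope sketch of the base-rate arithmetic – a minimal illustration, assuming a population of roughly 11 million (Baden-Württemberg’s approximate size) and a hypothetical 0.1 percent false-positive rate; neither figure comes from Palantir itself:

```python
# Back-of-envelope base-rate arithmetic.
# All figures are illustrative assumptions, not Palantir specifications.

population = 11_000_000        # roughly the population of Baden-Württemberg
false_positive_rate = 0.001    # i.e. "99.9% accuracy" on people who did nothing wrong

wrongly_flagged = population * false_positive_rate
print(f"Innocent people wrongly flagged: {wrongly_flagged:,.0f}")  # about 11,000
```

Even if only a fraction of residents ever pass through the system, the arithmetic scales down – but it never reaches zero.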
Palantir doesn’t just analyze suspects. It processes victims, witnesses – anyone. Even a bike theft report can land you in the database. From PayPal to the NSA, Palantir is everywhere.
And it was co-founded by a man who opposes democracy itself.
Peter Thiel – raised in a Nazi-glorifying South African community – is an extremist neoliberal and right-wing ideologue. A proud supporter of Trump, Thiel once said:
“I no longer believe that freedom and democracy are compatible.”
He’s also expressed disdain for women’s suffrage. Palantir itself began in 2003, building on PayPal’s fraud-detection technology. The CIA invested early via its venture arm, In-Q-Tel, and the company became a tool of U.S. intelligence. Today, Gotham is used worldwide – but its algorithm remains a black box.
No one outside Palantir knows exactly how data is linked or suspects are flagged. Even parliaments and data protection officers are denied access.
Back in Baden-Württemberg, the state claims no data leaks will occur – a fanciful promise. Palantir will be used “offline” – another assurance. The Fraunhofer Institute claims it found no backdoors. But what about future updates?
Palantir says the software has no hidden access points. Who wouldn’t trust a surveillance company to be honest? The truth is: Palantir is a mystery. Trusting its assurances – or Thiel’s – is naïve.
The installed version undergoes “black box testing” – but nobody knows what the next update will bring. A lot of promises, few guarantees.
Palantir is a system trained on the biases of its white male founders – the “white dude problem” in AI. It can easily produce politically skewed results.
And every euro spent on Palantir empowers its founder – an enemy of democracy.
Palantir’s dominance is also structural. European firms never built equivalents – because what Palantir does has largely been illegal here.
Now, European states risk falling into the lock-in trap – once you invest in a system, it’s hard to get out. Baden-Württemberg’s €25 million contract ensures dependency.
And so, both data – and people – become trapped inside Thiel’s spy software. Willingly or unwillingly. Knowingly or unknowingly.