Palantir is rewriting security and privacy, creating a massive database shared by federal agencies including ICE and the military
By Alexander J. Schorr
Image: CC via Bing
“Our product is used to kill people.” — Palantir CEO Alex Karp, emphasizing the company’s role in enabling lethal military operations.
December 8, 2025 (San Diego) — Palantir Technologies is building the U.S. Army’s next-generation ground station, TITAN, using artificial intelligence (AI) to process massive amounts of sensor data to improve targeting accuracy. A March 2025 executive order by President Trump mandated data sharing across federal agencies and positioned Palantir as the primary vendor to build the digital infrastructure for this effort, potentially creating a centralized database of Americans’ personal information. Palantir’s software has been central to ICE and Department of Homeland Security (DHS) operations tracking and identifying undocumented immigrants, aligning with Trump’s immigration crackdown agenda.
The company’s AI tools provide “precision targeting” recommendations with, in some cases, little human oversight, deciding who or what gets targeted once a confidence threshold is met.
Critics argue this transforms warfare into a more calculated campaign with the potential for mass civilian casualties, raising concerns over war crimes and a lack of human judgment.
Palantir develops the AI software that facilitates “kill chains” and targeting processes for military and intelligence agencies, and its technology has allegedly been used to guide drone strikes and other military actions, as the Pentagon leans increasingly toward allowing AI systems to target and kill humans on their own.
Although largely unnoticed, technology such as Palantir’s plays a major role in world events, from wars in Iran, Gaza and Ukraine to the incarceration of immigrants and dissident students in the United States. Despite its clandestine ubiquity, lawmakers, technologists, and the media have not comprehensively covered the power and threat of this type of weaponized AI and its consequences. The technology has drawn concern from human rights groups and activists about the ethical implications of using AI in lethal targeting, particularly with regard to potential war crimes in conflict zones, including Russia’s war on Ukraine and the carnage in Gaza.
In essence, critics emphasize that while the technology itself may be a neutral tool, its immense power, in the wrong hands or without sufficient oversight, poses a significant danger to civil liberties and democratic institutions.
Palantir Summarized
Palantir Technologies is an American publicly traded software company that specializes in big data analytics. Founded in 2003 by Peter Thiel, Stephen Cohen, Joe Lonsdale, Alex Karp, and Nathan Gettings, the company began in Silicon Valley and is now headquartered in Denver, Colorado. It has been the target of past protests over its work with U.S. Immigration and Customs Enforcement (ICE).
It has two main platforms: Foundry for commercial clients and Gotham for government and defense agencies. Palantir Foundry helps organizations integrate, manage, and analyze large datasets to improve operations and decision-making; its customers include companies in finance, healthcare, and manufacturing. Palantir Gotham is used by agencies such as the CIA, NSA, FBI, and various military branches. While some of Palantir’s Gotham software has been used by law enforcement, CEO Alex Karp and the company have denied providing predictive policing tools.
Palantir also offers the Artificial Intelligence Platform (AIP). Launched in 2023, AIP integrates language models into customers’ networks to help both government and commercial clients build AI-driven applications. Palantir also makes Apollo, a software delivery system that manages and deploys Palantir’s other platforms across different environments, including cloud and on-premise systems.
The company takes its name from the palantír, a seeing-stone in J.R.R. Tolkien’s “The Lord of the Rings” used to scry information and spy upon others. Palantir Technologies bears a striking similarity: it sells an AI-based platform that allows its users, military and law enforcement agencies among them, to analyze personal data, including social media profiles, personal information, and physical characteristics, which are used to identify and monitor citizens.
Critics are worried about Palantir’s intentions
Palantir has faced controversy over the sensitive nature of its work, including its contracts with government agencies such as Immigration and Customs Enforcement (ICE), its role in surveillance, and concerns about data privacy. The company maintains that it does not collect or sell its customers’ data, and that privacy and security features are built into its products.
Palantir’s work with the United Kingdom’s National Health Service has drawn controversy over patient data sharing, particularly in relation to the COVID-19 response and a newer data platform contract. Palantir has also faced scrutiny over its use in military operations, including allegations that its advanced targeting software has been used in Israel’s campaign in Gaza. Critics such as Amnesty International have raised concerns that the company is failing in its responsibility to protect human rights when its software is used to facilitate serious human rights violations.
An internal U.S. Army memo in late 2025 warned of “fundamental security” problems and vulnerabilities in a joint battlefield communications system made by Palantir and its partners. The memo stated that the system was at “very high risk” from insider threats and external attacks, and that any user could potentially access and misuse sensitive classified information without being tracked.
Palantir and the Army said the issues were identified and addressed through the normal development process.
Palantir has engaged in legal action against the U.S. Army on multiple occasions, successfully arguing that the Army’s procurement processes were unlawfully biased towards traditional defense contractors and excluded commercially available products. These legal battles highlighted a contentious relationship with standard government acquisition methods.
Concerns about Palantir’s connections to the “deep state” and its potential to enable mass surveillance through secretive government contracts have come even from the political right. The U.S. Department of Labor filed an administrative lawsuit against Palantir in 2016, alleging that the company systematically discriminated against Asian job applicants in its hiring practices. The case was later settled.
Palantir’s Gotham software has been used by some police departments and accused of being a tool for “predictive policing,” a practice criticized for potentially perpetuating existing racial and socioeconomic biases in law enforcement. Palantir has disputed these claims. The company has also faced scrutiny for exploring collaborations with countries with poor human rights records, such as Saudi Arabia, a move employees and critics viewed as a departure from its stated focus on Western democratic values.
