Out Loud with Ahmed Eldin

Palantir: Financed by Epstein, Fueled by Thiel, Killing in Gaza and Spying on You

Palantir took Epstein’s money, Thiel’s backing, and Trump’s blessing to build an AI war machine. If Gaza is the testing ground, America’s next. Are we ready to fight back?

Ahmed Eldin
Jun 06, 2025

Trump, Musk and now-Vice President Vance attend the 125th Army-Navy football game on Dec. 14, 2024, in Landover, Md.

Watching Musk and Trump's cozy relationship collapse into a public meltdown, I can't help but notice who's quietly smiling: Trump's bedfellow, Palantir, a data behemoth born in CIA coffers, now peddling itself to Israel's war machine and carrying the taint of Epstein's filthy millions.

Palantir, which helped draw up "kill lists" for Gaza (lists responsible for the murder of tens of thousands of Palestinians, the vast majority of them women and children), was bankrolled in part by convicted sex offender Jeffrey Epstein. Today, that same code-driven engine is giving Trump the ability to assemble a mega AI database of every single American's secrets.

Photograph of a full-page ad purchased by Palantir in The New York Times.

Built to extract insights from torrents of data with machine learning and AI, Palantir is optimizing manufacturing and supply chains but also optimizing genocide and kill chains. The big data giant was born out of a desire to do what was illegal. In 2003, the CIA couldn’t legally spy the way it wanted to. So it outsourced the job. The CIA became Palantir’s first and only investor for years, quietly bankrolling a private surveillance empire it couldn’t build itself.

A decade later, Edward Snowden exposed the truth: Palantir had been helping the U.S. government conduct mass, illegal spying—not on foreign threats, but on its own citizens. Now, this same company—with roots in secrecy, war, and surveillance—is partnering with Israel, powering a genocidal war in Gaza. What started as illegal surveillance has evolved into algorithmic warfare. And Palestine is its most brutal testing ground.

Today Palantir's algorithms are built to kill Palestinian civilians in Gaza. These aren't random horrors of war—they are the direct result of lines of code designed to label Palestinian lives as "targets," reducing human beings to data points destined for obliteration. Palantir doesn't hide what it is—it brags about it. Its CEO, Alex Karp, proudly calls the company's surveillance infrastructure a "digital kill chain."


Before Gaza’s skies were lit with bomb blasts, Palantir’s story was already woven into the darkest corners of power. In 2005, Jeffrey Epstein funneled $40 million into Peter Thiel’s pocket, seeding the very startup that would become Palantir—and Thiel has never fully disavowed that relationship, even as he insisted Epstein “had absolutely nothing to do with Israel.”

Courtesy of Business Insider.

Thiel, who donated over $1.25 million to Donald Trump's 2016 campaign and $15 million to J.D. Vance's Senate bid, stands at the nexus of a new authoritarian regime, ensuring that the company's trajectory aligns with an "America First" vision that prioritizes law-and-order enforcement over constitutional safeguards. In March 2025, Palantir secured a $30 million contract with ICE to "monitor undocumented immigrants," a deal rights advocates warn will expand warrantless surveillance and racial profiling across border communities.

In 2020, both the U.S. and U.K. turned to Palantir's platform to coordinate vaccine rollouts—proof that once you build the data infrastructure, it can be repurposed for almost anything. Alex Karp's unapologetic cheerleading for Western militarism, his snide takes on "wokeness," and his dismissals of Silicon Valley distractions have charmed a growing chorus of hawks and technocrats.

Meanwhile, the Pentagon has been massively scaling up its investment in artificial intelligence. In a move that underscores just how central AI is becoming to U.S. military strategy, the Defense Department quietly expanded its contract with Palantir, boosting the ceiling for the company’s Maven Smart System to nearly $1.3 billion through 2029.

Originally launched in 2017, Project Maven was pitched as a way to bring cutting-edge tech into military operations. At its core, the system leverages AI to comb through vast streams of surveillance data — from satellites, drones, and other sensors — rapidly identifying and tracking objects of interest in real time.

Palantir snapped up that contract only after Google, panicked by employee walkouts, washed its hands of the project back in 2018. Today, Maven is already driving surveillance and targeting over Ukraine and accelerating U.S. strikes on Houthi rocket launchers around the Red Sea. Tasks that demanded 2,000 analysts in 2003 now require a skeleton crew of just 20—because AI does in seconds what once took many people many days to do.

Screenshot from Forbes Israel article.

As Palestinians continue bleeding under AI-driven strikes, the shadow of that same technology looms over the United States: the algorithms that tag a Palestinian child as an "enemy combatant" can just as readily mark Black Americans, immigrants, or dissidents here at home as "threats." Civil rights lawyers have long warned of creeping state surveillance; today, Palantir stands poised to weaponize every single private datum: your voting history, your social media posts, even your grocery receipts … transforming America into a testing ground for the next wave of algorithmic repression.

Palantir’s Role in Gaza

In January 2024, Palantir signed a multi-hundred-million-dollar “strategic partnership” with Israel’s Ministry of Defense to “harness Palantir’s advanced technology in support of war-related missions,” a deal that human rights experts immediately warned would turn AI into a kill-chain enabler. Two AI tools lie at the core of this collaboration, each transforming Gaza into a testing ground for automated violence:

  1. Lavender: An “AI-powered database” that sifts through phone metadata, social media activity, and movement patterns to assign Palestinians a “threat score” from 1 to 100. Investigations by +972 Magazine and Local Call revealed that, early in the Gaza offensive, over 37,000 Palestinians were flagged as potential “militants,” despite Lavender operating with at least a 10% margin of error. Under standard IDF protocols, an algorithmic “high score” is often enough to authorize deadly strikes. As one intelligence officer admitted, “The machine did it coldly,” leaving children and civilians to pay the price—and shielding individual soldiers from prosecution.

  2. Habsora (“The Gospel”): Developed in 2021 to solve Israel’s “target scarcity” problem, Habsora automatically identifies up to 100 new targets per day by ingesting satellite imagery, intercepted communications, and other surveillance layers. While Israeli spokespeople claim human analysts vet each recommendation, reports by TIME Magazine show that analysts often spend mere seconds confirming a target’s gender—and then approve strikes despite the system’s 10% error rate, which can result in families or entire neighborhoods being bombed without ever encountering an actual combatant.

These systems have generated tens of thousands of “kill lists,” feeding Israeli warplanes with coordinates of homes, schools, and mosques to be bombed while soldiers hide behind “machine error” as legal cover.

The Israeli military is using AI technology developed by Palantir to choose who lives and who dies in Gaza—part of what can only be described as a mass destruction campaign. As former HUD official and investment banker Catherine Austin Fitts explains, the system—called Lavender—was designed not just to identify targets, but to shield the Israeli chain of command from legal responsibility for war crimes.

"After October 7, once the Israeli military entered Gaza, they were using Lavender to bomb." The idea is simple, and chilling: if software makes the decision, no human can be held liable for violating international law.

She explained how IDF commanders "outsource moral responsibility" to the software, turning mass killing into an assembly line where Palestinian blood becomes the lubricant.

“They structured the software to fully protect individual officers from legal accountability,” Fitts continues. “And, presumably, from moral accountability, too. It makes the process of genociding children—if not easier—at least more palatable.”

In a striking moment on May 8, 2024, at a debate in Cambridge, Peter Thiel appeared visibly rattled when confronted about Palantir’s role in enabling this kind of algorithmic genocide. “Someone in Cambridge asked Thiel about Lavender,” Fitts notes, “and he totally melted down—because maybe he’s legally insulated, but he’s starting to realize: in the court of public opinion, the reckoning may yet come.”

"I believe that, broadly, the IDF gets to decide what it wants to do and that they're broadly in the right,” Thiel said plainly.

Rising Resistance

Despite the terror these systems have unleashed, Palestinians and their allies are refusing to be silent. At the Hill and Valley Forum in Washington, D.C., earlier this spring, a Palestinian-American activist named Sumer Mobarak stormed the stage during Palantir CEO Alex Karp’s keynote, shouting, “You are getting wealthy off killing Palestinians. You are killing my family in Palestine.”

Videos of her courageous interruption quickly went viral, drawing hundreds of thousands of views on TikTok and sparking coverage across major outlets. Karp stumbled to respond, calling her "a product of Hamas" and insisting his technology only kills "terrorists," while the audience's stunned silence spoke volumes.

Sumer’s act of defiance reveals that even in the belly of Washington’s power, people are waking up to the stakes. Yet, as powerful as her gesture was, it also underscored a grim reality: the systems enabling genocide and mass surveillance have already taken hold. Our challenge is whether we can still dismantle them before they become irreversible.

Palantir’s Database Of Americans

Palantir’s lethal footprint in Gaza foreshadows the grave risks its technology poses here at home. Founded in 2004 with a $2 million investment from the CIA’s venture arm In-Q-Tel, Palantir’s Gotham platform was originally built to “fuse” intelligence data for clandestine operations. Over the past decade, Gotham has quietly infiltrated local police fusion centers, ICE’s Investigative Case Management system, and even the Secret Service—each integration chipping away at due process and privacy.
