Israel is deploying new and sophisticated artificial intelligence technologies at a large scale in its offensive in Gaza. The civilian death toll has mounted, and regional human rights groups are asking if Israel’s AI targeting systems have enough guardrails.
This push for accountability could force Washington to confront some uncomfortable questions about the extent to which the U.S. is letting its ally off the hook for its use of AI-powered warfare. In its strikes in Gaza, Israel’s military has relied on an AI-enabled system called the Gospel to help determine targets, which have included schools, aid organization offices, places of worship, and medical facilities. Hamas officials estimate more than 30,000 Palestinians have been killed in the conflict, including many women and children.
It’s unclear whether any of the civilian casualties in Gaza are a direct result of Israel’s use of AI targeting. But activists in the region are demanding more transparency, pointing to the errors AI systems can make and arguing that the speed of AI-driven targeting is what has allowed Israel to bombard large swaths of Gaza. Palestinian digital rights group 7amleh argued in a recent position paper that the use of automated weapons in war “poses the most nefarious threat to Palestinians.”
And Israel’s oldest and largest human rights organization, the Association for Civil Rights in Israel, submitted a Freedom of Information request to the Israel Defense Forces’ legal division in December demanding more transparency on automated targeting. The Gospel system, about which the IDF has disclosed few details, uses machine learning to rapidly parse vast amounts of data and generate potential attack targets. The IDF declined to comment on its use of AI-guided bombs in Gaza or any other use of AI in the conflict.
An IDF spokesperson said in a public statement in February that while the Gospel is used to identify potential targets, the final decision to strike is always made by a human being and approved by at least one other person in the chain of command. Still, the IDF noted in November that in addition to increasing accuracy, the Gospel system “allows the use of automatic tools to produce targets at a fast pace.” That same statement said that Israel had hit more than 12,000 targets in the first 27 days of combat.
The push for more answers about Israel’s AI warfare has the potential to reverberate in the U.S., fueling demands that Washington police its allies’ use of the technology abroad and creating a tricky policy area for U.S. lawmakers weighing how to deploy AI on future battlefields. Some who track AI warfare policy in the U.S. argue Israel is distorting the technology’s purpose — using it to expand target lists rather than protect civilians. And, they say, the U.S. should be calling out the IDF for that breach of ethics.
“It’s been clear that Israel has been using AI to have what they call ‘power targets’ so they are using it intentionally — as opposed to what it’s supposed to be, which is helping with precision — to target civilians,” said Nancy Okail, president of the Center for International Policy, a progressive foreign policy think tank. She said the IDF appears to be allowing for a broad definition of these “power targets,” which the military’s intelligence branch defines as “targets with security or perception significance to Hamas or the Palestinian Islamic Jihad.”
“With over 30,000 casualties in Gaza, it’s hard to tell if the IDF is using high-tech AI to identify targets or throwing darts at a map,” said Shaan Shaikh, deputy director and fellow with the Missile Defense Project at the Center for Strategic and International Studies. “The U.S. should use its untapped leverage to shape these operations, but so far, the Biden administration has been unwilling to do so.”
Yet Israel’s AI use in its military offensive hasn’t attracted much attention in Washington’s discussions of the Israel-Hamas conflict. Human rights groups stateside say they’re more focused on Israel’s decision to target civilian infrastructure than on the technology used to do so.
As POLITICO has reported, aid organizations and medical facilities have been struck even after their GPS coordinates were provided to Israeli authorities. And Israel has said it considers civilian infrastructure like hospitals and schools a fair target because Hamas has hidden fighters and weapons in these buildings. U.S. officials have also largely avoided bringing up Israel’s AI use.
“I’ve been in a number of meetings with folks who are very human rights minded, in meetings with the administration, and I haven’t heard AI come up specifically,” Khaled Elgindy, the program director for Palestine and Palestinian-Israeli Affairs at the Middle East Institute, said. Asked about the scale of Israel’s use of AI targeting in war, White House Deputy National Security Adviser for Cyber and Emerging Technology Anne Neuberger was quick to pivot to the dangers of the technology in warfare writ large.
“We’re really concerned about AI systems,” Neuberger said in an interview. “That’s why the president worked so quickly to get out his executive order on AI and go from there. Absolutely.” The executive order she referenced — issued in October — provides guidelines for artificial intelligence use by the U.S. military and intel agencies, while also pushing to counter the weaponization of AI by foreign adversaries.
Another reason the technology may not be getting as much attention in Washington: the opaque nature of Israel’s military operations. “Nobody has any insight, including, I would say, U.S. policymakers, on how Israel is conducting this war,” said Sarah Yager, the Washington director at Human Rights Watch. “We see the outcome in the civilian casualties and the destroyed buildings, but in terms of the technology and the proportionality calculus, we just have no idea. It’s like a black box.”
But there are signs that Israel may not be exercising oversight at the level the U.S. would want. Israel has not signed on to a U.S.-backed international declaration pushing for the responsible use of AI in war. Among the more than 50 signatories are the United Kingdom, Germany and Ukraine — the other U.S. ally in an active war.

