
The Destruction in Gaza Is What the Future of AI Warfare Looks Like


American tech companies have given highly consequential support to Israel’s campaign.
In 2021, Israel used “the Gospel” for the first time. That was the codename for an AI tool deployed in the 11-day war in Gaza, which the IDF has since called the first artificial intelligence war. The end of that war didn’t resolve the conflict between Israel and Palestine, but it was a sign of things to come.
The Gospel rapidly spews out a mounting list of potential buildings to target in military strikes by reviewing data from surveillance, satellite imagery, and social networks. That was four years ago, and the field of artificial intelligence has since experienced one of the most rapid periods of advancement in the history of technology.
Israel’s latest offensive on Gaza, which marked two years on Tuesday, has been called an “AI human laboratory” where the weapons of the future are tested on live subjects.
Over the last two years, the conflict has claimed the lives of more than 67,000 Palestinians, upwards of 20,000 of whom were children. As of March 2025, more than 1,200 families were completely wiped out, according to a Reuters examination. Since October 2024, the number of casualties provided by the Palestinian Ministry of Health has only included identified bodies, so the real death toll is likely even higher.
Israel’s actions in Gaza amount to a genocide, a UN Commission concluded last month.
Hamas and Israel agreed to the first phase of a ceasefire deal announced on Wednesday, but Israeli strikes on Gaza were still continuing as of Thursday morning, according to Reuters. The agreed-upon plan involves the release of Israeli hostages by Hamas in exchange for 1,950 Palestinians detained by Israel, along with the entry of long-awaited aid convoys. But it does not involve the creation of a Palestinian state, which Israel strictly opposes. On Friday afternoon, Israel said that the ceasefire agreement was in effect, and President Trump has said there will be a hostage release next week. There have been at least three ceasefire agreements since October 7, 2023.
Aiding Israel’s destruction in Gaza is an unprecedented reliance on artificial intelligence that is, at least partially, supplied by American tech giants. Israel’s use of AI in surveillance and wartime decisions has been documented and criticized time and again by various media and advocacy organizations over the years.
“AI systems, and generative AI models in particular, are notoriously flawed with high error rates for any application that requires precision, accuracy, and safety-criticality,” Dr. Heidy Khlaaf, chief AI scientist at the AI Now Institute, told Gizmodo. “AI outputs are not facts; they’re predictions. The stakes are higher in the case of military activity, as you’re now dealing with lethal targeting that impacts the life and death of individuals.”

AI that generates kill lists
Although Israel has not fully disclosed its intelligence software and has denied some claims about its use of AI, numerous media and non-profit investigations paint a different picture.
Also used in Israel’s 2021 campaign were two other programs: “Alchemist,” which sends real-time alerts about “suspicious movement,” and “Depth of Wisdom,” which maps out Gaza’s tunnel network. Both are reportedly in use this time around as well.
On top of the three programs Israel has previously acknowledged using, the IDF also uses Lavender, an AI system that essentially generates a kill list of Palestinians. The system calculates a percentage score for how likely a Palestinian is to be a member of a militant group; if the score is high, the person becomes the target of missile attacks.
According to a report from Israeli magazine +972, the army “almost completely relied” on the system at least in the early weeks of the war, with full knowledge of the fact that it misidentified civilians as terrorists.
The IDF required officers to approve the recommendations made by the AI systems, but according to +972, that approval process amounted to little more than checking whether the target was male.
Many other AI systems in use by the IDF remain in the shadows. One of the few that has come to light is “Where’s Daddy?,” which was built to strike targets inside their family homes, according to +972.
“The IDF bombed [Hamas operatives] in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations,” an anonymous Israeli intelligence officer told +972.

AI in surveillance
The Israeli army also uses AI in its mass surveillance efforts.
