IAF - Iron Swords - Artificial Intelligence
Israel's use of sophisticated AI technology in its Gaza campaign marked new territory in modern warfare, adding to the legal and ethical scrutiny and reshaping the dynamics between military personnel and automated systems. "The machine did it coldly. And that made it easier," said one intelligence officer. "I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time," said another soldier. "Because of the system, the targets never end," a third added.
According to experts, if Israel has indeed used unguided bombs to destroy the residences of numerous Palestinians over the mere suspicion of ties to resistance groups in Gaza, aided by AI technology, this could help explain the significantly elevated civilian death toll during the war.
The Gospel system enables the IDF to run a "mass assassination factory" that focuses on quantity, not quality. In the first days of the Israeli attack on the Gaza Strip, the commander of the Israeli Air Force spoke of continuous air strikes "around the clock." He said that his forces were striking only military targets, but added: "We do not perform surgical operations."
According to The Guardian, relatively little attention had been paid to the methods used by the Israeli army to select targets in Gaza, or to the role that artificial intelligence played in the bombing campaign. As the IAF resumed its offensive after a seven-day ceasefire, concerns grew about the targeting methods the Israeli army was following in its war in Gaza.
The IAF had previously made bold but unverifiable claims about harnessing new technology. After the 11-day war in Gaza in May 2021, officials said that the IAF had fought its "first artificial intelligence war" using machine learning and advanced computing. The war between Israel and Hamas provided an unprecedented opportunity for the Israeli army to use such tools in a much broader theater of operations and, in particular, to deploy an artificial intelligence target-generation platform called "The Gospel" (Habsora).
The Guardian revealed new details about this technology and its central role in the war on Gaza, drawing on interviews with intelligence sources and little-noticed statements made by the Israeli army and retired officials. The slowly emerging picture of how the Israeli military is harnessing artificial intelligence comes against a backdrop of growing concern about the risks to civilians as advanced militaries around the world expand the use of complex robotic systems on the battlefield.
A former White House security official said that the war between Israel and Hamas would be "an important moment if the Israeli army uses artificial intelligence to make targeting choices with life-or-death consequences." In early November, the IDF confirmed that "more than 12,000" targets in Gaza had been identified by its target division.
The activities of the division, formed in 2019 within the Israeli Army Intelligence Directorate, are classified. However, a short statement on the Israeli military's website claimed that it was using an artificial intelligence-based system called Habsora (The Gospel) in the war against Hamas to "produce targets at a rapid pace." The IDF said that "through the rapid and automatic extraction of intelligence," the platform produced targeting recommendations for its researchers "with the goal of a complete match between the machine's recommendation and the identification carried out by a person."
Multiple sources familiar with the Israeli army's targeting operations confirmed that the system issues automated recommendations to attack targets such as the homes of Palestinian resistance members. In recent years, the target division has helped the IDF build a database of what sources said was between 30,000 and 40,000 suspects, adding that systems such as The Gospel played a crucial role in building lists of individuals authorized to be assassinated.
In an interview published before the war, former IDF chief of staff Aviv Kochavi said, "This machine produces huge amounts of data more effectively than any human and translates it into targets for attack." According to Kochavi, "once this machine was activated" in Israel's 11-day war with Hamas in May 2021, it generated 100 targets per day. "To put that in perspective, in the past we produced 50 targets in Gaza per year. Now this machine produces twice that number every day."
The Israeli army's use of advanced artificial intelligence programs in its war against Hamas has raised controversy over the accuracy of its data and its ability to identify targets among the leaders and fighters of the group, which is designated a terrorist organization by the United States and other countries, The Guardian reported.
The newspaper explained that the Israeli army uses an artificial intelligence program called "The Gospel," noting that it is fed with data to select "targets" to be bombed in the Gaza Strip, "which include armed groups and their leaders." It reported that a secret military intelligence unit, aided by artificial intelligence, is playing an important role in Israel's response to the Hamas attack of October 7 in the south of the country, which, according to the Israeli authorities, killed 1,200 people, most of them civilians, including women and children.
In a brief Israeli army statement about that unit, a senior official said that its members are carrying out "precise attacks on infrastructure linked to Hamas," causing significant damage to the enemy while, in the army's telling, limiting harm to civilians.
The accuracy of the strikes recommended by the "artificial intelligence target bank" has been emphasized in multiple reports in the Israeli media, with the daily Yedioth Ahronoth reporting that the unit "is ensuring as much as possible that no harm occurs to civilians who are not participating in combat." A former senior Israeli military source told The Guardian that the supervisors of the artificial intelligence software use a "very accurate" measurement of the rate at which civilians evacuate a building shortly before a strike.
The chief of staff of the Israeli Air Force, Omer Tishler, said in an interview with The Jerusalem Post last October that artificial intelligence targeting capabilities "enabled the army to identify new target banks more quickly." Tishler stressed that the Israeli army "does not target civilians," even as the Ministry of Health in Gaza said that more than 15,000 people had been killed, most of them civilians, including women and children.
An Israeli official told Bloomberg in mid-June that two models were being used to build an algorithm-driven data set on specific targets, in order to calculate likely ammunition loads, determine priorities, assign thousands of targets to aircraft and drones, and propose a schedule for raids. The official explained that both systems are controlled by human operators who examine and approve targets and air strike plans.
According to figures published by the Israeli army in November, during the first 35 days of the war Israel attacked 15,000 targets in Gaza, a much higher number than in previous military operations in the densely populated coastal territory. By comparison, in the 2014 war, which lasted 51 days, reports indicated that the Israeli army struck between 5,000 and 6,000 targets. Despite increasing pressure on the Israeli government to halt its attack on Gaza, Israeli Prime Minister Benjamin Netanyahu and his team refused to offer any clear timetable.
Multiple sources told the British newspaper The Guardian, the Israeli magazine +972, and the Hebrew-language outlet Local Call that when a raid on the private home of an individual identified as a Hamas or Islamic Jihad activist is authorized, those supervising the artificial intelligence programs know in advance the number of civilians expected to be killed.
These sources explained that each target has a file containing a collateral damage score, which states the number of civilians likely to be killed in the raid. One source, who worked until 2021 planning strikes for the Israeli army, told The Guardian that "the decision to strike is made by the unit commander on duty," noting that there were times "when there was doubt about the target... and we killed what I believe to be a disproportionate number of civilians."
On the other hand, an Israeli military spokesman said: “In response to Hamas’ brutal attacks, the IDF is working to dismantle Hamas’ military and administrative capabilities.” He continued: "In stark contrast to Hamas' deliberate attacks on Israeli men, women and children, the IDF follows international law and takes feasible precautions to mitigate harm to civilians."
Sources familiar with how artificial intelligence-based systems are integrated into Israeli army operations said that such tools "contributed significantly to accelerating the target identification process." A source who previously worked in that unit told the Israeli outlets: "We prepare a list of targets automatically, and we work according to it."
In the wake of the October 7 attack on southern Israel by Hamas, which is designated a terrorist organization by the United States, Israeli Defense Minister Yoav Gallant hung a poster on the wall of his Tel Aviv office bearing pictures of hundreds of leaders of the Palestinian armed group arranged in a pyramid. A source who previously worked in the target division added: "It's like working in a factory. We work quickly and there is no time to delve deeply into the target. The view is that we are judged according to the number of targets we manage to generate."
For some experts researching artificial intelligence and international humanitarian law, the expanding use of AI raises a number of concerns. Marta Bo, a researcher at the Stockholm International Peace Research Institute, said that even when humans supervise such programs, there is a risk they develop "automation bias," meaning an over-reliance on systems that come to have too much influence over complex human decisions.
For his part, researcher Richard Moyes, who heads Article 36, a group that campaigns to reduce the harm caused by weapons, said that when relying on artificial intelligence tools such as The Gospel, humans are handed a list of targets generated by a computer: "They don't necessarily know how the list was created and they don't have the ability to verify and question targeting recommendations appropriately."
He added: "There is a danger that when humans rely on these systems, they become cogs in a mechanical process, and lose the ability to consider the risks of harm to civilians in a meaningful way."
According to a joint report by the Israeli magazine +972 and the Israeli website Local Call, which drew on the testimonies of seven current and former members of Israeli intelligence, in addition to Palestinian testimonies, data, and information, there are several reasons behind this frightening number of victims, the most prominent of which is the unleashing of artificial intelligence systems in the selection of targets.
From the first moments after the October 7 attack, Israeli decision-makers explicitly announced that the response would be of a completely different magnitude from previous military operations in Gaza, with the stated goal of eliminating Hamas completely. "The focus is on damage, not accuracy," IDF spokesman Daniel Hagari said on October 9, and the army quickly translated these statements into action. According to the sources who spoke to +972 and Local Call, the targets set by the Israeli army command and bombed by aircraft in Gaza can be divided into four categories:
- The first category, "tactical targets," includes military targets such as armed operatives, weapons depots, missile launchers, command centers, observation points, and the like.
- The second category is "underground targets": the tunnels used by Hamas. The air strikes that targeted them led to the collapse of the homes located above or near the tunnels.
- The third category, "power targets," includes high-rise buildings and residential towers in the heart of cities, as well as public buildings such as universities, banks, and government offices. The idea behind striking such targets, according to three intelligence sources involved in planning or carrying out the strikes, is "to generate a state of discontent and pressure on Hamas from civilians."
- The fourth category is "family homes" or "activists' homes." This type was a major reason for the dangerously high death toll: according to sources, in order to destroy the home of a single Hamas member inside a multi-story residential building, Israeli forces destroyed the entire building.
An IDF spokesman stated on 11 October 2023 that during the first five days of fighting, half of the targets bombed (1,329 out of a total of 2,687) were considered power targets. The report quoted a number of sources within the Israeli army as saying that the army has files on most of the targets bombed in Gaza, including the number of civilians inhabiting each building and advance estimates of the number of victims the bombing will cause. Sources recounted that on one occasion the Israeli military leadership approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a senior Hamas military commander.
In recent years, Israel has moved to strengthen its military arsenal with artificial intelligence systems. As specialists know, such systems rest on several foundations, the most important of which is data. Israel possesses a vast amount of information thanks to spyware, satellites, and its control over mobile phone networks, in addition to the facial recognition systems and biometric data collected at Israeli crossings. The other essential ingredient is the precise and sophisticated algorithms that translate this data into decisions.
The accuracy of the algorithms' output depends on the constraints set by their programmers, and in the current war (2023) a constraint that had previously been in place was removed: a cap of five on the number of civilian victims permitted in any single targeting. Israel used an artificial intelligence system called Habsora (literally, "The Gospel"), and this casualty-cap condition was removed from its algorithms.
In other words, the system's mission is to generate targets after processing the data and deliver them to the air force or drone operators, regardless of the expected number of casualties. Another condition was also removed from the algorithms: the rank of the targeted person. Previously, Israel targeted leaders, but the Gospel system does not differentiate between the house of a leader and that of an ordinary fighter; given that the Qassam Brigades number between 30,000 and 40,000 members, one can imagine the number of targets expected to be bombed and the scale of the civilian casualties resulting from that bombing.
According to interviews conducted by the British newspaper The Guardian, the Gospel system generates a bank of 100 targets per day, whereas Israeli intelligence previously produced about 50 targets per year.
One former intelligence officer explained that the Gospel system enables the military to run a "mass assassination factory," where "the emphasis is on quantity, not quality."
Artificial intelligence companies in Israel have made huge profits since their inception, using Palestinians in Gaza and the West Bank as a testing ground for the programs they design. One example: graduates of the Israeli military intelligence sector designed a mapping application called Waze, which was tested on residents of the West Bank and which Google acquired in 2013 for $1.3 billion. The Israeli army's use of artificial intelligence systems began in 2019, when it established a new center aimed at using artificial intelligence to speed up the process of generating targets.
Former Israeli army chief of staff Aviv Kochavi said in an in-depth interview with the Israeli website Ynet at the time: "The new target-generation department is a unit that includes hundreds of officers and soldiers, and relies on artificial intelligence capabilities." The first product of that unit was the Fire Factory program, whose mission was to generate targets, determine appropriate quantities of ammunition, and propose a schedule for air strikes, with a requirement that collateral damage in each operation not exceed five civilians.
Fire Factory's capabilities were tested in 2021, when, according to Bloomberg, the Israeli military described its 11-day bombing of Gaza as the world's first "artificial intelligence war" and noted its use of the program to identify targets and deploy drones. Despite Israel's boasting about the accuracy of these systems, what has been revealed in the current fighting bears out what experts had been warning about: the danger of integrating artificial intelligence systems into weapons and using them in battle.
Catherine Connolly, a researcher at the Stop Killer Robots group, said: "Any change in the software could make these systems and weapons not merely semi-autonomous, but completely independent in decision-making." Similarly, Antony Loewenstein, an independent journalist and author of "The Palestine Laboratory," who lived in East Jerusalem from 2016 to 2020, said: "The claim about artificial intelligence was that it would target people more successfully, but the targeting we see is not accurate at all; large numbers of civilians are dying. A third of the homes in Gaza have been destroyed. This is not precise targeting."
Haaretz reported on 03 April 2024 that the Israeli army's operation in Gaza during the war was assisted by an artificial intelligence-based data system known as Lavender, with the help of which it incriminated 37,000 men as potential operatives of Hamas and Islamic Jihad. Haaretz reported that "the army defined quotas for killing uninvolved civilians before certain assassinations were carried out ... in the first weeks of the war, the army authorized attacks on junior Hamas operatives that might have killed between 15 and 20 civilians besides themselves. According to the sources, attacks on such targets were often carried out with dumb bombs that destroyed entire houses and killed their occupants."
Yuval Abraham reported: "the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called "Where's Daddy?" also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family's residences.... whereas The Gospel marks buildings and structures that the army claims militants operate from, Lavender marks people — and puts them on a kill list....
"Lavender’s calculations were considered accurate only 90 percent of the time; in other words, it was known in advance that 10 percent of the human targets slated for assassination were not members of the Hamas military wing at all....
"But the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is. There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defense personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.... a major reason for the unprecedented death toll from Israel’s current bombardment is the fact that the army has systematically attacked targets in their private homes, alongside their families — in part because it was easier from an intelligence standpoint to mark family houses using automated systems....
"One source said that when attacking junior operatives, including those marked by AI systems like Lavender, the number of civilians they were allowed to kill alongside each target was fixed during the initial weeks of the war at up to 20. Another source claimed the fixed number was up to 15. These “collateral damage degrees,” as the military calls them, were applied broadly to all suspected junior militants..."
The "Lavender" system was developed by the elite intelligence division of the Israeli Defense Forces, Unit 8200. Attacks carried out after identifying the targets by "Lavender" used unguided munitions known as "dumb bombs", which led to the destruction of entire homes and the killing of all their residents. “You don't want to waste expensive bombs on unimportant people, they are very expensive for the country, and there is a shortage of them,” one intelligence officer said.
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one said. “It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
"No one thought about what to do afterward, when the war is over, or how it will be possible to live in Gaza,” one said. “There was a dissonance: on the one hand, people here were frustrated that we were not attacking enough. On the other hand, you see at the end of the day that another thousand Gazans have died, most of them civilians,” one Israeli intelligence officer who used Lavender asserted.
The Guardian emphasized that the use of artificial intelligence techniques raises a range of legal and ethical questions and is transforming the relationship between military personnel and machines. International humanitarian law experts who spoke to the newspaper expressed alarm at accounts of the Israeli army accepting collateral damage ratios of up to 20 civilians for lower-ranking militants, saying that armies must assess the proportionality of each strike individually. For its part, an Israeli army statement said that artificial intelligence systems are not used to identify "terrorists," stressing that they are "mere tools for analysts in the process of identifying targets," and that its operations are carried out in accordance with the rules of proportionality under international law.
In October 2023, The New York Times reported on a system operated from a special base in southern Israel that collected information from mobile phones in the Gaza Strip and provided the military with a live estimate of the number of Palestinians fleeing the northern Gaza Strip southward.
The Secretary-General of the United Nations, António Guterres, expressed grave concern on 05 April 2024 about reports of the Israeli army's use of artificial intelligence in the Gaza war. Guterres said that "technology should be used for good, not to kill." "I am deeply concerned by reports that the Israeli army's bombing campaign includes artificial intelligence as a target identification tool, particularly in densely populated residential areas, which has led to a high level of civilian casualties," Guterres told reporters.
A resolution adopted by the United Nations Human Rights Council condemned "Israel's use of explosive weapons with wide-area effects in the populated areas of Gaza" and the use of artificial intelligence "to assist in the military decision-making process," considering that this "may contribute to international crimes." The Secretary-General noted that the war in Gaza has been one of the deadliest conflicts for civilians and relief workers, adding that the Israeli military campaign over the past six months has brought death and destruction to the Palestinians of Gaza.