Blind Technology

Israel's AI Deployment in the Gaza and Lebanon Wars

October 7, 2024


Artificial intelligence (AI) has emerged as a powerful tool in numerous military conflicts worldwide, with applications ranging from analyzing intelligence data and launching drone strikes on military targets to conducting reconnaissance using military robots. Israel, however, has employed AI in a unique manner in its war against Hamas and Hezbollah, diverging from traditional approaches.

Traditionally, human operators would identify military and human targets, after which AI would take over to execute the mission: a drone, for example, might assassinate a target identified by military leadership. However, when the scope of military targeting expands significantly, humans may struggle to manually identify, verify, and review all these targets, especially under time pressure that does not favor the attacking force.

Given Israel's approach of placing nearly all Hamas and Hezbollah members in its crosshairs, the country has adopted a strategy that combines human decision-making with AI capabilities. Under this division of labor, AI identifies, reviews, and tracks targets, while human crews execute the strikes from aircraft. This shift in methodology allows for a faster and more comprehensive targeting process.

To implement this strategy effectively, Israel has employed several AI-powered systems: "The Gospel," "Lavender," and "Where's Daddy?" These sophisticated systems are designed to identify targets, leaving the execution of strikes to human operators. Each system serves a specific purpose in the targeting process. The Gospel system is utilized to pinpoint structures suspected of being used by Hamas and Hezbollah operatives. Complementing this, the Lavender system specializes in the identification, selection, and monitoring of individual human targets. To complete the targeting process, the system known as "Where's Daddy?" is employed to track these identified targets when they are in their residences, preparing for potential strike operations.

According to a report published by the Israeli online news magazine +972 Magazine, these systems assist Israeli forces in deciding whom to target, with minimal human intervention in the decision-making process. This approach represents a significant shift in the use of AI in military operations, blurring the line between human and machine decision-making.

This analysis aims to shed light on these three systems based on available information and explore the strategy Israel uses to integrate these technologies with its military personnel in its wars in Gaza and Lebanon. 

Human-Machine Team Strategy

In 2021, a book titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World" was published. Its author, Brigadier General Y.S., believed to be Yossi Sariel, head of Israel's elite Unit 8200, envisioned a machine capable of processing vast amounts of data at unprecedented speed and of generating thousands of potential military targets that could be struck with extraordinary speed and precision, overcoming the limits humans face in identifying targets and making strike decisions during a conflict.

Far from being a futuristic concept, this machine has already been deployed in the war in Gaza. Multiple Israeli and Western reports confirm that Israel has utilized several AI-powered systems in conjunction with its military personnel to identify targets, both individuals and buildings, marked for destruction.

These AI-driven systems analyze enormous volumes of data to identify and automatically make decisions about targets based on real-time, updated, and synchronized information. This technology significantly accelerates the process of identifying and striking targets with high precision, functioning as a "target production factory." Moreover, it addresses what the book's author describes as the "human bottleneck" in decision-making, where human limitations traditionally slow down the process of target identification and decisions regarding their destruction. Consequently, the process evolves into a large-scale production line for targets designated for elimination.

To implement this strategy effectively, Israel established a new military intelligence unit within its Military Intelligence Directorate in 2019, dubbed the "Targeting Unit." This classified unit played a pivotal role in Israel's response to Hamas's attack on October 7, 2023. Leveraging AI technologies, it expedited target identification and transmitted recommendations to military forces via a smart application known as "Pillar of Fire," utilized by military commanders.

The effectiveness of this strategy is evident in the staggering number of daily airstrikes and targets hit by Israeli forces. Over 1,000 targets were struck daily, accompanied by numerous precision assassinations of enemy military commanders. While intelligence agencies undoubtedly played a significant role, this remarkable achievement would not have been possible without a comprehensive strategy that seamlessly integrated human efforts with AI, creating a synergistic relationship between man and machine.

"The Gospel" – Building and Structure Targeting System

On November 2, 2023, the Israeli army's website briefly highlighted the use of "The Gospel" system in the war on Gaza for rapid target generation. This sophisticated system's primary function is to automatically identify potential military strike targets, particularly buildings and structures that might serve military purposes. Designed to streamline the target identification process and decision-making, "The Gospel" significantly reduces the need for human intervention.

While detailed information about the system's inner workings remains limited, it likely operates similarly to other advanced targeting systems. These typically rely on a combination of satellite imagery, drone footage, surveillance camera feeds, intercepted communications, and intelligence on the movements and behaviors of individuals and groups. The system ingests this diverse data, establishing connections to identify buildings used by Hamas or other potential military targets. To pinpoint areas of interest, "The Gospel" employs several indicators such as buildings with additional fortifications or reinforced structures, locations regularly visited by large groups, or sites that have undergone structural changes compared to older images.
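
Public reporting does not disclose how "The Gospel" weighs these indicators internally. Purely as an illustration of the kind of rule-based flagging the indicators above suggest, the following sketch applies them to hypothetical building records; every field name and threshold here is invented, not drawn from the system itself.

```python
from dataclasses import dataclass

@dataclass
class BuildingRecord:
    """Hypothetical record fused from imagery, signals, and movement data."""
    building_id: str
    reinforced: bool              # extra fortification visible in imagery
    daily_visitors: int           # estimated from movement patterns
    changed_since_baseline: bool  # differs from older satellite images

def flag_candidates(records, visitor_threshold=30):
    """Return IDs of buildings matching any of the reported indicators.

    A real pipeline would fuse far more sources and weigh evidence
    probabilistically; simple OR-ed rules are shown only for clarity.
    """
    return [
        r.building_id
        for r in records
        if r.reinforced
        or r.changed_since_baseline
        or r.daily_visitors >= visitor_threshold
    ]

# Example: only the second record trips an indicator.
records = [
    BuildingRecord("bldg-001", False, 8, False),
    BuildingRecord("bldg-002", True, 12, False),
]
print(flag_candidates(records))  # ['bldg-002']
```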

The system's capabilities extend beyond visual analysis. It utilizes multispectral image analysis to identify construction materials and to distinguish man-made structures, such as hidden tunnel entrances or trenches, from natural terrain. Furthermore, "The Gospel" analyzes logistical movements at potential sites to detect military activity, including the transport of weapons or heavy equipment, which often indicates military preparations or the presence of a military facility.

By integrating data from various sources – satellite imagery, radar, thermal sensors, and intelligence reports – "The Gospel" creates a comprehensive picture of Hamas activities and locations. This integration proved crucial in Israel's target list creation process.

The system's efficiency is remarkable. Aviv Kochavi, former Chief of General Staff of the Israeli army, noted that during Israel's 11-day war with Hamas in May 2021, "The Gospel" generated 100 targets daily, marking a significant increase from the previous rate of 50 targets per year in Gaza. Kochavi further stated that 50% of these generated targets were subsequently attacked.

According to the official Israeli military website, "The Gospel" system was developed by the renowned Unit 8200, an entity known for creating advanced intelligence and cyber technologies. The unit's expertise extends beyond "The Gospel": it is also widely reported to have been behind the Stuxnet worm that targeted Iran's nuclear program around 2009.

Despite its significant impact, "The Gospel" system is relatively new in the public domain. It was first mentioned on the official website in 2020 as one of the projects that received the Israeli military's innovation award, highlighting its recent development and rapid integration into military operations.

"Lavender" – Human Targeting System

The "Lavender" system played a pivotal role in constructing a database of human targets affiliated with Hamas, resulting in unprecedented widespread bombings of Palestinians, particularly during the initial phases of the current Gaza conflict. This sophisticated system was engineered to identify all suspected members of the military wings of Hamas and Islamic Jihad, including those of lower ranks, designating them as direct bombing targets.

The system operates by ingesting data about known Hamas members and utilizing this information to analyze individuals' patterns and behaviors, thereby learning their general traits. This process enables the system to compare these characteristics with other potential targets. It then meticulously sifts through surveillance data on nearly every individual in Gaza, encompassing photos, phone contacts, and other pertinent information, to assess the likelihood of someone being a militant. Key indicators include participation in WhatsApp groups with known militants, frequent changes in mobile phones (every few months), or regular changes of address. Based on these factors, Palestinians are categorized according to their similarity to militants, and any individual whose traits align with those of a militant becomes a target for assassination, with minimal human intervention – a process confirmed by an investigative report from +972 Magazine.

During the initial weeks of the Gaza war, the "Lavender" system identified approximately 37,000 Palestinians as suspected militants and potential airstrike targets. This identification was the result of analyzing data collected on most of Gaza's 2.3 million residents through an extensive mass surveillance system. The system classified the likelihood of each person being active in Hamas' or Islamic Jihad's military wing, employing a scale from 1 to 100 to indicate the probability of an individual being a militant or a Hamas member.
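
The reporting describes these behavioral indicators and a 1-to-100 likelihood scale, but not the model behind them. The toy scorer below, with entirely invented features and weights, shows the general shape of such feature-based classification and why crude proxies invite the misidentifications described next.

```python
# All feature names and weights are invented for illustration; nothing
# here reflects the actual model, which has not been made public.
FEATURE_WEIGHTS = {
    "group_chat_with_flagged_person": 40,  # a proxy civilians easily share
    "frequent_phone_changes": 30,
    "frequent_address_changes": 20,
    "other_signals": 10,
}

def likelihood_score(features: dict) -> int:
    """Map boolean features to a 1-100 score, as the reporting describes."""
    raw = sum(w for name, w in FEATURE_WEIGHTS.items() if features.get(name))
    return max(1, min(100, raw))

# A journalist who shares a group chat with a flagged source and swaps
# phones for safety scores 70: to this model, indistinguishable from
# the behavioral pattern it was built to flag.
civilian = {"group_chat_with_flagged_person": True, "frequent_phone_changes": True}
print(likelihood_score(civilian))  # 70
```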

Once "Lavender" determines that an individual is a Hamas militant, they become a target for bombing without the need for independent verification of the system's selection criteria or a review of the primary intelligence data upon which it relied. According to soldiers who worked with the system, as cited by the magazine, "Lavender" occasionally misidentifies individuals as targets simply because their communication patterns resemble those of Hamas or Islamic Jihad members. This misidentification can include police officers, civil defense workers, relatives of militants, residents who happen to share the same first and last name as a militant, or individuals in Gaza using a device that previously belonged to a Hamas member.

Furthermore, training the system based on communication profiles has made "Lavender" more susceptible to erroneously selecting civilians when applying its algorithms to the general population. A common error occurred when a targeted Hamas member transferred their phone to a son, older brother, or another man, resulting in that person being bombed in their home along with their family.

"Where's Daddy?" – Home Targeting System

This system was specifically designed to track targeted individuals and carry out bombings when they enter their family homes, rather than attacking them during military activity. From an intelligence perspective, the likelihood of successfully hitting these individuals is much higher when they are at home, compared to military areas that often have fortifications. However, this approach fails to consider a crucial ethical concern: the attack targets an entire family, not just a single Hamas member. Consequently, this has been another significant factor contributing to the increased civilian death toll.

The system's effectiveness stems from the fact that every person in Gaza has a private home that can be linked to them, allowing Israeli military surveillance systems to easily and automatically associate individuals with their family residences. To capitalize on this, sophisticated programs have been developed to track thousands of individuals simultaneously and in real-time. When these systems identify the moment targeted individuals enter their homes, they send an automatic alert to a targeting officer, who then promptly strikes the house, affecting everyone inside.
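
How this tracking is implemented has not been disclosed. As a purely illustrative sketch under that caveat, a home-entry alert reduces to a geofence check: a stored home location per tracked ID and a radius test against each incoming position fix. All names, coordinates, and thresholds below are fabricated.

```python
import math

# Fabricated data: one tracked ID mapped to an invented home coordinate.
HOME_LOCATIONS = {"person-123": (31.5017, 34.4668)}
ALERT_RADIUS_M = 50  # how close a fix must be to count as "at home"

def meters_between(a, b):
    """Approximate ground distance between two (lat, lon) pairs."""
    lat_m = (a[0] - b[0]) * 111_320
    lon_m = (a[1] - b[1]) * 111_320 * math.cos(math.radians(a[0]))
    return math.hypot(lat_m, lon_m)

def home_entry_alert(person_id, fix):
    """True when a position fix falls inside the stored home geofence."""
    home = HOME_LOCATIONS.get(person_id)
    return home is not None and meters_between(fix, home) <= ALERT_RADIUS_M

print(home_entry_alert("person-123", (31.5018, 34.4667)))  # True
```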

The combination of the "Lavender" system (previously mentioned) and the "Where's Daddy?" system has led to devastating and deadly outcomes for civilian populations. This approach, while potentially effective from a military standpoint, raises serious questions about the ethical implications and long-term consequences of such indiscriminate targeting methods.

Fatal Errors

These technologies have revolutionized the process of killing, making it more automated and destructive while blurring the line between civilian and military targets. A soldier now receives a computer-generated list of targets, often without understanding the criteria behind their selection. This raises critical questions: How accurate are these machine-generated recommendations? How much error do they carry? Can we be certain that all identified targets are indeed the intended ones? Or have these systems, in sweeping up anyone linked to Hamas, simply disregarded low confidence scores?

Furthermore, we must consider the quality of the data used to train these systems, the inherent biases in their algorithms, and potential errors in threat assessment. Such flaws are highly probable when dealing with intelligent systems, and when their primary function is to kill, the risk of targeting unintended or inappropriate targets increases significantly.

Moreover, the use of disproportionate destructive force relative to the size of the target can lead to an alarmingly high number of unintended casualties. In this scenario, these machines become blind rather than intelligent; their objective shifts from precise target selection to comprehensive destruction. The tragic result of this approach is evident in the staggering death toll, which has exceeded 41,000 in Gaza alone.