The Israeli military's electronic spying Unit 8200 has used a vast collection of intercepted communications to develop an artificial intelligence model similar to ChatGPT aimed at snooping on Palestinians, a new investigation reveals.
The probe, whose findings were released on Thursday, was conducted by the British daily The Guardian, the Israeli-Palestinian publication +972 Magazine, and the Hebrew-language outlet Local Call.
It said Unit 8200 trained the AI model to understand spoken Arabic using large volumes of telephone conversations and text messages, obtained through its extensive surveillance of the occupied territories.
The spying unit accelerated its development of the tool after the start of Israel's genocidal war on the Gaza Strip on October 7, 2023, and trained it in the second half of 2024, the probe noted, adding that it is not clear whether the model has yet been deployed by the occupation's army.
“We tried to create the largest dataset possible [and] collect all the data ... Israel has ever had in Arabic,” former intelligence official Chaked Roger Joseph Sayedoff told a military AI conference in Tel Aviv last year. The model, he emphasized, required “psychotic amounts” of data.
Meanwhile, sources familiar with the project said the sophisticated chatbot-like tool is capable of answering questions about people and providing insights into the massive volumes of surveillance data.
“AI amplifies power,” one of the sources said. “It’s not just about preventing shooting attacks, I can track human rights activists, monitor Palestinian construction in Area C [of the occupied West Bank]. I have more tools to know what every person in the West Bank is doing.”
'Palestinians subjects in Israel’s laboratory'
Nadim Nashif, director of the digital rights and activism group 7amleh, said Palestinians have “become subjects in Israel’s laboratory to develop these techniques and weaponize AI, all for the purpose of maintaining [an] apartheid and occupation regime."
Zach Campbell, a senior surveillance researcher at Human Rights Watch (HRW), said using surveillance material to train an AI model was “invasive and incompatible with human rights."
As an occupying regime, Israel is obligated to protect Palestinians’ privacy rights, he added. “We’re talking about highly personal data taken from people who are not suspected of a crime, being used to train a tool that could then help establish suspicion."
During the Gaza onslaught, Israel used AI-powered tools such as Gospel and Lavender to generate thousands of targets for deadly airstrikes and assassinations.
Israel launched its brutal aggression against Gaza after the Hamas resistance group carried out a historic operation against the usurping entity in retaliation for its intensified atrocities against the Palestinian people.
The Tel Aviv regime failed to achieve its declared objectives of freeing captives and eliminating Hamas despite killing at least 48,446 Palestinians, mostly women and children, in Gaza.
Israel accepted Hamas’s longstanding negotiation terms under a three-phase Gaza ceasefire, which began on January 19.
Later, however, Israel disrupted the truce by refusing to move forward to its second stage.