Palestinian Territories – The role of major technology companies and international social media platforms in the killing of Palestinian civilians during Israel’s genocidal war on the Gaza Strip, which began on October 7, 2023, must be investigated. If these companies are found to have colluded, or to have failed to take adequate precautions to prevent access to or misuse of users’ information, they must be held accountable. They must ensure that their services are not misused in conflict zones and that user privacy is respected.
There are frequent reports that Israel uses a number of technological systems supported by artificial intelligence, such as Where’s Daddy?, Fire Factory, Gospel, and Lavender, to illegally track and monitor Palestinians. These systems draw on potentially relevant information that is usually unrelated to the location or individual in question, searching for similarities and patterns among all residents of the Gaza Strip, especially men and members of armed factions, so that potential suspects can be identified and classified as legitimate targets.
Studies have shown that the Israeli military does not verify the accuracy of the information provided by these systems, even though it recognizes that their nature, and in particular their inability to provide accurate real-time information about the whereabouts of people on target lists, can lead to significant errors.
For example, the Israeli military uses the Lavender system extensively to identify suspects in the Strip before they are targeted, a practice that intentionally results in large numbers of civilian casualties.
The Lavender system relies on the logic of probability, a distinguishing characteristic of machine learning algorithms. The algorithm examines large datasets for patterns that correspond to combatant behavior; the quantity and quality of the data determine how well the algorithm can find these patterns. It then recommends targets based on probability.
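To make the statistical problem concrete, the sketch below shows, purely as a generic illustration of probabilistic scoring and not as a reconstruction of any actual military system, how a classification threshold behaves when applied to a large population. All numbers, distributions, and the threshold value are invented for the example.

```python
# Generic, hypothetical illustration of probabilistic classification.
# It demonstrates a statistical property of all such scoring systems:
# even a very small per-person error rate produces a large absolute
# number of false positives when an entire population is scored.
# Nothing here models any real system; every value is invented.
import random

random.seed(42)

POPULATION = 1_000_000      # hypothetical number of people scored
GENUINE_MATCHES = 1_000     # hypothetical number of true matches

# Synthetic "pattern similarity" scores in [0, 1].
match_scores = [random.betavariate(8, 2) for _ in range(GENUINE_MATCHES)]
other_scores = [random.betavariate(2, 8)
                for _ in range(POPULATION - GENUINE_MATCHES)]

THRESHOLD = 0.7             # operator-chosen classification cut-off

flagged_true = sum(s >= THRESHOLD for s in match_scores)
flagged_false = sum(s >= THRESHOLD for s in other_scores)

print(f"genuine matches flagged: {flagged_true} of {GENUINE_MATCHES}")
print(f"people wrongly flagged : {flagged_false} of {POPULATION - GENUINE_MATCHES}")
```

Under these invented distributions the false-positive rate is a small fraction of one percent, yet the number of people wrongly flagged is of the same order as the number of genuine matches; this base-rate effect is why unverified probabilistic output cannot meet minimum standards for accurate target assessment.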
Amid concerns about the Lavender system’s potential reliance on the tracking of social media accounts, Israeli military and intelligence sources have admitted that potential targets were attacked without regard for the principles of proportionality or collateral damage.
These suspicions are supported by The Human Machine Team, a book written by the current commander of Israel’s elite Unit 8200, which describes how to build a “targeting machine” similar to the Lavender artificial intelligence system. The book details hundreds of signals that can raise the severity of a person’s classification, such as changing mobile phones every few months, moving addresses frequently, or simply being in the same group as a “combatant” on Meta’s WhatsApp application.
Additionally, it was recently revealed that Google and Israel are collaborating on several technology initiatives, including Project Nimbus, which provides the Israeli military with tools for increased surveillance and the illegal collection of Palestinians’ data, expanding Israel’s policies of denial, persecution, and other crimes against Palestinians. The project in particular has sparked significant human rights criticism, prompting dozens of employees to protest and resign; some were fired for their protest activities.
The Israeli military also uses Google Photos’ facial recognition capabilities to monitor Palestinian civilians in the Gaza Strip and create “target lists,” collecting as many images as possible from the October 7 event known as the Al-Aqsa Flood, in which Palestinians were shown breaking through the separation fence and entering Israeli settlements. The technology was then used to classify photos and store facial images, in violation of the company’s explicit rules and the United Nations Guiding Principles on Business and Human Rights, and thousands of Palestinians leaving the Gaza Strip were recently arrested on this basis.
Euro-Med Monitor’s field team has documented cases of Palestinian civilians who were singled out as suspects by Israel as a result of their social media activity, despite having taken no part in military action.
For example, a young Palestinian man, who requested to be identified only as “AF” due to security concerns, was seriously injured in an Israeli bombing raid on a house in Gaza City’s Al-Sabra neighborhood.
The home was targeted shortly after AF posted a video clip to his account on Meta’s Instagram platform in which he joked that he was on a “field reconnaissance mission.”
His relatives told Euro-Med Monitor that AF was simply trying to imitate a journalist when he posted the short video clip to his personal Instagram account; shortly afterwards, however, he was targeted by an Israeli reconnaissance aircraft while on the roof of his house.
Another Israeli bombing, on April 16, claimed the lives of six young Palestinians who had gathered to access internet services. One of the victims had been using a group chat on Meta’s subsidiary WhatsApp to report news about Gaza City’s Sheikh Radwan district.
Relatives of the deceased, who requested anonymity due to safety concerns, told Euro-Med Monitor that the victims had been near an internet access point when the group was hit by a missile fired from an Israeli reconnaissance aircraft. The victims had voluntarily shared news about the Israeli attacks and the humanitarian situation in Sheikh Radwan with their families and with public groups on the WhatsApp application.
These incidents take place within a larger framework: the Israeli military’s covert strategy of basing highly destructive air and artillery attacks on data, from cell phone records and photos to social media contacts and communication patterns, that falls short of minimum standards for accurate target assessment. This extraordinarily indiscriminate approach to killing is extremely concerning.
Evidence presented by global technical experts points to a possible link between Meta and the Israeli military’s use of the Lavender system, which identifies targets for military attacks on the Gaza Strip. It suggests that the Israeli military may have targeted individuals simply because they were in a WhatsApp group with other people on a suspect list. Experts also question how Israel was able to obtain this data unless Meta disclosed it.
Earlier, the British newspaper The Guardian revealed that Israel had used the Lavender artificial intelligence system in the killing of large numbers of Palestinian civilians. The Israeli military used machine learning systems to identify potential “low-ranking” combatants without individually assessing collateral damage, instead adopting a blanket “tolerance range” that permitted 20 civilian deaths for each target eliminated; when “higher-ranking” fighters were targeted, this tolerance rose to 100 deaths per fighter.
Google, Meta, and other technology and social media companies may therefore have colluded with Israel in crimes and violations against Palestinians, including extrajudicial killings, in violation of international law and these companies’ stated human rights commitments.
Social media platforms must not share this type of personal data about their users or participate in the Israeli genocide against Palestinian civilians in the Gaza Strip. An international investigation is necessary to ensure accountability for those responsible and justice for the victims.
Meta’s blatant and obvious bias in favor of Israel, its drastic suppression of content supporting the Palestinian cause, its policy of suppressing criticism of Israeli crimes, and reports of close ties between Meta executives and Israel all suggest that the company may be implicated in the killing of Palestinian civilians.
Given the Israeli military’s failure to take reasonable steps to demonstrate that its targets are legitimate under international humanitarian law, the aforementioned companies must end all cooperation with it, fully commit to upholding the rights of the Palestinian people, and deny Israel access to data and information that endangers Palestinians’ lives.
Israel’s failure to exercise due diligence and consider human rights when using artificial intelligence for military purposes, as well as its failure to comply with international law and international humanitarian law, requires immediate investigation.
These companies must promptly address the information circulating about their involvement in Israeli crimes against Palestinians. They should launch serious investigations, where appropriate, into their policies and practices related to Israeli crimes and human rights abuses, and take steps to ensure that they are neither complicit in crimes nor allowing user information to be exploited for criminal activity. They must be held liable if it is found that they failed to take reasonable precautions to prevent such harm.
