thehopper.news
Israel using AI to pick targets in Gaza – report


Published: April 4, 2024 4:54 pm
Author: RT

The program is reportedly designed to detect Hamas operatives, but Israeli military sources say it often marks innocents for death

The Israeli military is using artificial intelligence to mark suspected Palestinian militants for assassination with little human oversight or regard for civilian casualties, the Israeli-Palestinian +972 Magazine reported on Wednesday.

The AI system, known as ‘Lavender’, is designed to comb through the personal data of Gaza’s two million residents to draw up lists of those suspected of serving in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), six Israeli intelligence officers told the magazine.

The IDF has never publicly acknowledged the existence of the system, but has been known to use similar software during previous operations in Gaza.

At the outset of Israel’s ongoing war on Hamas, Lavender marked 37,000 Palestinians as militants and placed them on kill lists, the sources claimed. Whereas Israel Defense Forces (IDF) personnel initially pored over these lists and manually verified each name, humans soon came to serve as rubber stamps for the machine’s lists, one source said.


“I would invest 20 seconds for each target at this stage, and do dozens of them every day,” the officer said. “I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time.”

Lavender works by studying phone records, social media activity, photographs, and movements of known Palestinian militants, identifying common characteristics, and then searching for these characteristics among the wider population of Gaza. The system gives each Gazan a score of between 0 and 100, with those ranked near 100 deemed to be terrorists and therefore legitimate targets. 

Within weeks of the war breaking out, however, IDF commanders were allegedly instructing their subordinates to relax these selection criteria and approve strikes on targets only tangentially linked with Hamas.

“We were told: now we have to f**k up Hamas, no matter what the cost. Whatever you can, you bomb,” one source recalled.

Once marked for assassination, low-level targets would be taken out in their homes – identified using a different AI system called ‘Gospel’ – with unguided bombs, while more precise munitions would be used on higher-ranking militants.


“At 5 am, [the air force] would come and bomb all the houses that we had marked,” a source said. “We took out thousands of people. We didn’t go through them one by one – we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house.”

Lavender reportedly ranks targets by their perceived importance, with one source alleging that lower priority names often include policemen, civil servants, and others who “help the Hamas government, but they don’t really endanger [Israeli] soldiers.”

When a suspect is chosen and an assassination order given, IDF commanders decide how many civilian casualties they deem acceptable to take out the target. According to a source, this number “went up and down” over time, with “20 uninvolved civilians” deemed an acceptable sacrifice at the beginning of the war, and up to 100 considered okay in strikes on top-ranking Hamas officials. 

“It’s not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law,” they said. “But they directly tell you: ‘You are allowed to kill them along with many civilians.’”



According to the latest figures from Gaza’s health ministry, Israeli forces have killed more than 33,000 people in nearly six months of fighting in the enclave, most of them women and children. Responding to +972 Magazine’s claims, the IDF said on Wednesday that it “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” and that it “outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.”


Tags: Russia Today

Copyright © 2023 The Hopper News
