zate, to Random stuff
@zate@infosec.exchange

Anyone worked on / working on any kind of risk classification/vector-type measurement for the usage of AI within enterprises?

Basically, a way to classify a use case based on the risk it poses to the business.

Looking for others to chat with about it.
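
For the sake of discussion, here is a minimal sketch of what such a classification could look like: a use case scored along a few risk dimensions, then collapsed into a tier. Everything here (the dimensions, weights, and thresholds) is a hypothetical starting point, not an established framework.

```python
# Illustrative only: dimensions, weights, and thresholds are assumptions,
# not a standard enterprise AI risk framework.
from dataclasses import dataclass

@dataclass
class AIUseCase:
    name: str
    data_sensitivity: int   # 0-3: public data .. regulated/PII
    decision_autonomy: int  # 0-3: human-in-the-loop .. fully automated
    business_impact: int    # 0-3: cosmetic .. safety/financial critical
    external_exposure: int  # 0-3: internal tool .. customer-facing

def risk_tier(uc: AIUseCase) -> str:
    """Collapse the risk vector into a coarse tier (illustrative weights)."""
    score = (2 * uc.data_sensitivity + 2 * uc.decision_autonomy
             + uc.business_impact + uc.external_exposure)
    if score >= 12:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

print(risk_tier(AIUseCase("support chatbot", 2, 1, 1, 3)))  # -> medium
```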

estelle, to Random stuff
@estelle@techhub.social

The terrible human toll in Gaza has many causes.
A chilling investigation by +972 highlights the role of “efficiency”:

  1. An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”

  2. An AI outputs “100 targets a day”, like a factory that delivers murder:

"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"

  1. "The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."

🧶

estelle OP
@estelle@techhub.social

The first AI war was in May 2021.

IDI stands for the Intelligence Division of the Israeli army. Here is some praise of its use of technology:

May 2021 "is the first time that the intelligence services have played such a transformative role at the tactical level.

This is the result of a strategic shift made by the IDI [in] recent years. Revisiting its role in military operations, it established a comprehensive, “one-stop-shop” intelligence war machine, gathering all relevant players in intelligence planning and direction, collection, processing and exploitation, analysis and production, and dissemination process (PCPAD)".

Avi Kalo: https://www.frost.com/frost-perspectives/ai-enhanced-military-intelligence-warfare-precedent-lessons-from-idfs-operation-guardian-of-the-walls/

(to be continued) 🧶

@ethics

estelle OP
@estelle@techhub.social

Behind any aircraft that takes off for an attack, there are thousands of soldiers, men and women, who make the information accessible to the pilot. "They produce the targets and make the targets accessible. To set a target, it’s a process with lots of factors that need to be approved. The achievement, the collateral damage and the level of accuracy. For that, you have to interconnect intelligence, (weapon) fire, C4I [an integrated military communications system, including the interaction of troops, intelligence and communication equipment] and more," said Nati Cohen, currently a reservist in the Exercises Division of the C4I Division of the army.

Published in 2021 on the army's website: https://israeldefense.co.il/en/node/50155 @military

estelle OP
@estelle@techhub.social

“Levy describes a system that has almost reached perfection. The political echelon wants to maintain the status quo, and the military provides it with legitimacy in exchange for funds and status.”

“Levy points out the gradual withdrawal of the old Ashkenazi middle class from the ranks of the combat forces[…]:
• the military’s complete reliance on technology as a decisive factor in warfare;
• the adoption of the concept […] of an army that is “small and lethal”;
• the obsession with the idea of decisiveness, which is supposed to negate the other side’s will to fight; and
• the complete addiction to the status quo as the only possible and desirable state of affairs.”

https://www.972mag.com/yagil-levy-army-middle-class/ @israel @ethics @military @idf

18+ estelle OP
@estelle@techhub.social

Here is a follow-up to Yuval Abraham's investigation:

"The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties"
https://www.972mag.com/lavender-ai-israeli-army-gaza/

@israel @ethics @military @idf

estelle OP
@estelle@techhub.social

It was easier to locate the individuals in their private houses.

“We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Yuval Abraham reports: https://www.972mag.com/lavender-ai-israeli-army-gaza/

(to follow) 🧶 @palestine @israel @ethics @military @idf @terrorism

18+ estelle OP
@estelle@techhub.social

The current commander of Israeli intelligence wrote a book that was released in English in 2021. In it, he describes human personnel as a “bottleneck” that limits the army’s capacity during a military operation. The commander laments: “We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day.”

So people invented Lavender, a machine that marks persons as targets using AI. Then the army decided to designate all operatives of Hamas’ military wing as human targets, regardless of their rank or military importance.
Senior officer B.: “They wanted to allow us to attack [the junior operatives] automatically. That’s the Holy Grail. Once you go automatic, target generation goes crazy.”

https://www.972mag.com/lavender-ai-israeli-army-gaza/

estelle OP ,
@estelle@techhub.social avatar

“The protocol was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it,” said a source who used Lavender.

“It has proven itself,” said B., the senior officer. “There’s something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of [bombings] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

Another intelligence source said: “In war, there is no time to incriminate every target. So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it.”

https://www.972mag.com/lavender-ai-israeli-army-gaza/ @israel @data
