AI Weapons, Used In Places Like Gaza, Should Be Prohibited

This is contributed content by Manouk (Mark) Termaaten, Founder and CEO at Vertical Studio AI.

AI weapons are being used on the battlefield when they should be banned. It is time for countries to sign agreements prohibiting the use of these weapons in war.

US tech giants are supplying the AI and computing tools used to monitor and kill alleged militants in Gaza and Lebanon. Despite the sophistication of this technology, the civilian death toll has fueled concerns that the tools cause the deaths of innocent people, possibly due to defective data and flawed algorithms.

For example, in October 2023, the Israel Defense Forces (IDF) reportedly combined AI with older technology to find Ibrahim Biari, the commander of Hamas’s Central Jabaliya Battalion. The strike resulted in twelve civilian deaths, according to reports. Reportedly, all targets identified over the past three years have been found with the aid of AI.

The IDF calls AI a “game changer” for war. Officials describe AI as a “force multiplier” that lets them improve their strike capabilities. The Israeli military described its 11-day bombing campaign against Hamas in May 2021 as the “first AI war.”

The IDF’s employment of AI in its wars has been made possible by soldiers and reservists working in the technology industry, according to a report in The New York Times.

The IDF’s use of these weapons is multifaceted. For example, the IDF has deployed a chatbot trained in Arabic dialects as a way of monitoring public sentiment.

Its tools also include an AI audio program capable of finding targets based on sounds “such as sonic bombs and airstrikes.”

Moreover, the IDF has deployed facial recognition technology designed to identify obscured or injured faces. That equipment has at times misidentified people.

The IDF’s virtual reality system aids soldiers in scanning urban zones.

The AI, however, has shown flaws. The Times reported that some officials had ethical qualms about the use of the technology in war. Their concerns revolve around increased surveillance, civilian deaths, and the arrest of innocent people.

As part of its Lavender program, the IDF has reportedly compiled a list of 37,000 targets based on their connections to Hamas. The military denies that the list exists. Lavender ranked people on a 0-100 scale based on the likelihood that they were militants. The ranking draws on a variety of factors, including a person’s family ties.
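To see why this kind of scoring is so error-prone, consider a minimal sketch of a 0-100 risk classifier. Nothing below reflects the actual Lavender system, whose features, weights, and cutoff are not public; every feature name, weight, and threshold here is invented purely to illustrate how a civilian can cross a flagging threshold by sharing incidental traits with past targets.

```python
# Purely illustrative sketch of a 0-100 risk-scoring classifier.
# This is NOT the Lavender system: its real features, weights, and
# threshold are not public. All values below are invented.

from dataclasses import dataclass

@dataclass
class Person:
    name: str
    family_tie_to_militant: bool   # hypothetical feature
    in_flagged_chat_group: bool    # hypothetical feature
    frequent_phone_swaps: bool     # hypothetical feature

# Invented weights chosen to sum to 100, giving a 0-100 scale.
WEIGHTS = {
    "family_tie_to_militant": 40,
    "in_flagged_chat_group": 35,
    "frequent_phone_swaps": 25,
}

FLAG_THRESHOLD = 60  # invented cutoff above which a person is "flagged"

def risk_score(p: Person) -> int:
    """Sum the weights of every feature the person exhibits."""
    return sum(w for feature, w in WEIGHTS.items() if getattr(p, feature))

# A civilian with a militant relative who uses a second-hand phone
# scores 40 + 25 = 65 and is flagged, despite doing nothing himself.
civilian = Person("illustrative civilian", True, False, True)
print(risk_score(civilian))                    # 65
print(risk_score(civilian) >= FLAG_THRESHOLD)  # True
```

The point of the sketch is structural, not factual: when score components correlate with ordinary civilian life, any fixed threshold guarantees false positives at scale.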

The Studio, the IDF’s innovation hub inside Unit 8200, has developed an AI language model in various Arabic dialects, trained on decades of intercepted texts, phone calls, and social media posts. The system is crude enough that intelligence officials must double-check its output.

Project Nimbus is a $1.2 billion cloud computing contract between the Israeli government and military and Google and Amazon. The project aims to provide cloud infrastructure, AI, and other services to support the digital transformation of the Israeli government and its military.

Google, a major contractor for the Israeli government since 2021, has provided services to Israel’s defense ministry and defense forces, including access to its latest technology, under Project Nimbus, a cloud computing contract aimed at upgrading the Israeli government’s computing infrastructure.

Gaby Portnoy, director general of the Israel National Cyber Directorate, has confirmed that the program is used for combat. Google also offers Vertex AI, which clients can use to train AI algorithms on their own data.

In November 2024, a Google employee requested access to Gemini AI technology on behalf of the IDF, which wanted an AI assistant to process documents and audio. The Israeli government has also contracted Amazon.

“Thanks to the Nimbus public cloud, phenomenal things are happening during the fighting; these things play a significant part in the victory. I will not elaborate,” Portnoy said.

Ever since the IDF opened its front in Gaza, it has used Habsora (“the Gospel”), a tool built in-house that provided commanders with thousands of human and infrastructure targets for bombing, contributing to the violence in Gaza. Habsora is built from a collection of algorithms.

These algorithms study data, including intercepted communications and satellite imagery, to determine where military targets are, though some Israeli commanders have raised alarms about the system’s accuracy. Some express concern that the military places too much confidence in the technology’s recommendations.

Mistakes can still occur because of the many factors involved in AI, according to Israeli military officials working on targeting systems and other tech experts. One intelligence official said he had seen targeting errors that stemmed from faulty machine translation from Arabic to Hebrew.

The IDF has long been at the forefront of deploying artificial intelligence for military use. In early 2021, it launched the Gospel, an AI tool that sifted through a wide range of digitized information to suggest targets for potential strikes. It also developed Lavender, which uses machine learning to filter intelligence databases against requested criteria and narrow down lists of potential targets, including people.

Commercial AI models are being used in war. Cloud computing and AI are used to deploy bombs and munitions. Tech companies are empowering Israel’s military with digital weapons.

These AI weapons are deeply flawed. They should be prohibited. Clearly, AI-powered weapons raise ethical concerns. AI does not distinguish between combatants and civilians, does not understand proportionality, and puts civilians at greater risk. AI can undermine global security, create a new breed of weapons, make these tools available to terrorist groups, and more. Countries need to sign an agreement that prohibits the use of these weapons.


Disclaimer: This is a contributed article, a free service that allows blockchain, crypto, and AI industry professionals to share their experiences or opinions with the Alexablockchain audience. The above content was not created or reviewed by the Alexablockchain team, and Alexablockchain expressly disclaims all warranties, whether express or implied, regarding the accuracy, quality, or reliability of the content. Alexablockchain does not guarantee, endorse, or accept responsibility for the content in any way. This article is not intended to serve as investment advice. Readers are advised to independently verify the accuracy and relevance of any information provided before making any decisions based on the content. To submit an article, please contact us by email.

Image credits: Canva
