How does Israel’s ‘Gospel’ AI system, which selects bombing targets in seconds, work?

In today’s era, wars are no longer fought only with bullets and tanks; they are also decided by code and algorithms written in closed computer rooms. Imagine a machine that scans data on thousands of people in the blink of an eye and tells you, within seconds, which person should be targeted or which building should be bombed. This is not the plot of a science-fiction film but reality. Israel’s army is using just such an Artificial Intelligence (AI) system, called ‘Habsora’ or ‘Gospel’.

This system works like a ‘digital eye’ on the battlefield, picking out targets that humans might not find even after weeks of hard work. But is it right to trust a machine this much? Can software decide whose life should be taken and whose should be spared? The system has drawn controversy and criticism, yet Israel continues to use this technology indiscriminately. Let us understand what this ‘Gospel’ system is and why it has become a topic of discussion around the world.

What is Gospel, and how does it work?

‘Gospel’ or ‘Habsora’ is an extremely powerful AI-based system developed by the Israeli military, prepared by Israel’s famous intelligence unit, ‘Unit 8200’. In essence, it is a huge ‘data-processing plant’ that works day and night, collecting information from different sources:

  • Satellite photos: images of every activity, large and small, taken from the sky.
  • Drone footage: live video from drones flying over the war zone.
  • Electronic signals: mobile-phone conversations, radio messages and internet use.
  • Historical records: existing databases of enemy locations.

This AI system connects millions of pieces of information within seconds. It flags the house with suspicious movement, the place where weapons might be stockpiled, or the site from which rockets might be fired. It then compiles a list of those places and suggests them to army officers for bombing.
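Gospel’s actual internals are classified, so nothing in the article describes how the fusion step really works. Purely as an illustrative sketch, multi-source data fusion of this kind can be imagined as scoring each location by how many independent sources flag it and suggesting only the locations above a threshold. All names, data and the threshold below are invented for illustration:

```python
# Illustrative toy sketch of multi-source "data fusion": each location's
# suspicion is measured by how many *distinct* sources flagged it, and
# only locations at or above a threshold are suggested for human review.
# All source names, locations and the threshold are invented.

def suggest_targets(reports, threshold=3):
    """reports: list of (source_name, location) sighting tuples.
    Returns locations flagged by at least `threshold` distinct sources."""
    sources_per_location = {}
    for source, location in reports:
        sources_per_location.setdefault(location, set()).add(source)
    return sorted(
        loc for loc, srcs in sources_per_location.items()
        if len(srcs) >= threshold
    )

reports = [
    ("satellite", "building_A"), ("drone", "building_A"),
    ("signals", "building_A"), ("satellite", "building_B"),
    ("records", "building_B"),
]
print(suggest_targets(reports))  # only building_A has 3 distinct sources
```

The point of the sketch is only the article’s claim that the system cross-references many streams at once; a single source is never enough on its own to put a location on the list.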

Its speed: 50 times faster than humans

The most surprising feature of this AI system is its incredible speed, many times faster than the human brain. In the past, about 20 experienced military intelligence officers could barely select 50 to 100 targets in a year of hard work, scanning maps, photographs and intelligence documents. Gospel (Habsora) has completely changed this traditional method of warfare: where humans would take months to find targets, the system identifies more than 200 precise military targets within just 10 to 12 days.

In today’s digital warfare, its capability has grown to the point where the system alone can suggest about 100 possible targets in a single day. That is why it is known within the Israeli Army as the ‘Target Factory’. Just as a modern factory produces goods at speed through machines, this AI system keeps producing lists of new battlefield targets without tiring and without stopping. Technically speaking, it is proving to be about 50 times more productive than humans.

Used extensively in Gaza war

Israel gave a glimpse of this system during the 2021 conflict, but it was used on a large scale in the 2023 Gaza war. According to published figures, the Israeli Army identified more than 12,000 potential targets with its help.

And Gospel does not work alone: another system, called ‘Fire Factory’, works alongside it. As soon as Gospel selects a target, Fire Factory decides which fighter jet will attack it, how many kilograms of munitions will be used, and at what time the strike will be carried out. In other words, everything from selecting the target to attacking it happens under computer control.

What is the difference between Gospel and Lavender?

In Israel’s military technology, ‘Gospel’ and ‘Lavender’ are two different tools that complement each other. It is important to understand the difference, because their purposes are entirely distinct: while the main function of ‘Gospel’ (Habsora) is to identify physical structures (buildings, offices, tunnels or suspicious houses) as targets, ‘Lavender’ focuses entirely on people.

The workings of the Lavender system sound like something out of a horror film. It reportedly scans data on about 2.3 million residents of Gaza and uses AI to give each person a score between 1 and 100. The score is based on how closely the person’s activities resemble those of members of an extremist organization; anyone with a high enough score is categorized by the system as a ‘terrorist’. Simply put, Gospel decides ‘where’ to bomb, while Lavender decides ‘who’ to target.
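Lavender’s actual model has never been published; the article only says each person receives a 1-to-100 score based on behavioral similarity. As a purely hypothetical sketch of that idea, such a score could be a weighted sum over matched ‘suspicious behavior’ features, clamped to the 1–100 range. Every feature name and weight below is invented:

```python
# Hypothetical sketch of a 1-100 behavioral similarity score, loosely
# mirroring the article's description. Features, weights and the profile
# are invented; real systems of this kind are classified.

PROFILE_WEIGHTS = {                    # invented "suspicious behavior" features
    "contacts_flagged_number": 40,
    "visits_flagged_location": 30,
    "shares_device_with_suspect": 30,
}

def similarity_score(observed_features):
    """Sum the weights of matched features, clamped to the 1-100 range."""
    raw = sum(PROFILE_WEIGHTS.get(f, 0) for f in observed_features)
    return max(1, min(100, raw))

print(similarity_score({"contacts_flagged_number"}))   # partial match -> 40
print(similarity_score(set(PROFILE_WEIGHTS)))          # full match -> 100
print(similarity_score(set()))                         # no match -> floor of 1
```

Even this toy version shows the critics’ point in the next section: one accidental feature match (say, briefly sharing a phone with a suspect) moves an innocent person’s score, and the model has no way to represent context.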

Why is there controversy over AI systems?

When a machine selects a target, it only looks at data, not emotions. And this is what makes this system controversial. Critics say:

  • Margin of error: if an innocent citizen’s phone is accidentally linked to a suspect, the AI can put that person on the target list too.
  • Collateral damage: the machine reports that there are weapons in a building, but it cannot see whether children are playing next to it or whether a hospital stands nearby.
  • Lack of accountability: if an innocent person dies because of an AI mistake, who is responsible? The engineer who wrote the software, or the officer who simply pressed a button on the computer?

Israeli military claims versus criticism

Experts say we have reached an era in which the outcome of war depends on the OODA loop (Observe, Orient, Decide, Act). AI speeds this cycle up so much that humans barely get a chance to think.

The Israeli military claims that the final decision is always taken by a human being, but critics argue that when the system is producing dozens of targets every minute, it is impossible for a single person to examine each one in detail. In practice, the reviewer simply trusts the machine and keeps pressing the ‘OK’ button.

Israel’s ‘Gospel’ system is a great technological success, but it is also a grave warning to humanity: it is making war more ‘efficient’ while increasing the risk of making it more ‘cruel’.
