Bladed ‘Ninja’ missile used to kill al-Qaida leader is part of a scary new generation of unregulated weapons


PTI, Aug 6, 2022, 10:32 AM IST

The recent killing of al-Qaida leader Ayman al-Zawahiri by a CIA drone strike was the latest US response to 9/11. Politically, it amplified existing distrust between US leaders and the Taliban government in Afghanistan. The killing also exposed compromises in the 2020 Doha peace agreement between the US and the Taliban.

But another story is emerging with wider implications: the speed and nature of international weapons development. Take the weapon reportedly used to kill al-Zawahiri: the Hellfire R9X “Ninja” missile.

The Hellfire missile was originally conceived in the 1970s and 80s to destroy Soviet tanks. Rapid improvements from the 1990s onwards have resulted in multiple variations with different capabilities. They can be launched from helicopters or Reaper drones. Their different explosive payloads can be set off in different ways: on impact or before impact.

Then there is the Hellfire R9X “Ninja”. It is not new, though it has remained largely in the shadows for five years. It was reportedly used in 2017 in Syria to kill the deputy al-Qaida leader, Abu Khayr al-Masri.

The Ninja missile does not rely on an explosive warhead to destroy or kill its target. It uses the speed, accuracy and kinetic energy of a 100-pound missile fired from up to 20,000 feet, armed with six blades which deploy in the last moments before impact.

‘Super weapons’

The Ninja missile is the ultimate attempt – thus far – to accurately target and kill a single person. No explosion, no widespread destruction, and no deaths of bystanders.

But other weapon developments will also affect the way we live and how wars are fought or deterred. Russia has invested heavily in these so-called super-weapons, building on older technologies. They aim to reduce or eliminate technological advantages enjoyed by the United States or Nato.

Russia’s hypersonic missile development aims are highly ambitious. The Avangard missile, for example, won’t need to fly outside the earth’s atmosphere. It will remain within the upper atmosphere instead, giving it the ability to manoeuvre.

Such manoeuvrability will make it harder to detect or intercept. China’s DF-17 hypersonic ballistic missile is similarly intended to evade US missile defences.

The autonomous era

At a smaller scale, robot dogs with mounted machine guns are emerging on the weapons market. The weapon development company Sword International took a Ghost Robotics quadrupedal unmanned ground vehicle – or dog robot – and mounted an assault rifle on it. It was one of three robot dogs on display at a US army trade show.

Turkey, meanwhile, is claiming it has developed four types of autonomous drones, which can identify and kill people, all without input from a human operator, or GPS guidance. According to a UN report from March 2021, such an autonomous weapon system has been used already in Libya against a logistics convoy affiliated with the Khalifa Haftar armed group.

Autonomous weapons that don’t need GPS guidance are particularly significant. In a future war between major powers, the satellites which provide GPS navigation can be expected to be shot down. Any military system or aircraft which relies on GPS signals for navigation or targeting would then be rendered ineffective.

China, Russia, India and the USA have developed weapons to destroy satellites which provide global positioning for car sat-nav systems and civilian aircraft guidance.

The real nightmare scenario is combining these, and many more, weapon systems with artificial intelligence.

New rules of war

Are new laws or treaties needed to limit these futuristic weapons? In short, yes – but they don’t look likely. The US has called for a global agreement to stop anti-satellite missile testing, but there has been no uptake.

The closest to an agreement is the signing of NASA’s Artemis Accords. These are principles to promote the peaceful use of space exploration. But they only apply to “civil space activities conducted by the civil space agencies” of the signatory countries. In other words, the agreement does not extend to military space activities or terrestrial battlefields.

In contrast, the US has withdrawn from the Intermediate-Range Nuclear Forces Treaty. This is part of a long-term pattern of withdrawal from global agreements by US administrations.

Lethal autonomous weapon systems are a special class of emerging weapon system. They incorporate machine learning and other types of AI so that they can make their own decisions and act without direct human input. In 2014 the International Committee of the Red Cross (ICRC) brought experts together to identify issues raised by autonomous weapon systems.

In 2020 the ICRC and the Stockholm International Peace Research Institute went further, bringing together international experts to identify what controls on autonomous weapon systems would be needed.

In 2022, discussions are ongoing between countries the UN first brought together in 2017. This group of governmental experts continues to debate the development and use of lethal autonomous weapon systems. However, there has still been no international agreement on a new law or treaty to limit their use.

New rules for autonomous weapon systems

The campaign group Stop Killer Robots has called throughout this period for an international ban on lethal autonomous weapon systems. Not only has that not happened, there is an undeclared stalemate in the UN’s discussions on autonomous weapons in Geneva.

Australia, Israel, Russia, South Korea and the US have opposed a new treaty or political declaration. Opposing them at the same talks, 125 member states of the Non-Aligned Movement are calling for legally binding restrictions on lethal autonomous weapon systems. But with Russia, China, the US, the UK and France all holding a UN Security Council veto, any one of them can block such a binding law on autonomous weapons.

Outside these international talks and campaigning organisations, independent experts are proposing alternatives. For example, in 2019 the Australia-based US ethicist Deane-Peter Baker brought together the Canberra Group of independent international experts. The group produced a report, Guiding Principles for the Development and Use of Lethal Autonomous Weapon Systems.

These principles don’t solve the political impasse between superpowers. But if autonomous weapons are here to stay, they are an early attempt to understand what new rules will be needed.

When Pandora’s mythical box was opened, untold horrors were unleashed on the world. Emerging weapon systems are all too real. Like Pandora, all we are left with is hope.

By Peter Lee, Professor of Applied Ethics and Director, Security and Risk Research, University of Portsmouth Portsmouth (UK), Aug 6 (The Conversation)
