How Terrorists and Technology Will Shape Future Wars

Liran Antebi, who researches war, discusses the impact of terrorists and technology on current and future battlegrounds.

Ayelett Shani
Liran Antebi. Credit: Gali Eytan

Who: Liran Antebi, 33, research fellow at the Institute for National Security Studies and a doctoral candidate at Tel Aviv University, lives in Tel Aviv. Where: INSS, Monday noon.

On the one hand, you study and research war; on the other, you do reserve duty in the air force in an operational capacity. You also took part in Operation Protective Edge in Gaza last summer. How do you combine all these things?

It’s emotionally difficult – another round of fighting, yet another war after having dealt with the same problem only two years earlier.

So what’s changed?

We’ve made improvements, but it’s not enough. The other side is making very rapid progress. They share information and learn from what takes place in other arenas, since everything is exposed and on the Internet – one can view a war in real time. It’s frustrating but it also gives me strength.

My research began after the Second Lebanon War [2006] and Operation Cast Lead [2008], stemming from a possibly naive wish to find a better way to protect the people I love, so that they never have to be on a battlefield, so that robots are there to solve our problems for us.

What is common to robots and your field of study – political science?

I am not a robotics person and have no technological background, but I study and write about the development of military systems and unmanned aerial vehicles. In my doctoral thesis, for example, I studied the impact of robots on current and future battlegrounds. I created a model that describes the paradox of power – namely, the difficulties democracies have in combating terrorism and guerrilla warfare. I studied whether using technology to replace soldiers on the battlefield could change things in that regard.

The paradox of power, as I understand it, is that the more military power a state possesses, the harder it is for it to combat non-military forces – forces that are supposedly weaker.

This is a complex issue, since the difficulties don’t stem from the actual balance of power between the sides. In nature, we expect the stronger side to vanquish the weaker one swiftly and with ease, but here things are different: The powerful side encounters difficulties in defeating someone with far less power, since the weaker side resorts to modes of operation that compensate for its limitations. These methods are ones that a liberal democracy must eschew. This is plainly obvious in Israel’s struggle with Hamas.

Hamas can’t contend with Israel as one military force confronting another on the battlefield, since the organization is rarely present on an actual battlefield in the classical sense: Its soldiers don’t wear uniforms and they place themselves among civilians. In these circumstances, the advantages a democracy has over an enemy that resorts to subterfuge lie in economic and technological superiority.

So instead of exploiting its power on the battlefield, the democracy employs it in other areas. Can you illustrate how this advantage is expressed in Israel’s war with Hamas?

Yes. In the 1990s, suicide bombers appeared on the scene. Israel used its technological superiority to locate, warn against and eventually stop these attacks before they took place. The next phase was rocket fire. This also led to a solution.

The Iron Dome system?

Correct. As it stands now, Hamas continues firing rockets, but without making significant gains. One can see them searching for new methods, probably via underground tunnels. Estimates are that this will also happen in the north, in the next round of fighting in Lebanon. Hezbollah is also preparing the next thing.

Low- and high-tech

What is the next thing?

My assessment is that they will use technologies that used to be available only to democracies but are now easily obtainable by anyone, such as unmanned aerial vehicles. This isn’t limited to what they get from Iran – the Ababil drones, which are nothing more than flying barrels of explosives – but also includes far more sophisticated yet cheap equipment. We see this happening with Islamic State, which is making widespread use of drones that cost only $1,400 each. Anyone can purchase those on the Internet without being monitored.

What do terrorist organizations use these drones for?

It’s a combination of low-tech and high-tech. For example, they build a tunnel, move these apparatuses through it and operate them on the other side. They can also map a military base from above, and then attack it and execute people by beheading them. We see this combination at work with most terrorist organizations around the world.

[Former U.S. Defense Secretary] Donald Rumsfeld wrote a highly publicized article in a journal related to defense studies, in which he argued that the clever thing is not to strive for the next revolution but to know how to combine low-tech and high-tech. He describes a battlefield scene in which a knight wages war along with the most advanced robot there is. The ones utilizing such combinations at present are the terrorist organizations.

Technology gives them an advantage, since they are less cumbersome and bureaucratic than regular armies.

That’s true, and it demands that we think creatively and exploit our technological superiority so as to reduce the impact of the power paradox. Drones allow us to improve our information-gathering, to better control an area, and to use force while distinguishing between civilians and combatants. They allow us to overcome some of the challenges that result from the power paradox. The task is not just to develop the most advanced technology, but to use it wisely. We should think the same way they do in sci-fi movies.

What do you mean by that?

We already have the means to look around a corner or through a wall, aided by hand-sized robots which can be thrown into a room and transmit pictures from inside, often without the enemy even noticing their presence. This can be taken a few more steps forward – for example, by using a robot that looks like a fly.

Unmanned aircraft can be a threat off the battlefield as well – like the one that landed on the White House lawn in January.

A year and a half ago, we saw German Chancellor Angela Merkel giving a speech, during which a tiny aircraft costing a few hundred dollars landed right beside her. Her bodyguards pounced on it and smashed it. Over the last year there have been nightly flights of unmanned vehicles over Paris, and no one knows who is operating them. Drones have also been seen flying over nuclear reactors, and no one knows what they are or why they are there. These could be terrorist organizations collecting information about strategic sites.

There is no way of knowing.

Yes, this is a problem against which we can try to protect ourselves through regulation and monitoring. Technology has leaped forward, but the law and regulatory practices have lagged behind. We face a huge challenge here. How do we stop the drone operators? How do we stop drones from being used to commit crimes? Take, for example, how things are smuggled into prisons. We’ve already seen several such drones crash while carrying drugs or weapons. What if someone with malicious intent puts a toxic chemical, such as an over-the-counter drain-opener, on one of these devices and then disperses it over people from the air? That would constitute a terror attack, but are we prepared for it?

In Israel, the problem is even more acute, since the airspace is very crowded and under constant threat. This poses great challenges. We have no response to such scenarios at present. There is no system the state can buy that will assure it that the flight of unmanned aerial vehicles is under control. We haven’t prepared for that.

What about attack drones?

This technology used to be in the hands of only a few particularly advanced countries. Solid research shows that within 10 years, any country will be able to purchase a drone with attack capabilities.

This changes all the rules of engagement in the international arena.

Correct. What does a breach of sovereignty mean vis-à-vis such vehicles? The Americans, for example, constantly violate other countries’ sovereignty with such drones, in the name of the war on terror – in Yemen, for instance.

This works against liberal democracies, since they are signatories to treaties and have international commitments. But terrorist organizations or totalitarian states don’t care about the United Nations. They can purchase 1,000 such drones and attack at will.

I’d like to believe that the international system is wiser, and will manage to limit this phenomenon. The main challenge in this regard right now is China, which has set a goal of building a fleet of such drones. There have been rumors that China is about to sell some unmanned aerial vehicles to Jordan, a customer to which no other country has been willing to sell until now. We wonder who else the Chinese will sell to just so they can beat the Americans.

Swarms of systems

Let’s talk a bit about the way the rules of war that were applicable to the 20th century have changed, about the transition from remotely controlled devices to ones that are autonomous, not requiring an operator.

Two years ago, we conducted a large study at our institute, related to technological forecasting. It was based on surveys carried out by experts around the world. The conclusion was that within 20 years, autonomous systems will be capable of planning and carrying out tactical military missions that are currently performed entirely by humans. These systems will be capable of acting in concert, in what is called swarms – rapidly exchanging precise information that will enable them to plan and act accordingly.

Group thinking in artificial intelligence?

Yes. These are things we learn from nature and apply to robotics. The battlefield will undergo a total transformation, since humans will be involved only at the strategic level, if they so desire. Wars will still involve trying to exact a price from your enemy in blood, money and infrastructure, using any means until he yields to your wishes, or vice versa. What will change is the way this is achieved. Autonomous equipment will change many things in the actual fighting. The challenge with this is to maintain accountability – namely, to decide who will have to account for operating such equipment.
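
To make the swarm idea concrete: the nature-inspired coordination Antebi describes is often illustrated with Craig Reynolds’ classic “boids” model, in which each agent follows three purely local rules – cohesion, alignment and separation – and coordinated group motion emerges without any central controller. The minimal Python sketch below is a generic textbook illustration of that principle, not a model of any system discussed in this interview; all constants are arbitrary.

```python
# Minimal "boids" flocking sketch (after Craig Reynolds, 1986). Each agent
# reacts only to nearby neighbors, yet the group moves as a coordinated
# swarm with no central controller. Purely illustrative.
import math
import random

class Boid:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(boids, radius=15.0, max_speed=2.0):
    for b in boids:
        # Local perception: only neighbors within `radius` are visible.
        near = [o for o in boids
                if o is not b and math.hypot(o.x - b.x, o.y - b.y) < radius]
        if near:
            # Cohesion: drift toward the neighbors' center of mass.
            cx = sum(o.x for o in near) / len(near)
            cy = sum(o.y for o in near) / len(near)
            b.vx += (cx - b.x) * 0.01
            b.vy += (cy - b.y) * 0.01
            # Alignment: nudge velocity toward the neighbors' average heading.
            b.vx += (sum(o.vx for o in near) / len(near) - b.vx) * 0.05
            b.vy += (sum(o.vy for o in near) / len(near) - b.vy) * 0.05
            # Separation: push away from neighbors that are too close.
            for o in near:
                d = math.hypot(o.x - b.x, o.y - b.y)
                if 0 < d < 5.0:
                    b.vx -= (o.x - b.x) / d
                    b.vy -= (o.y - b.y) / d
        # Clamp speed, then move.
        speed = math.hypot(b.vx, b.vy)
        if speed > max_speed:
            b.vx, b.vy = b.vx / speed * max_speed, b.vy / speed * max_speed
        b.x += b.vx
        b.y += b.vy

flock = [Boid() for _ in range(30)]
for _ in range(100):
    step(flock)
```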

Yes, I suppose it will be impossible to indict a programmer who developed a system five years earlier, only because one day his system bombed Kfar Sava.

The system he develops may be one that keeps “learning.” In his definitions, he prohibited it from bombing Kfar Sava, but something may have happened there in the meantime, making the system change its ‘mind’ and deviate from the basic program that was initially installed in it. This inherent challenge – autonomy and artificial intelligence that keep developing in an unsupervised manner – worries even people who can’t be accused of technophobia, such as inventor and entrepreneur Elon Musk, Stephen Hawking and Bill Gates.
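
A toy way to see the distinction at stake here: a restriction baked into a learned component can be eroded by continued learning, while a hard veto kept outside the learned component holds regardless of what the system has ‘learned.’ The sketch below is entirely hypothetical – the names, scores and update rule are invented for illustration and describe no real system.

```python
# Hypothetical illustration only: a policy that keeps learning can drift
# away from its initial restrictions, while an external, hard-coded veto
# layer cannot be overridden by learning.
PROHIBITED = {"Kfar Sava"}  # the programmer's original restriction

class LearnedPolicy:
    """A trivially simple online learner: target scores drift with data."""
    def __init__(self):
        self.scores = {"Target A": 0.2, "Kfar Sava": 0.0}

    def update(self, target, observed_value):
        # Exponential moving average: repeated observations can raise the
        # score of any option, including one the programmer ranked at zero.
        old = self.scores.get(target, 0.0)
        self.scores[target] = 0.9 * old + 0.1 * observed_value

    def choose(self):
        return max(self.scores, key=self.scores.get)

def act(policy):
    choice = policy.choose()
    # The veto sits outside the learned component, so it holds no matter
    # how the policy's internal preferences have drifted.
    if choice in PROHIBITED:
        return None  # refuse and escalate to a human operator
    return choice

policy = LearnedPolicy()
for _ in range(50):
    policy.update("Kfar Sava", 1.0)   # new data shifts the learned scores

print(policy.choose())  # the learned layer now prefers the prohibited option
print(act(policy))      # the external veto still blocks it -> None
```

The accountability question Antebi raises is, in effect, about who answers for the system when such a veto layer is absent, incomplete, or itself part of what the system updates.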

When we presented our work on autonomous military systems and recommended some pertinent policies, senior military officials told us we were living a fantasy. For me it was like the cavalry refusing to acknowledge the appearance of tanks and machine guns on the battlefield. When a senior fighter pilot tells me that an unmanned aerial vehicle is irrelevant, since it will never be able to conduct dogfights like a human pilot can – I understand that this is something personal, emotional, an unwillingness to become irrelevant. This is really an important discussion, and the state won’t engage in it.

Assuming this discussion is eventually held, can these technologies be reined in at all?

Unfortunately, the answer is not clear. In terms of their overall impact, the debate over these technologies resembles debates on the environment: In order to limit them, you need big and sweeping measures, something like the Kyoto accords. On the other hand, it’s very difficult and probably impossible to put a lid on these technologies, since their economic potential is vast, and I can’t imagine the leader of a country with a technological orientation signing an accord that will prevent the next leap forward for humanity.

There already is artificial intelligence that learns and develops. I hope that those who apply it know in which direction it’s developing. But can they be sure they know? We’re a long way from fully understanding this challenge. One could perhaps draw a parallel to the way in which nuclear weapons were perceived a few decades ago: To grasp what we were dealing with, it was necessary to bomb two cities in Japan.

I hope we don’t reach a similar situation with artificial intelligence. In addition to limiting an artificial intelligence that can learn and change, we should remember that these are computer systems that will increasingly surround us in the future. It will become easier to hurt people. The “bad guys” will be capable not only of knocking out power stations but also, for example, of shutting off the air supply for thousands of people. The UN wants to limit autonomous weapons systems because they carry weapons. But what about limiting unarmed systems? People normally work in offices or in their gardens, and only when war breaks out do they resort to weapons. Can we be certain that an autonomous robotic system located in an office or a factory won’t receive a command one day and pick up a weapon?

Will weapons shape future wars?

Not the war itself, only its environment.

Including the actual fighting?

The phenomenon of war is older than that of states.

Yes, but wars in the 21st century can bypass the state and grant power to non-state actors.

We don’t know what the next big war will look like, if it happens at all. Perhaps technology will bring us to a place in which it is not worthwhile. Einstein summed it up in one sentence that is still relevant today: “I don’t know what the third world war will look like, but the fourth one will be fought with sticks and stones.”
