The last few days have been worrying for world safety, with the US taking military action against Syria, an ally of Russia. Russia has responded to the use of Tomahawk missiles on a Syrian airfield by saying ‘What America waged in an aggression on Syria is a crossing of red lines. From now on we will respond with force to any aggressor or any breach of red lines from whoever it is and America knows our ability to respond well.’ Essentially, ‘if you do that again we’ll do it right back.’
This is a terrifying prospect, not only because in under 100 days as President, Donald Trump has essentially reopened Cold War wounds, but also because between them the US and Russia have two of the most dangerous and advanced militaries in the world. The size of an army and the kinds of weapons used in modern warfare are considerably different from the Cold War period, though, and a big part of this difference is the effective use of data.
It may sound odd to think that algorithms and numbers could be at the front line of military power, but the reality is that without them many of the most advanced and important weapons of the past 20 years would not exist. High-precision weapons like the Tomahawk missile work so effectively because of how they interpret data; otherwise, there would be little difference from the ordnance used 40 years ago. So how exactly is data impacting warfare?
The use of computers to essentially attack other computers has traditionally been thought of as something relatively benign, but in an increasingly digital world, and especially in a modern military, it is as effective as any gun.
Looking at the recent American use of Tomahawk missiles in Syria, some analysts believe that the reason the strike could take place without the Russian missile defense system stopping the missiles is that the system was hacked beforehand. Given that Russia was warned ahead of time that the strike was coming, it would not be surprising if they had simply turned it off themselves. Either way, if one side can hack the other’s defenses, it is metaphorically opening the gate through which the army can march.
There have also been numerous reports of military equipment being hacked, such as the reported Israeli hacking of a Syrian radar system in 2007, which apparently managed to shut it down completely. Similar reports have emerged of a German missile battery being hacked in Turkey, and the RAF has even claimed that Russia hacked its planes during recent flights over Syria, forcing missions to be canceled.
The ability to hack these devices, whether to steal data, change mission parameters, or disable weapons entirely, has the potential to be incredibly valuable to any opposing force. After all, it is easy to win a war if you can turn your enemies’ own guns against them.
We briefly discussed how advanced missile systems rely on data for their targeting systems, but in modern warfare a wide variety of weapons utilize data to perform effectively.
One of the simpler examples is the Armatix iP1, a pistol designed to fire only if a fingerprint-verified watch is within 25cm of the gun. This means that guns cannot be turned against their owners in a combat situation, and cannot be stolen and used elsewhere. TrackingPoint is another company utilizing complex data technologies and sensors to create more effective weapons, with the ability to ‘tag’ a target, automatically zero the gun, identify the type of target being aimed at, and even allow users to aim using a HUD. Similarly, the XM25 is a grenade launcher that uses data to calculate distances and trajectories automatically, making it possible for soldiers to hit enemies hiding behind cover. However, there are currently manufacturing issues, as the producer of some key components, Heckler & Koch, is unsure whether the weapon would be legal to use in war, given that it is illegal to fire a grenade directly at an enemy.
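To make the XM25 idea concrete, here is a minimal, purely illustrative sketch of an airburst fuse calculation: the soldier lases the distance to the cover, the system adds a small offset so the round detonates just past it, and the fuse delay is set from the round’s muzzle velocity. All the numbers and names here are assumptions, and drag and drop are ignored; a real fire-control computer models both.

```python
# Hypothetical airburst fuse calculation, flat-fire and drag-free for
# simplicity. MUZZLE_VELOCITY_MPS and BURST_OFFSET_M are invented values.

MUZZLE_VELOCITY_MPS = 210.0   # assumed muzzle velocity, metres per second
BURST_OFFSET_M = 1.0          # detonate this far beyond the lased cover

def airburst_fuse_time(lased_range_m: float) -> float:
    """Return the fuse delay (seconds) so the round bursts just past cover."""
    if lased_range_m <= 0:
        raise ValueError("range must be positive")
    burst_range = lased_range_m + BURST_OFFSET_M
    return burst_range / MUZZLE_VELOCITY_MPS

# A target lased behind cover at 300 m:
print(airburst_fuse_time(300.0))
```

The point is not the arithmetic, which is trivial, but that the weapon turns a laser range reading into a detonation time automatically, removing the guesswork that made indirect fire against covered targets so unreliable.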
Similarly, missile defense systems are totally reliant on huge amounts of data to function, with the ability to hit missiles flying at over 500mph. Everything from trajectory and wind to height and speed needs to be calculated in milliseconds, which requires enormous amounts of data and computing power.
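A toy illustration of the core geometry involved: given an incoming missile’s position and velocity from radar tracks, and an interceptor of known speed, when can the two meet? This reduces to a quadratic in time. All the numbers below are invented; a real system also folds in wind, drag, altitude and sensor noise, and re-solves this many times per second.

```python
import math

def intercept_time(px, py, vx, vy, interceptor_speed):
    """Earliest t > 0 at which |p + v*t| == interceptor_speed * t, i.e. an
    interceptor launched now from the origin can meet the target.
    Returns None if no intercept is possible."""
    # |p + v*t|^2 = (s*t)^2 expands to a*t^2 + b*t + c = 0:
    a = vx * vx + vy * vy - interceptor_speed ** 2
    b = 2 * (px * vx + py * vy)
    c = px * px + py * py
    if abs(a) < 1e-12:                      # speeds match: linear case
        return -c / b if b < 0 else None
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                          # interceptor can never catch up
    roots = [(-b - math.sqrt(disc)) / (2 * a),
             (-b + math.sqrt(disc)) / (2 * a)]
    valid = [t for t in roots if t > 0]
    return min(valid) if valid else None

# Incoming missile 10 km out and 2 km off-axis, closing at 250 m/s,
# against a 600 m/s interceptor:
t = intercept_time(10_000.0, 2_000.0, -250.0, 0.0, 600.0)
print(round(t, 2))
```

Even this stripped-down version has to be solved afresh every time the radar updates the track, which is why the article’s point about milliseconds and computing power matters.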
AI and machine learning are key in modern warfare, with military equipment like unmanned aircraft almost impossible without it.
However, it has other significant uses, although there are clearly ethical issues surrounding its use in the field. Nor is this something the military is unaware of, with the Defense Advanced Research Projects Agency (DARPA) reportedly one of the biggest funders of AI projects in the world.
For instance, AI has been used to help plan missions for a long time, with operations as far back as Desert Storm planned with the help of the technology. It is also key to improving simulations for training military personnel, creating more realism and incorporating tactics learned from previous enemy actions. The US army is also utilizing fairly standard drones, the kind you could pick up for a few hundred dollars on Amazon, retrofitted with advanced AI systems to help identify targets. This lets soldiers know where enemies may be before they arrive at a location, reducing the chances of loss of life. All of this is done completely autonomously: the drone can simply be released and its findings fed back to soldiers, rather than requiring any kind of manual control.
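The autonomous loop described above can be sketched in a few lines: fly a fixed patrol route, run a detector on the imagery at each point, and stream any confident sightings back with no manual control. Everything here is hypothetical; in particular, `detect_targets` is a stand-in for whatever onboard vision model a real retrofitted drone would carry.

```python
from dataclasses import dataclass

@dataclass
class Sighting:
    waypoint: tuple      # (x, y) grid position where the frame was taken
    label: str           # what the detector believes it saw
    confidence: float

def detect_targets(frame):
    """Hypothetical stand-in for an onboard vision model: returns
    (label, confidence) pairs for whatever it recognises in the frame."""
    return [(obj, 0.9) for obj in frame.get("objects", [])]

def patrol(route, camera, min_confidence=0.8):
    """Fly each waypoint, analyse the frame there, report strong detections."""
    sightings = []
    for waypoint in route:
        frame = camera(waypoint)             # capture imagery at this point
        for label, conf in detect_targets(frame):
            if conf >= min_confidence:
                sightings.append(Sighting(waypoint, label, conf))
    return sightings                         # relayed to soldiers en route

# Simulated flight: the camera "sees" a vehicle at the second waypoint only.
fake_camera = lambda wp: {"objects": ["vehicle"]} if wp == (1, 0) else {}
print(patrol([(0, 0), (1, 0), (2, 0)], fake_camera))
```

The design point is that the human is out of the control loop entirely: soldiers consume the sightings list, they do not pilot the aircraft, which is exactly what makes the ethical questions in the next paragraph pressing.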
However, despite the ethical questions surrounding its use in the field, there are some recent examples, such as the autonomous Ford F-350 trucks deployed by the Israeli army along the Gaza border. This raises some clear issues, and the UN has been debating the question of ‘killer robots’ for several years, with many believing it is only a matter of time until they are banned from actively taking part in warfare. The problem with using AI in this way is simply that humans make decisions about danger drawing on thousands of years of evolution and instinct, something that AI will always struggle to replicate.
Despite this, the Pentagon has budgeted $18 billion over the next three years to help develop ‘autonomous weapons’, with the argument summed up by the New York Times in a discussion with a military leader:
‘China and Russia are developing battle networks that are as good as our own. They can see as far as ours can see; they can throw guided munitions as far as we can,’ said Robert O. Work, the deputy defense secretary, who has been a driving force for the development of autonomous weapons. ‘What we want to do is just make sure that we would be able to win as quickly as we have been able to do in the past.’
There is certainly a slippery slope for AI in this kind of use, but perhaps the way forward needs to be focused on defense and preparation rather than offense and assault.