4 December 2023

AI, War and Transdisciplinary Philosophy

Nayef Al-Rodhan

Human ego and emotionality play a bigger role in war than we often admit. Pride, grief, contempt, hate and shame have all changed the course of history time and time again. As AI and human enhancement continue to evolve, they will be used to hack human ego and emotionality, leading to a step-change in the brutality and illegitimacy of war, writes Nayef Al-Rodhan.

The Prussian military strategist Carl von Clausewitz saw uncertainty and fear as essential ingredients of war. But how does human fallibility, which is at the core of classic theories of war dating back to Sun Tzu, play out in a world where AI-powered military technologies remove human qualities from battle? Will emerging AI tools such as deepfakes and other deceptive technologies deepen the fog of war? Will the extreme brutality enabled by highly destructive military technologies create multi-generational hate, vengeance and deep ethnic and cultural schisms, and hinder reconciliation, reconstruction and coexistence? Are these transformative technological developments changing the very nature of war? These questions have been made ever more pressing by the current Russia-Ukraine and Israeli-Palestinian conflicts, and they are fundamental to the sustainability of human civilisation, here on Earth as well as, increasingly, in Outer Space. To answer them, we need to examine the benefits, dangers and limitations of the new methods of war - and how our human nature shapes, and is shaped by, the way we fight.

Contemporary research in neuroscience has provided valuable insights into human behaviour, with direct consequences for state behaviour, cooperation and conflict. Contrary to previous assumptions about the rationality of human behaviour, neuroscience has shown that emotions play a central role in cognitive functions and in rational decision-making. Studies show that human beings are neither inherently moral nor immoral. They are, rather, amoral, influenced by personal and political circumstances, with a moral compass governed primarily by “perceived emotional self-interest”.

In short, our appetite for primordial power is part of who we are.

Neuroscience shows that human beings are predisposed, above all, towards survival. This means that we are fundamentally egoistic. It is this evolutionary desire to survive and thrive, amid constant competition, mistrust and fear of the other, that inspires the aspiration to dominate others. In short, our appetite for primordial power is part of who we are. Neurochemically speaking, feelings of power are linked to the release of dopamine, amongst other chemicals, in the mesolimbic reward centre of the brain. Dopamine is the same neurochemical responsible for feelings of pleasure and reward, as well as the “highs” of all forms of addiction, including drugs, social media and gambling. That is why power has an addictive effect on the brain, comparable to that of a drug. It leads human beings to do anything to seek it, enhance it – and avoid losing it.

Neuroscience has also debunked the realist presumption that states are driven exclusively by rationality. Neuroscientific studies demonstrate the neuroanatomical and neurochemical links between emotions and decision-making, with profound implications for international relations and the prospects of a peaceful global order. Emotionality infuses unpredictability into human affairs and can be at the root of state and sub-state conflicts. Bertrand Russell noted this in his book ‘Has Man A Future?’, published on the eve of the Cuban Missile Crisis. Claiming that humanity was on the verge of annihilation, Russell described how “pride, arrogance and fear of loss of face have obscured the power of judgment” of Kennedy and Khrushchev, the leaders of the United States and the Soviet Union. Much like individual humans, states are egoistic, survival-oriented and heavily influenced by interests and perceptions.

The emotionality of states has been a determining factor in both inter- and intra-state conflict throughout history. Take, for example, Stalin’s fatal foray into Korea, which the historian Tony Judt attributed to his growing paranoia and suspicions about Western plans. Or Napoleon’s invasion of Russia, which was arguably driven more by pride and hubris than by cold strategic calculation. More recently, the strategically unsound - and illegitimate - invasion and destruction of Iraq and the dismantling of Libya had similar emotional undertones, as does the reignited Israeli-Palestinian conflict, with its deep emotional currents and unprecedented humanitarian crisis.

History teaches us that states will weaponise everything they can in order to dominate others.

These examples demonstrate that the egos, sensibilities and emotional repertoires of policy-makers - pain, pride, grief, contempt, hate and shame - play a more pervasive role in seemingly accountable state conduct and international relations than is often acknowledged. With an eye on the future, they are a timely reminder of how emotional attachment to exploitative hegemony, deceptive manipulation, the arrogance of power and greed can lead to the illegal acquisition of land and resources. As a result, they can spark conflicts and infuse unpredictability and longstanding mistrust into international affairs, in all political systems. This is especially the case where there are few or no mechanisms in place to keep policy-making in check.

As geopolitical tensions heat up, there is a growing danger of emotionally-tinged self-identity being weaponised through strategic culture - the attempt to integrate cultural considerations, historical memory and applied history into the analysis of states’ security policies and international relations. History teaches us that states will weaponise everything they can in order to dominate others. That is why, going forward, we cannot ignore neuroscientific findings about the emotionality, amorality and egoism of human nature and state behaviour when examining new technologies, norms and innovations. Together, these insights will play a crucial role in efforts to end conflict by weeding out double standards in inter-state relations and increasing respect for the sovereign choices and national interests of states. This will improve the chances of achieving equitable and sustainable peace, security and prosperity for all, rooted in trust.

AI will play an increasingly important role in warfare in the coming years. Some argue that AI could make war less lethal and possibly strengthen deterrence: the lives of soldiers could be spared by expanding the role of AI-directed drones in the air force, navy and army. Russia is currently testing autonomous tank-like vehicles, and the U.S. Department of Defense is training AI bots to fly a modified F-16 fighter jet. However, the need for human intervention is likely to decrease, raising questions of ethics and accountable governance. A fundamental question in this regard concerns the attribution of responsibility for transgressions by autonomous or semi-autonomous systems. Attribution is more complex for autonomous weapons than for human beings, since the programmer, the manufacturer and the commander might all be held responsible. Although human beings cannot be trained to respond to all possible scenarios, previous experience helps us react to unpredictable situations. The law of armed conflict is based on two fundamental principles: the principle of distinction, which requires combatants to distinguish between military and civilian objects, and the principle of proportionality in the use of force. Unlike a human being, even a highly sophisticated autonomous weapon would make such judgments solely on the basis of algorithms governed by probabilistic calculations and a predetermined attribution of value. This combination of issues gives rise to a so-called “responsibility gap” for autonomous weapons which, at present, is far from being resolved.

Recent studies also show that AI-driven software could force military commanders to reduce their decision-making window from hours or days to minutes. There is a real danger that decision-makers become over-reliant on AI tools – which operate at much faster speeds than humans – as part of their command-and-control armoury. There is also a real danger that AI technology could equip rogue actors with the brainpower and tools to build dirty bombs or pinpoint nuclear arms sites, especially as much of the relevant data is held by private companies that could be susceptible to hacking and espionage. The current war between Russia and Ukraine and the Israeli-Palestinian conflict also remind us how AI tools, such as deepfakes and other sophisticated technological tricks, are increasingly being used to amplify and bolster propaganda efforts. This is being made easier by the rapidly evolving sophistication of AI generators that can produce persuasive fake images and videos. As a result, we are also seeing a so-called liar’s dividend: a growing proportion of the public dismisses genuine content from the frontlines as fake.

Recent studies also show that AI-driven software could force military commanders to reduce their decision-making window from hours or days to minutes.

Militaries are also turning to human enhancement technologies. Their ultimate goal is to create “super soldiers” that are stronger, more agile and more cost-effective. The search for performance optimisation of soldiers through human enhancement is not entirely new: stimulant drugs have been used by armies for decades. During World War II, Japanese, American and British forces consumed large amounts of amphetamines to boost alertness and physical endurance. In the Vietnam War - later dubbed the first “pharmacological war” because of the high consumption of psychoactive substances by military personnel - the U.S. military supplied soldiers with speed and steroids. The reckless use of pharmaceuticals and stimulants in the Vietnam War resulted in a large number of PTSD cases among veterans - estimates range from 400,000 to 1.5 million.

However, these days human enhancement technologies go even further. They can increase soldiers’ muscle strength and alertness while managing pain and stress levels. The quest to create ‘super soldiers’ raises a host of ethical and philosophical concerns linked to authenticity, accountability, free will and fairness, amongst others. This begs the question: will these new techniques redefine what it means to be human? Will ‘super soldiers’ retain the aspects of their personality that make them human? How will the ability of enhanced soldiers to tolerate pain affect issues such as torture and the Geneva Conventions? What is clear is that these innovations in the military space are bringing humanity to the brink of transhumanism. They are radically different from those of previous eras, as they are much more potent, invasive and potentially irreversible. We are now witnessing the rise of technologies that alter human biology by incorporating technology within the human body. Projects spearheaded by DARPA and others include computerised brain implants and biomedical tools that equip soldiers with increased stress resistance, “accelerated learning” capabilities and improved immunity from injury and the effects of sleep deprivation. These technologies mark a new phase in the mission to create ‘super soldiers’. Recent advances in neural integration raise the real possibility that advanced technology could be plugged directly into the peripheral nervous system, for example via a remote-controlled micro-processing chip implanted beneath the skull. Neuro-stimulation of the brain through Transcranial Direct Current Stimulation (TDCS), using a constant, low current delivered via electrodes on the head, has been found to accelerate learning and improve recall among Air Force pilots.

This begs the question: will these new techniques redefine what it means to be human?

By optimising these technologies, the world’s leading militaries may soon be able to go a step further and pre-programme the reactions, responsiveness and emotionality of their soldiers. Some of the most radical and profound changes to the human condition will take place through such interventions. These developments could give rise to a form of transhumanism that challenges the very notion of the human condition as fixed, rooted and constant. Deeper integration of technology within the body, as well as the use of neuro-technological and neuropharmacological means of enhancing our bodies, could affect how we feel and think - and therefore also how we act on the battlefield. While enhancement may boost cognitive and physical capabilities, it may also diminish deeply human features like compassion and empathy that have been pivotal to us as a species, both for survival and for cooperation. This could have dire consequences for ethical and humanitarian calculations during combat, including the use of torture. It could also have far-reaching implications for diplomacy and statecraft. Indeed, in the not-too-distant future, the existence of robots or sophisticated humanoids with advanced moral competencies could transform security dynamics, civil-military relations and how we regard ourselves as humans.

To navigate this uncertain future, leading thinkers focused on the ethical implications of new types of warfare will need to add transdisciplinary tools to their intellectual armoury. They will need to engage directly with issues that lie on the cusp of AI, synthetic biology, neuroscience and philosophy (an area I have termed Neuro-Techno-Philosophy). Transdisciplinary endeavours such as Neuro-Techno-Philosophy can teach us a lot about human frailty and malleability, both at the individual and group level. By understanding our neurochemical motivations, our neurobehavioural needs, fears and predilections, and the neuropsychological foundations underpinning the behaviour of states, we are better placed to navigate the challenges posed by contemporary geopolitics and global security. These insights could also bolster conflict resolution efforts, which often incorporate behavioural models but, to date, rarely include neuroscientific insights.

On the battlefield, interventions that make soldiers feel less empathy and fear will effectively rewire the human condition and disrupt millennia of evolution. They will also have serious implications for how wars are fought. Given the potential effects of these technologies on emotions as well as physical capabilities, the level of brutality in warfare is likely to increase, severely impeding post-conflict reconciliation and reconstruction efforts. Enhanced weapons, super-soldiers and new biological weapons will fall outside the existing ethical, customary and legal norms of warfare defined by international law and the Geneva Conventions. This will raise important questions for lawyers and policy-makers, not least about responsibility. For example, who will be held accountable if an “enhanced” soldier goes out of control: the soldier, the engineer or the medical teams that enhanced them?

More broadly, questions of law, ethics, international competition and potentially uncontrollable cascading risks will become more prominent as states and societies respond to the challenges posed by new disruptive technologies. This is especially true of self-evolving, runaway AI weapon systems, which are becoming an increasingly tangible possibility. Such systems could potentially rewrite their own source code and slip entirely beyond human control and oversight. Unequal access to new technologies will be reflected in international competition and shifts in the balance of power, with countries that have better integration capabilities possessing an advantage. Military history teaches us the importance of integrating technology: going into World War II, France had the better tank, but the Germans gained the upper hand by successfully integrating their model with radio and air cover. Looking to the future, such asymmetries of capability are likely once again to exacerbate the sense of extreme brutality and illegitimacy in war.
