Britain Aiming To Create Golem Soldiers – Leonid Savin

Fairly recently, the UK Ministry of Defence, together with the German Federal Ministry of Defence, published a curious document. Entitled “Human Augmentation – The Dawn of a New Paradigm”, it focuses on the possibilities of technologically enhancing human abilities to increase soldiers’ combat functions. This is not the first time the UK military has developed such a concept: before it came Joint Concept Note (JCN) 1/18, Human-Machine Teaming, and JCN 2/17, Future of Command and Control. The document in question was prepared as part of the UK Ministry of Defence’s technology programme Global Strategic Trends: The Future Starts Today and Future Operating Environment 2035, which was launched in 2018.

The US has also been working on this issue for a long time. The idea of enhancing human capabilities for military purposes was discussed in a special study prepared for the US Air Force back in 1962. The Pentagon’s DARPA has been implementing such programmes for years, and it has become the norm for the US military, as well as the corporations and scientists that serve it.

The discussion usually covers three related types of augmentation (enhancement): physical, cognitive, and biological.

Physical augmentation covers prosthetic and assistive devices, such as exoskeletons, and sensors that add sensory functions. Next is cognitive augmentation, which could include invasive brain-computer interfaces and neurostimulation devices that directly cause changes in the brain (using electrical pulses, magnets, and ultrasound). Then, finally, there is biological augmentation, which covers gene editing, pharmaceutical drugs, and new types of vaccines.

People, the document argues, should be thought of as “platforms” in the same way as vehicles, aircraft and ships are, and these “human platforms” have three elements that must be developed: physical, psychological, and social.

The joint British and German paper defines human augmentation as “the application of science and technologies to temporarily or permanently improve human performance.” A distinction is then made between human optimisation, which can “improve human performance up to the limit of biological potential without adding new capabilities”, and human enhancement, which can take people “beyond the limit of biological potential”. Noting that night vision goggles and binoculars should be technically included in the definition of human augmentation, the paper states that it will focus on “the implications of novel science and technology that are more closely integrated with the human body.”

Human Augmentation – The Dawn of a New Paradigm

“We want ‘war fighters’ – whether they be cyber specialists, drone pilots or infantry soldiers – to be stronger, faster, more intelligent, more resilient and more mobile to overcome the environment and the adversary. … As technology has become more sophisticated our thinking has become more focused on the machine rather than the person, but this needs to change if we are going to be effective in the future”, the paper says.

Although it states that “[a]dvances in artificial intelligence, robotics and autonomy mean that human processing power, speed of action and endurance are being rapidly outpaced by machines”, it recognises that machines “have weaknesses of their own.” It is assumed that people have “an advantage in the areas of creativity and judgment,” but the paper also argues that human enhancement is necessary in order to take greater advantage of the advances in these areas.

“The winners of future wars will not be those with the most advanced technology, but those who can most effectively integrate the capabilities of people and machines. … Human augmentation represents the missing part of the puzzle”, say the authors in justification of their concept.

After outlining the basics of human optimisation, such as sleep, nutrition and dietary supplements, the paper moves on to the concept of “high-end augmentation”. It describes and discusses four “core human augmentation technologies” that are crucial to the study of this area: genetic engineering, bioinformatics, brain interfaces, and pharmaceuticals. But how exactly will all this be used for military purposes?

The paper argues that the deployment of force is being increasingly challenged by the proliferation of precision, long-range weapons; therefore, the solution is to make greater use of unmanned systems in conjunction with lighter, more mobile, and more versatile ground forces. Through brain interfaces, personnel will be able to “increase the combat power they can bring to bear by networking them with autonomous and unmanned systems.”

The growing use of computers and artificial intelligence in warfare also means that “[t]he cognitive load on personnel is likely to increase, particularly for those involved in command and control.” It suggests that “bioinformatics” – the study and analysis of large volumes of biological data – can help during the recruitment phase by “identifying commanders and staff with the right aptitude for command and control roles.”

The paper states: “Brain interfaces, pharmaceuticals and gene therapy could all play a significant part in optimising and enhancing command and control proficiency. In the short term, non-invasive brain interfaces could improve performance by being used to monitor cognitive load, develop better processes and improve training. … In the longer term, brain interfaces could network brains within a headquarters providing a completely shared operating picture, improving the quality and speed of decision-making.”

But the most important section – on the ethics of such applications – is superficial and fairly short.

It states that it will not address the broader ethical issues of human augmentation because “they rightfully continue to be the subject of wider debate”. Instead, it offers only clear arguments in favour of military use. In particular, it says that “[d]efence […] cannot wait for ethics to change before engaging with human augmentation, we must be in the conversation from the outset to inform the debate and understand how ethical views are evolving. … [W]e cannot assume human augmentation will be automatically effective or accepted in its intended use, no matter how beneficial its effects may be. Human augmentation may be resisted by elements of society that do not trust the effectiveness and motive of the augmentation.”

Golem soldiers

Indeed, there are many people in the UK, Germany, and other countries who will undoubtedly oppose such “augmentation”. But here, the British follow their own classic logic, pointing out that military developments in this area should not wait for public consent or ethical debates, but should be “based on the national interests in terms of prosperity, safety and security.”

“The imperative to use human augmentation may ultimately not be dictated by any explicit ethical argument, but by national interest. Countries may need to develop human augmentation or risk surrendering influence, prosperity and security to those who do,” the paper says.

What’s more, it argues that “[r]elationships with industry and academia will be key to understanding how emerging human augmentation technologies could be repurposed or developed for Defence.” The authors suggest a re-examination of the relationship between the military and the government departments responsible for health and social care, and a move towards “a more sophisticated relationship between the public and private sector.” In other words, an alliance is justified between techno-corporations, scientists, the government, and the military, while the general public remains on the sidelines, as if these issues do not concern it.

Anticipating people’s outrage, the authors write: “Successfully exploiting human augmentation will require Defence, and society, to face up to uncomfortable ethical and legal dilemmas. So far, Defence organisations in liberal democracies have adopted a ‘wait and see’ approach, choosing to let ethical debate and technical developments play out. This passive stance will cede momentum to our adversaries and cause Defence to miss opportunities to improve the well-being and effectiveness of our Armed Forces.”

In a review of the paper, Chris Cole writes: “The argument, as this report makes, that we are weak and ineffective in the face of sophisticated and deadly enemies is far from a new one. It has been used for centuries to develop and sell tools to increase our lethality and reach in order to project deadly force around the globe. But there is a qualitative difference between equipping a soldier with night-vision goggles or a high-powered rifle and implanting a computer interface in the brain of a drone pilot in order to increase data processing … That is before we get to the idea of somatic gene engineering to reduce pain thresholds or increase cognition.”

If we look below the surface, humanity itself is called into question, because this “augmentation” is carried out under the guise of eliminating certain flaws so that people can become more lethal and better carry out organised violence. Isn’t this a strange reason for “augmenting” human capabilities? Not to mention traditional religions, which see such “augmentation” only as a way to increase the evil in the world.

Leonid Savin is a geopolitical analyst, chief editor of Geopolitica.ru, founder and chief editor of the Journal of Eurasian Affairs, and head of the administration of the International Eurasian Movement.
