

Human augmentation: the dawn of human 2.0?

The idea of artificially enhanced cognitive or physical capabilities as a source of advantage is nothing new. But the last few years have seen major progress in the commercial world and within defence, in particular by our adversaries, with China ‘actively exploring’ these techniques. As such, defence and security must begin to prepare for a very different future.

The UK government defines human augmentation (HA) as ‘the application of science and technologies to temporarily or permanently improve human performance’. And in the US, the Space Force Chief Scientist has recently deemed its development ‘imperative’.

We have all heard the much-touted benefits of autonomous and uncrewed systems – but critical responsibilities still need to stay in the hands of an ‘in the loop’ human. This requires a credible human-machine relationship, based upon a reliable interface, which is one of HA’s many potential offerings. And whilst there are many other ways in which a human can be augmented, in this piece we explore some of the more likely forms of HA, what the concept means for defence and security, and the ethical quagmire at its foundations.

A glimpse into the future

HA systems have been part of the conversation in defence and security for decades: DARPA was requesting research into them as far back as 2000. And it’s been a theme in science fiction for even longer. But HA isn’t just about the more ‘invasive’ elements, such as ‘Iron Man’-style exoskeletons or genetically engineered supersoldiers. It can also involve ‘non-invasive’ practices (those that neither attach to the human body nor physically alter it), like the development of drugs that can enhance human biological functions. And the commercial world is making significant strides across the board.

In 2020, Elon Musk’s Neuralink unveiled Gertrude, a pig with a computer chip in her brain, demonstrating the company’s intention to create a working brain-to-machine interface. Elsewhere in California, neurologist Theodore Berger is exploring the possibility of synthetic memory chips, which would allow humans to access memories with greater speed and accuracy. Similarly, a project led by the US Air Force Research Laboratory (AFRL) aims to speed up the learning process so that airmen can acquire knowledge and skills more quickly.

And smaller devices offer even bigger possibilities. With the current public interest in vaccines, the future could see nanobots being used for precise interactions with cells within the body, performing tasks that the human immune system cannot manage on its own. This could also include protecting personnel from chemical or biological attacks. One significant driver of HA research is helping individuals reclaim functions that have been lost due to injury or illness, which would also have wider defence applications.

Such technology could also be applied adjacent to the body. In the civil world, we attach sensors to ‘wearables’, like watches, using the data we gather to enhance our performance. Widespread uptake in defence and security arenas appears inevitable. For example, the British Army has been using wearables to see how recruits respond to physical training, and the US Air Force is exploring wearables and biofeedback as a means to make troops ‘mentally tougher’.

Recognising our differences

It’s hard to cover everything HA could offer defence and security, and militaries across the world are still trying to understand where to focus their R&D. For example, should augmentation target those who are slower, weaker or less capable? Or should it create an elite, building up the strongest and taking them even further?

It’s perhaps ironic that such discussions take place against a backdrop of rising numbers of British troops being deemed overweight, with thousands failing fitness tests. Clearly there is ‘low-hanging fruit’, such as nutrition, sleep and exercise, to be addressed first.

Conflicting ethical standards

Defence and security is unique in that it cannot publicly research an area without stimulating adversaries’ ‘counter’ research. In 2018, He Jiankui, a Chinese biophysicist, created the world’s first gene-edited babies using the tool CRISPR–Cas9. Despite the condemnation of his research, and his later imprisonment, the birth of the twins represents a new technological precedent for HA.

Such techniques are particularly unpalatable to the West. But if adversaries push ahead regardless, and as the commercial and civil worlds continue to progress, there is, perhaps, little choice in the matter. As Florence Parly, France’s Defence Minister, has said: ‘...we must face the facts. Not everyone shares our scruples and we must be prepared for whatever the future holds’.

It will be an arms race, whether we like it or not, which means that measures to counter human augmentation must be considered alongside the uptake of the technologies themselves. It’s time to spend more energy on the actual risks and applications of HA, rather than continuing the speculation that has ultimately led nowhere. The alternative is for the West to find itself at a significant disadvantage when adversaries begin to field HA technologies.

Progress here still relies upon a deep understanding of the data provided by the host of augmentation technologies. Defence and security will want to understand each system’s strengths and weaknesses, as well as exactly what the data is telling us. Explainable AI will be part of the solution here, allowing humans to understand the decisions being made and injecting that crucial element of trust.

The fine line

HA is developing at pace, with a number of technologies already at the human trials stage. There are many open questions around what we want to be able to do with HA. What ethical lines will we absolutely not cross? What about informed consent from augmented military personnel? And how can we identify a ‘dual use problem’, i.e. when scientific research has the potential for harm as well as for good, such as controlling pain? All of this is under active discussion.

Human augmentation has reached a point at which human capability could be significantly enhanced over the next 20 years, bringing higher levels of physical or cognitive performance across the population. Decisions are now being made about what individuals might accept, what we as a society will accept, and how to keep abreast of our adversaries’ developments whilst maintaining ethical boundaries.

With a broad range of options to consider, from genetic engineering to technology adaptations, defence and security now have the opportunity to take human augmentation that step further (literally).
