April 2024: Futures Lab - Cutting edge technologies and research influencing Defence
Mistral AI ran a large language model (LLM) hackathon based on the classic Street Fighter game. Fourteen LLMs were pitted against each other in a Street Fighter 3 emulator. Playing the character Ken, each LLM was prompted with “What’s your next move?” based on observation of the character positions within the game environment. The LLM’s output was then translated into a character move.
ChatGPT 3.5 was the eventual winner, beating even ChatGPT 4, suggesting that speed gives an edge over accuracy in this context. Some commentators have also suggested that the gameplay would not beat a normal human player. Gameplay footage shows characters pausing, suggestive of the LLM’s decision-cycle time. This is also reflected in ChatGPT 4’s poor performance playing Doom. Nevertheless, this represents another application of LLMs making decisions based on their environment and circumstances.
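The loop described above – observe the game state, prompt the model, translate its reply into an input – can be sketched roughly as follows. All names here (`MOVES`, `build_prompt`, `translate`) are illustrative assumptions, not the hackathon’s actual code:

```python
# Hypothetical sketch of the observe -> prompt -> act loop; the move
# vocabulary and function names are illustrative assumptions.
MOVES = {"punch": "LP", "kick": "LK", "block": "B", "jump": "U"}

def build_prompt(own_x, opponent_x):
    """Describe the observed character positions and ask for the next move."""
    return (
        f"You are playing Ken. You are at x={own_x}; the opponent is at "
        f"x={opponent_x}. What's your next move? "
        f"Answer with one of: {', '.join(MOVES)}."
    )

def translate(llm_output):
    """Map the model's free-text reply onto a legal emulator input."""
    for word, button in MOVES.items():
        if word in llm_output.lower():
            return button
    return "B"  # block by default if the reply cannot be parsed
```

Because the emulator runs in real time, every token the model generates adds latency before `translate` can fire – one plausible reason a faster model could outperform a more accurate one.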
Drawing on Instagram culture, Kelin Carolyn Zhang and Ryan Mather have developed the Poetry Camera. A Raspberry Pi with a camera and a small printer is housed in a 3D-printed case. On taking a picture, the camera communicates with OpenAI’s GPT-4 to generate a poem in the user’s preferred style, such as a haiku, sonnet, or free verse. The designers have open-sourced the camera, with build instructions and software available on GitHub.
AI models such as Midjourney and DALL-E turn text into pictures; the Poetry Camera does the reverse. Poetry and operational military activity are unusual bedfellows, but turning pictures into military-specific jargon may have applicability. For example, brief text descriptions of reconnaissance images could dramatically reduce the bandwidth needed to pass information, or guide analysts to focus attention on specific parts of an image.
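The Poetry Camera’s pipeline – capture an image, send it to GPT-4, print the returned poem – can be sketched as below. The real software is on the project’s GitHub; the payload builder here, including the model name and style prompts, is an illustrative assumption rather than the designers’ code:

```python
# Hedged sketch of the capture -> GPT-4 -> poem step of the Poetry Camera.
# Model name, style table and function name are assumptions for illustration.
import base64

STYLES = {"haiku": "a haiku", "sonnet": "a sonnet", "free": "free verse"}

def poem_request(jpeg_bytes, style="haiku"):
    """Build a vision chat-completion payload asking for a poem about a photo."""
    image_url = "data:image/jpeg;base64," + base64.b64encode(jpeg_bytes).decode()
    return {
        "model": "gpt-4o",  # assumed vision-capable model
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": f"Write {STYLES[style]} about this image."},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    }
```

The same pattern – image in, short structured text out – is what a reconnaissance-summary variant would swap in, replacing the poem instruction with a request for a terse, jargon-conformant description.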
The Alan Turing Institute (ATI) has published a paper that considers the potential of AI to help automate UK government transactions. It estimates that 84% of the 143 million complex, repetitive, customer-facing transactions could be automated by AI – around 120 million transactions, or 12% of the roughly one billion total estimated transactions. The MoD is one of the larger government departments (by staff numbers and budget), and may therefore find efficiencies in adopting AI to automate routine tasks.
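The arithmetic behind those figures can be verified with a quick back-of-envelope calculation:

```python
# Back-of-envelope check of the ATI figures quoted above.
total_transactions = 1_000_000_000   # ~1 billion total estimated transactions
complex_repetitive = 143_000_000     # complex, repetitive, customer-facing subset
automatable_share = 0.84             # proportion the ATI estimates AI could automate

automatable = complex_repetitive * automatable_share  # ~120 million transactions
share_of_total = automatable / total_transactions     # ~0.12, i.e. the 12% saving
```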
The Centre for Emerging Technology and Security (also part of the ATI) has published a paper on communicating trust and uncertainty in AI-enriched intelligence. Introduced by both the chair of the Joint Intelligence Committee and the director of GCHQ, the report advocates the use of AI to make sense of intelligence sources, but also the need to consider bias and robustness (as non-AI-enriched assessments would). The application of AI models to intelligence assessments is a natural next step, but AI should be considered another tool to enhance the quality of assessments – complementing more traditional techniques rather than replacing them.
AI training data
Both the UK and US governments have discussed the use of copyrighted works to train AI models. The UK’s Culture, Media and Sport Committee published its review of creator remuneration, noting: “We are particularly disappointed that the Government’s working group on AI and intellectual property has failed to come to an agreement between the creative industries and AI developers on creators’ consent and compensation regarding the use of their works to train AI.” The US Congress has introduced a bill, the “Generative AI Copyright Disclosure Act of 2024”, which proposes that any copyrighted works used to train an AI model must be disclosed.
The use of copyrighted work (and subsequent royalties due to their creators) to train AI models is a hot topic, with lawsuits filed by content creators against generative AI companies. As the advent of digital music distribution disrupted the music industry and associated business models 30 years ago, it seems likely that the use of copyrighted materials to train AI models will also be a disruptive force; though how the dust will settle in the longer term is unclear.
Humanoid robots
Boston Dynamics retired Atlas, its hydraulically powered, humanoid robot. The following day it launched its new fully electric Atlas robot. Details are somewhat scant, but in an interview with IEEE, the CEO claimed greater joint strength than a human, and a focus on commercialisation. With backing from Hyundai, the new Atlas is destined for factory work initially.
Along with other humanoid robots from the likes of Agility, Tesla and Figure, Atlas represents the next iteration of humanoid robots whose form allows them to integrate seamlessly into human-centric spaces and workflows. In time, such developments are likely to further displace human workers from drudgery and dangerous tasks.
Autonomous aircraft
The US Air Force have allocated three F-16 fighter aircraft to their autonomy flying testbed programme. The aircraft will be modified to test autonomous mission and flight capabilities. The US Air Force and DARPA have also demonstrated an X-62A flown by an AI agent in a mock dogfight against a human pilot. Previously, AlphaDogFight (an AI model) has beaten human pilots in virtual environments, by flying the aircraft in ways that would have overstressed a real aircraft. These announcements are a logical next step for the training of AI agents to pilot real aircraft in combat scenarios.
Powering smart contact lenses
A team from the University of Utah has developed a way of powering smart contact lenses. Previously, wireless power transfer was required; their dual-source approach instead creates electricity within the contact lens itself. The first power source uses miniaturised solar cells embedded in the lens. The second uses the movement of blinking eyelids: a reaction between electrolytes in tears and a magnesium anode generates electricity. An integral power management system delivers 150 microwatts of power at 3.3 volts.
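For context, the quoted power budget implies roughly 45 microamps of current available to the on-lens electronics – a simple P = IV calculation, not a figure from the paper:

```python
# What 150 microwatts at 3.3 volts implies for the on-lens current budget.
power_w = 150e-6   # 150 microwatts, as quoted
voltage_v = 3.3    # supply voltage

current_a = power_w / voltage_v  # I = P / V, roughly 45 microamps
```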
This user-friendly way of generating electricity seamlessly within a contact lens may open up new electronic applications hosted on the lens itself, such as biosensors to monitor eye health, or consumer applications such as heart rate and blood sugar monitoring.
Follow Futures Lab on LinkedIn to keep up to date with our latest news.