Operated by, in support of, or teaming with formations, unmanned machines undertake missions considered too difficult, dangerous or simply dull. Such missions are relevant to drones, unmanned ground vehicles, and unmanned surface or underwater vessels alike.
Unmanned systems are not really unmanned but perform as remotely operated platforms. They often do so under the control of human controllers, with certain parts of the mission performed automatically, enabling the operators to rest or focus on more important tasks. Typically, the ratio is one (or more) human controller per unmanned platform, with many more people employed to sustain and maintain it over long missions.
Unmanned and autonomous machines are designed to be ‘smarter’, more agile and self-aware; they can plan a path, identify and avoid obstacles, perform certain manoeuvres, and execute specific mission segments independently when human control is unavailable or not required. Since many tasks are delegated to the machine, a single operator can control several such semi-autonomous platforms from a single control unit, enabling more efficient and economical use of resources.
The highest level of autonomy is represented by unmanned and autonomous platforms that can perform missions as part of combined teams consisting of manned and unmanned units. Such teams may be balanced or weighted towards either the manned or the unmanned side. For example, an unmanned scout team supporting a squad or platoon would require a few drones or UGVs, each operated by a system specialist. Future combat vehicles could be operated with one or more ‘robotic wingmen’, much as fighter pilots may operate a combined formation of manned and unmanned wingmen. A different concept may involve a swarm of dozens, or even hundreds, of mini-drones deployed against an enemy concentration, with the entire swarm controlled by a few operators, a single supervisor, or no supervisor at all, preloaded with mission details, target allocation and rules of engagement to guide the mission. This last scenario is the closest to the science-fiction doomsday scenario of the Terminator film series.
Mission Autonomy and MUM-T
Remote control of unmanned systems has been with us for decades. In recent years, lightweight portable electronics have introduced ever smaller yet more powerful and sophisticated systems, both on the platform side and at the controller. With these advancements, platforms have become more autonomous and capable of assuming more tasks previously done by the human operator. Essential mission phases such as automatic take-off and landing, self-test and mission planning are common today; obstacle detection and avoidance are also becoming a reality, with combined real-time sensing and processing on board the platform, as are more specific mission capabilities such as target revisits, automatic target recognition, tracking and geolocation. Although the drone or UGV can perform complete missions by itself, the information it gathers along the way requires significant analysis that currently demands human authority. With artificial intelligence and machine learning implemented at the control centre, and later on the platform, the human mission operators become mission analysts, taking a supervisory role to approve or deny the machine’s decision-support recommendations.
Where unmanned systems are semi-autonomous and can perform specific tasks on specific orders, they begin to behave like soldiers, ready to join the ‘Manned-Unmanned Team’ (MUM-T).
A trending military capability since the mid-2000s, MUM-T has become an important concept in modern operations, in which combat elements employ unmanned systems as part of their operations. While MUM-T is not tied to specific platforms, the most common applications are air-air, where aircrews employ unmanned aerial systems, and ground-air, where ground elements do so; ground-ground employment of unmanned ground vehicles is being tested in operational experiments.
Operating a UAS from a fast-moving platform, such as a helicopter or aircraft, presents several challenges, particularly in mounting the directional data link on the helicopter.
MUM-T Levels of Interoperability (LOI)
The employment of MUM-T is defined in five LOIs:
Level 1 – Receipt of information from the unmanned platform via its ground control element;
Level 2 – Direct receipt of information from the UAS by the manned aircraft (via remote terminal);
Level 3 – Manned platform directly controls the payload of the UAS;
Level 4 – Manned platform controls the airborne platform & payload, except recovery;
Level 5 – All functions supported in Level 4, plus launch and recovery of the UAS;
Level-X – Refers to the evolving capability of a single manned platform to control multiple UAS.
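As an illustrative sketch only (not drawn from any actual MUM-T implementation, and with hypothetical names throughout), the five LOIs can be modelled as an ordered enumeration in which each level implies the capabilities of the levels below it:

```python
from enum import IntEnum

class LOI(IntEnum):
    """MUM-T Levels of Interoperability, ordered so that a higher
    level implies broader control of the unmanned system."""
    INDIRECT_RECEIPT = 1   # data via the UAS ground control element
    DIRECT_RECEIPT = 2     # data direct from the UAS (remote terminal)
    PAYLOAD_CONTROL = 3    # manned platform controls the payload
    FLIGHT_CONTROL = 4     # controls platform and payload, except recovery
    FULL_CONTROL = 5       # adds launch and recovery

def allowed(operator_loi: LOI, required: LOI) -> bool:
    """An operator certified at a given LOI may exercise any
    function requiring that level or lower."""
    return operator_loi >= required

# A Level-4 operator may steer the payload but not launch or recover:
print(allowed(LOI.FLIGHT_CONTROL, LOI.PAYLOAD_CONTROL))  # True
print(allowed(LOI.FLIGHT_CONTROL, LOI.FULL_CONTROL))     # False
```

The ordering captures the article’s point that the levels are cumulative: a Level-5-capable crew can do everything a Level-2 crew can, but not vice versa.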
The first operational application of MUM-T was implemented with AH-64D APACHE attack helicopters, converted to operate an MQ-1C GRAY EAGLE drone via a rotor-hub-mounted common datalink. This application supported Level-2 MUM-T, where the pilot or co-pilot in the modified APACHE can receive streaming sensor video from the drone. The datalink can also retransmit full-motion video received from the drone, or from the APACHE’s own TADS/PNVS sight with relevant metadata, to another MUM-T-equipped APACHE or to a ground element equipped with the One System Remote Video Terminal (OSRVT).
A more advanced configuration currently in development is L3’s appliqué for the new and remanufactured AH-64E (ECHO). It provides APACHE aircrews with increased situational awareness and network-centric interoperability while reducing sensor-to-shooter timelines and increasing survivability for the gunships.
This new gear expands the ECHO to support MUM-T Levels 3 and 4, enabling the pilot to control the drone’s payloads and flight path. The interface is not limited to the GRAY EAGLE: both the RQ-7 SHADOW and RQ-11 RAVEN mini-UAS are supported. In fact, the gunship-drone team now assumes the armed-scout role in US Army aviation brigades previously performed by the OH-58D KIOWA WARRIOR. The MUM-T kit added to the AH-64E includes an upper receiver (UR) mounted on top of the rotor, providing the APACHE-GRAY EAGLE (air-air) datalink, and a lower transceiver unit providing air-to-air-to-ground (AAG) line-of-sight communications that links with low-flying SHADOWs and RAVENs and relays video to ground elements. The system employs different links in the Ku, C, L and S bands covering both applications. This new capability is slated to become operational next year (2020).
In total, the Army plans to acquire 690 AH-64Es by 2025. The AH-64E is highly mobile and lethal, and can destroy armour, personnel and materiel targets in obscured battlefield conditions at ranges out to 8 kilometres, an Army statement said.
To further expand capabilities, the US Army plans to test the Rafael SPIKE NLOS guided missile on the AH-64E. The missile can be fired from ranges of up to 30 kilometres, and attack helicopters carrying such weapons would greatly benefit from UAS interoperability.
European MUM-T Initiatives
MUM-T capability is not restricted to the US military. Many other air forces in Europe, the Middle East and Australia are actively engaged in research and experimentation and have formulated acquisition plans for such capabilities. General Atomics Aeronautical Systems (GA-ASI), manufacturer of the MQ-9 REAPER and PREDATOR XP, has pitched this capability to the international market since 2017, introducing MUM-T with its export-oriented PREDATOR XP drone. In its marketing, GA-ASI highlighted increased survivability for the manned platforms, faster and longer-range target engagement, and more efficient battle command based on fresh information obtained via the drones.
L3 has also emphasised its MUM-T capabilities, as featured on the AH-64 manned helicopter and the MQ-1C GRAY EAGLE, SHADOW and RAVEN UAS. According to the company, the technology can be optimised for other airborne assets used by NATO and international partners.
In 2018, Airbus became the first European helicopter manufacturer to demonstrate this technology at the highest level of interoperability. The tests, held in Austria, employed an Airbus H145 specially equipped and integrated with a datalink and UAS control application, and a standard Schiebel S-100 CAMCOPTER. In its announcement, Airbus made a distinctive argument: “By controlling drones from the air, military and parapublic crews can explore tough-to-access areas and significantly expand observational capacities.” According to Airbus, UAS can not only enlarge search areas but also access areas that a helicopter might find difficult to reach. They are able to explore unknown territory and deliver information to the helicopter crew, which can then step in with the helicopter’s superior effects. Testing and certification are currently focused on military uses, but MUM-T has the potential to benefit a wide range of sectors and enable faster, more cost-effective mission completion.
In this test the drone was controlled and piloted by a third operator in the helicopter. The trials carried out by Airbus Helicopters and Schiebel went up to MUM-T LOI 5, which allows the manned platform to exercise full control of the UAS, including its take-off and landing. LOI 1, the lowest level, is the indirect receipt and/or transmission of sensor data from the UAS to the manned aircraft. Airbus envisages operation by a third crewmember for parapublic missions (law enforcement and search and rescue), while military missions would reduce the crew to two.
The S-100 mission planning and control system was provided by Schiebel. The next step will be to optimise the human-machine interface based on a thorough analysis of crew workload, using the results of the flight tests.
In 2018, the German Army also experimented with MUM-T technology. During a flight demonstration, a pilot controlled a UMS SKELDAR helicopter drone while flying a helicopter specially equipped with MUM-T mission equipment provided by ESG. The test flights verified the capability of a UAV teaming with the manned helicopter, controlled as an unmanned wingman on a command basis. An important part of the test involved automated mission planning. Missions included the provision of reconnaissance data from a location, exploration of forest perimeters between positions, and the clarification of possible landing zones for the helicopter. These highly complex tasks were only possible thanks to the high degree of automation in the unmanned UMAT platform, including route, sensor and data-link management. Route planning, for example, took account of threats, flight areas, sensor characteristics and data-link coverage. The demonstration flights were carried out under project MiDEA (mission monitoring by drones for reconnaissance), part of ESG’s MUM-T (Manned-Unmanned Teaming) programme and VTOL-UAS technology roadmap.
Another MUM-T demonstration was carried out in the UK by QinetiQ, featuring Search and Rescue (SAR) missions augmented by drones. The drone flew missions off the coast of Wales: while it was flown under the control of a forward ground control centre, its sensors were controlled by the Maritime and Coastguard Agency (MCA) control centre, located 200 miles from the site. The live situational-awareness feed, which included marked-up imagery, search status and reference points, was simultaneously distributed to multiple teams at the search site in Llanbedr and to remote sites in Fareham, London and Southampton.
Future MUM-T Initiatives in the US Army
The US Army Future Vertical Lift Cross-Functional Team (FVL CFT) is exploring various concepts of manned-unmanned teaming for a range of missions, and the helicopters currently being evaluated for future procurement are expected to be fielded with MUM-T capabilities.
US helicopter manufacturers exploring MUM-T capabilities with current and future platforms include Sikorsky, Boeing and Bell. Sikorsky explores various helicopter autonomy capabilities through the Aircrew Labor In-Cockpit Automation System (ALIAS) programme managed by DARPA. ALIAS provides autonomy functions that assist the crew by reducing workload, up to the level of removing the entire crew from the aircraft. The programme provides optionally piloted systems that would eventually enable the integration of manned-unmanned teams with existing UH-60 helicopters.
Bell, for example, has unveiled an unmanned derivative of its V-280 VALOR tiltrotor that could be operated as an unmanned wingman. The V-247 VIGILANT will be able to access areas too dangerous for manned platforms and perform as a gunship, suppressing and securing landing zones for manned platforms.
The LOI levels define the procedures remote operators can perform with an unmanned platform and the functions they are permitted to exercise. The drone’s ability to maintain contact with targets even in high-threat situations is critical for the warfighter, enabling the manned platform to stay beyond line of sight yet maintain continuous coverage of the target. Several platforms may be used to persistently cover an area of interest from different sides and distances, designate targets, and help direct guided weapons or indirect fire (mortar or artillery) to suppress the enemy and enable friendly operations in the contested area. Using tried-and-tested combined-arms methods enhanced with MUM-T capabilities, these activities can be performed today with small UAS, even without the larger drones’ armed-reconnaissance capability.
The Network Enabler
Nevertheless, as such operations rely on extensive use of datalinks, operating multiple drones in a small airspace could be challenging. While larger drones rely on datalinks with robust electronic counter-countermeasures and mission endurance that can outlast enemy jamming, small UAS often use lower levels of communications security and may be more vulnerable to electronic interference and attack. With robotics and unmanned systems becoming commonplace in the future battlespace, a dedicated network laying the infrastructure for their command, control and communications is mandatory. Such a network would likely be a mobile ad-hoc network (MANET), which expands and enhances its resilience and data-transfer capacity as more nodes join. Other networks may rely on low-earth-orbit satellite constellations, enabling manned and unmanned systems to use SATCOM datalinks that are more difficult to jam. Such networks will also prove more capable of supporting operations in urban, forested or RF-contested terrain.
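The MANET resilience argument can be illustrated with a toy sketch (hypothetical node names; no real waveform or routing protocol is modelled): traffic in a mesh survives the loss of a relay only if another multi-hop path exists, and each node that joins the network adds potential alternate paths:

```python
from collections import deque

def reachable(links: dict, src: str, dst: str,
              down: frozenset = frozenset()) -> bool:
    """Breadth-first search over an undirected link table,
    skipping nodes that have failed or been jammed."""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nbr in links.get(node, []):
            if nbr not in seen and nbr not in down:
                seen.add(nbr)
                queue.append(nbr)
    return False

# Hypothetical mesh: a controller, one relay drone, and a scout.
links = {
    "controller": ["relay1"],
    "relay1": ["controller", "scout"],
    "scout": ["relay1"],
}
# Losing the single relay severs the scout from the controller:
print(reachable(links, "controller", "scout", down=frozenset({"relay1"})))  # False

# A new node joining the MANET restores an alternate path:
links["relay2"] = ["controller", "scout"]
links["controller"].append("relay2")
links["scout"].append("relay2")
print(reachable(links, "controller", "scout", down=frozenset({"relay1"})))  # True
```

The sketch shows only the topological side of the claim; real MANETs add frequency agility, routing protocols and link-quality metrics on top of this basic redundancy.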
The US Air Force is exploring such a network with the Mobile Unmanned Air Vehicle Distributed Lethality Airborne Network (MUDLAN). Currently under Joint Capability Technology Demonstration (JCTD) status, MUDLAN is expected to transition to the Air Force and Combatant Commands by 2020. The network will support high-data-rate communications across multiple airborne and surface platforms and is designed to continue operating under electronic attack. With spectrum agility, MUDLAN can autonomously shift frequency bands to ensure continuous air, land and sea connectivity in contested electronic-warfare environments. Initial flight tests of the network nodes are scheduled for this year and will demonstrate over-the-horizon, distributed communications capabilities.
Unlike a remote controller back in the ground control station, dedicated to operating the unmanned system and its payloads, a combatant participating in MUM-T has many tasks, of which the unmanned system is only one. The integration of a MUM-T human-machine interface (HMI) is therefore far more challenging than that of the common GCS.
Part of the solution is the Army’s standard One System Remote Video Terminal (OSRVT), developed by Textron Systems as a tool used by ground operators to receive video from airborne assets. Today, it is integrated into helicopters, or other platforms, to deliver full-motion video to the crew.
Employed in MUM-T, OSRVT communicates through a ROVER 6 modem, multiband radio-frequency equipment, and a directional antenna capable of relaying multiple video streams back to the command centre.
Originally employed with ground elements, OSRVT also provides real-time information to ground forces aboard a UH-60 BLACKHAWK or CH-47 CHINOOK en route to an objective during an air-assault mission. Such terminals provide aircrews with real-time video of the landing/pick-up zone (LZ/PZ) long before arrival, and the troops disembarking from the helicopter can use the same terminal to obtain live motion video from UAS. With the current software version, the terminal gives the user full control of the remote EO payload.
The Army plans to extend MUM-T by enabling an aircrew to simultaneously tap several video and sensor feeds from different platforms in real time. This capability is supported by a new system called ‘Supervisory Controller for Optimal Role Allocation for Cueing of Human Operators’ (SCORCH). The programme, ongoing since 2013, leverages AI and system autonomy to simultaneously control ‘intelligent systems’ on up to three drones. The interface was optimised for multiple-UAS control, and features a glass cockpit with touchscreen interaction, a movable game-type hand controller with its own touchscreen display, an aided target-recognition system, and other advanced features.
SNC subsidiary Kutta Technologies took this approach further by enhancing the OSRVT into a bi-directional remote video terminal. The company also developed a toolkit enabling the remote crew to take ‘supervised usage’ (LOI-4) control of multiple UAS. The system can be installed and operated from the standard cockpit displays or from a compact terminal configured as a wearable kneepad, enabling any crewmember on any equipped platform to operate MUM-T functions. It offers advanced functions for pre-planning multi-UAS missions and delegating mission phases to different vehicles. Users can define ‘start at’ or ‘start from’ commands at certain phases, rather than steering the drone or sensor to reach a certain line of sight. Remote operators can assign targets, order a vehicle to engage or disengage auto-tracking, assign a payload to ‘follow the dot’ and thus keep watch over a moving vehicle, or look ahead along a flight path to secure a route or support troops. These functions help the crew manage activities and tasks while maintaining cognitive awareness and a manageable workload in flight and in combat.
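The shift this describes, from flying each drone to queuing delegated tasks per vehicle, can be sketched in a few lines. This is a hypothetical schema for illustration only, not Kutta’s actual toolkit or API; all class, field and command names here are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One delegated mission task (hypothetical schema)."""
    vehicle: str
    action: str           # e.g. "scan_route", "follow_track", "hold_overwatch"
    trigger: str = "now"  # mission phase or waypoint that starts the task

@dataclass
class SupervisedMission:
    """LOI-4-style supervision: the operator queues tasks per vehicle
    instead of flying each drone or steering each sensor directly."""
    tasks: list = field(default_factory=list)

    def delegate(self, vehicle: str, action: str, trigger: str = "now"):
        self.tasks.append(Task(vehicle, action, trigger))

    def tasks_for(self, vehicle: str) -> list:
        return [t for t in self.tasks if t.vehicle == vehicle]

mission = SupervisedMission()
mission.delegate("uas1", "scan_route", trigger="phase-2")  # 'start at' a phase
mission.delegate("uas2", "follow_track")                   # start immediately
print([t.action for t in mission.tasks_for("uas1")])  # ['scan_route']
```

The design point is that the operator’s interaction is reduced to defining and supervising tasks, which is what lets one crewmember manage multiple UAS alongside their other duties.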
With growing autonomy, smarter unmanned systems empowered by AI and mission automation are better able to support higher levels of human-machine teaming, taking future MUM-T operators to new heights. Such capabilities will become critical in employing large groups of UAS, in swarms or flocks, where drones will receive, collaborate on, and fulfil mission commands under human supervision without the need for specific instructions. This approach would enable future military forces to reduce manpower while increasing combat capabilities, keep the human warfighter out of immediate danger, and free the human operator’s bandwidth and attention to manage the overall mission.
Tamir Eshel is a security and defence commentator based in Israel.