Stjepan Bogdan, University of Zagreb; Mirko Kovac, Imperial College London; and Jose Ramiro Martinez de Dios, University of Seville
Robots’ manipulation and cognitive abilities have risen to a new level over the past two decades, enabled by the emergence of new materials, advances in computer science, and the rapid development and miniaturisation of electronic components. These advances have made it possible to implement novel structures and complex algorithms, introducing robots into new fields such as aerial robotics.
Unmanned Aerial Systems (UASs) are likely to become one of the most influential technologies of future decades. Equipped with various sensors and actuators, UASs can provide a wide range of services (e.g. observation, inspection, mapping, search and rescue, and maintenance). While some of these services are already offered commercially (e.g. inspection of industrial facilities by manually controlled UASs), others, such as maintenance, require an aerial robotic system to interact physically with objects and are still at the level of laboratory experimentation. Delivering functionalities such as aerial tactile inspection or aerial repair, construction and assembly requires further development of modelling and control techniques for aerial robotic systems, as well as adjustments to legislation.
Within this framework, the AeRoTwin project strives to enhance the R&D capacity of the Laboratory for Robotics and Intelligent Control Systems (LARICS) at the University of Zagreb Faculty of Electrical Engineering and Computing (UNIZG-FER) through close cooperation with leading EU research institutions: the Robotics, Vision and Control Laboratory (GRVC) at the Universidad de Sevilla, Seville, Spain; the Aerial Robotics Lab at Imperial College of Science, Technology and Medicine (ICL ARL), London, United Kingdom; and Corporación Tecnológica de Andalucía (CTA), Seville, Spain.
The objective of increasing UNIZG-FER’s research excellence and innovation capacity in aerial robotics is pursued in three strategic research domains: cooperative aerial robotic missions, aerial robot navigation, and aerial robot reconfigurability.
Cooperative aerial robotic missions
No single robot is a Swiss Army knife, i.e. a multi-purpose machine. But when robots are put to work together, they can accomplish the various tasks set before them (Figure 1). To achieve the overall goal of decentralised decision making in complex mission scenarios with heterogeneous robotic systems, one has to tackle the problems of task planning (task decomposition and allocation) and motion planning.
One way to tackle complex high-level task planning is to divide it into two stages. The first stage decomposes the problem into smaller, pre-programmed tasks. In the second stage, a global planning and coordination framework uses these tasks as building blocks for candidate solutions, quantifies them, and finally chooses the set of tasks and robots that offers the best solution for the mission under user-defined constraints.
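As a rough illustration of the second stage, the sketch below allocates pre-programmed tasks to robots by minimising a user-defined cost with the Hungarian algorithm. The robot names, task names and cost values are illustrative assumptions, not the AeRoTwin planner itself.

```python
# Minimal sketch (not the project planner): assign tasks to robots by
# minimising a user-defined cost; large penalties encode forbidden pairings.
import numpy as np
from scipy.optimize import linear_sum_assignment

robots = ["uav_1", "uav_2", "ugv_1"]
tasks = ["map_area", "inspect_tower", "carry_sample"]

# cost[i, j]: estimated cost (e.g. energy or time) of robot i doing task j.
cost = np.array([
    [5.0, 2.0, 1e6],   # uav_1 cannot carry the sample (large penalty)
    [6.0, 3.0, 1e6],
    [1e6, 1e6, 4.0],   # ugv_1 can only carry the sample
])

rows, cols = linear_sum_assignment(cost)   # optimal one-to-one assignment
for r, c in zip(rows, cols):
    print(f"{robots[r]} -> {tasks[c]} (cost {cost[r, c]:.1f})")
```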
The main challenge in devising decentralised coordination in a heterogeneous robotic system is the definition of a multi-layered control structure that combines high-level planning and coordination with low-level reactive execution and supervision (Figure 2). At the high-level planning layer, one challenge is resource conflict, a situation in which multiple requests targeting the same resource arrive simultaneously. Another type of conflict is the space-sharing problem, related to collisions, congestion and deadlocks.
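One simple way to resolve a resource conflict is to serialise access through an arbiter that grants a shared resource to one requester at a time. The sketch below is a minimal illustration of that idea under an assumed priority rule; the robot names are hypothetical and this is not the project’s coordination framework.

```python
# Minimal sketch: a priority-based arbiter for one shared resource
# (e.g. a charging pad or a workspace region).
import heapq

class ResourceArbiter:
    """Grants a shared resource to one requester at a time, by priority."""
    def __init__(self):
        self.holder = None
        self.queue = []          # (priority, robot_id) min-heap

    def request(self, robot_id, priority):
        if self.holder is None:
            self.holder = robot_id
            return True          # granted immediately
        heapq.heappush(self.queue, (priority, robot_id))
        return False             # queued; simultaneous access is avoided

    def release(self, robot_id):
        assert self.holder == robot_id
        self.holder = heapq.heappop(self.queue)[1] if self.queue else None
        return self.holder       # next holder of the resource, if any

arbiter = ResourceArbiter()
print(arbiter.request("uav_1", priority=2))  # True: resource was free
print(arbiter.request("uav_2", priority=1))  # False: queued behind uav_1
print(arbiter.release("uav_1"))              # "uav_2" now holds the resource
```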
AeRoTwin partners proposed a framework in which aerial robotic teams are viewed as multi-agent systems interacting over communication graph topologies. Tools from directed graph theory capture asymmetric interactions among networked agents with complex, uncertain, nonlinear dynamics. In decentralised sensing, each agent interacts only with its immediate neighbours, yet equilibrium of the whole system is sought. This differs from standard centralised sensing of multi-agent systems, where each agent has a designated ground station and/or an operator.
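The local-interaction idea can be illustrated with a basic discrete-time consensus rule on a directed graph: each agent updates its state using only information received from its in-neighbours. The graph, step size and initial states below are illustrative assumptions, not project data.

```python
# Minimal sketch of decentralised consensus on a directed communication graph.
import numpy as np

# A[i, j] = 1 if agent i receives information from agent j (an in-neighbour).
A = np.array([
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [1, 0, 0, 0],
], dtype=float)

x = np.array([1.0, 4.0, -2.0, 7.0])   # initial agent states (e.g. altitude setpoints)
eps = 0.3                             # step size; keep eps * max in-degree < 1

for _ in range(200):
    # Each agent moves toward its in-neighbours using only local information.
    x = x + eps * (A @ x - A.sum(axis=1) * x)

print(x)   # states approach a common value if the graph is strongly connected
```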
AeRoTwin partners demonstrated another example of a cooperative robotic mission: sampling water in lake and marine environments. Researchers at the ICL ARL have developed a novel robotic water-sampling tool called the Multi-Environment Dual-robot for Underwater Sample Acquisition (MEDUSA) (Figure 3). The platform combines autonomous control, artificial intelligence and underwater soft micro-vehicle technologies. The multicopter aerial platform lands on the water and releases an underwater robot tethered to it. The robot can dive to a depth of up to 10 metres below the water surface to collect water samples with an on-board filtering system.
Aerial robot navigation
In general, commercial unmanned aerial vehicles (UAVs) are semi-autonomous, with a limited degree of autonomy. Their autopilots can stabilise the vehicle, perform waypoint navigation using a global navigation satellite system (GNSS), do simple geofencing and, in some cases, avoid obstacles. GNSS signals can be blocked in some outdoor areas (tunnels, woods, etc.) or significantly degraded due to environmental conditions. GNSS receivers can also be subject to variable measurement delays. Even when a clear GNSS signal is available, the localisation accuracy of GNSS alone can be insufficient for demanding tasks (for example, autonomous inspection and maintenance of industrial facilities).
Simultaneous localisation and mapping (SLAM) is a fundamental research problem in robotics that has been studied intensively for the past two decades. The task of a SLAM implementation is to create a consistent world map from sensor observations, providing a basis for the autonomous behaviour of the system in its environment (Figure 4). A recent step forward demonstrated real-time 6DoF SLAM using UAV on-board sensors, a camera or LIDAR, and an inertial measurement unit (IMU), operating in outdoor environments to enhance GNSS-based localisation, or working indoors as a stand-alone system. SLAM estimates a discrete UAV trajectory with associated local map samples, which are conditionally independent. Local maps are registered into a global map upon request, e.g. in the vicinity of objects where local obstacle avoidance is required.
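A minimal sketch of that last step is given below: assuming the SLAM front end has already estimated a pose for each local map sample, the local point clouds are transformed into the global frame and merged. The poses and points are synthetic placeholders, not output of the project’s SLAM module.

```python
# Minimal sketch: register local map samples into a global map, given the
# estimated UAV pose (rotation + translation) attached to each local map.
import numpy as np

def pose_matrix(yaw, t):
    """4x4 homogeneous transform from a yaw angle and a 3D translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = t
    return T

# Each local map: (estimated pose in the global frame, points in the local frame).
local_maps = [
    (pose_matrix(0.0,       [0.0, 0.0, 2.0]), np.random.rand(100, 3)),
    (pose_matrix(np.pi / 2, [5.0, 0.0, 2.0]), np.random.rand(100, 3)),
]

global_points = []
for T, pts in local_maps:
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coordinates
    global_points.append((T @ pts_h.T).T[:, :3])       # transform into the global frame

global_map = np.vstack(global_points)   # merged map, built on request (e.g. near obstacles)
print(global_map.shape)
```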
AeRoTwin partners demonstrated the functionality of a SLAM software module that is part of the high-level UAV control. It uses LIDAR, GNSS and IMU data to create an environment map and localise the UAV, i.e. determine the position and orientation of the UAV in the inertial coordinate system (Figure 5). Another module, related to UAV localisation in inspection scenarios, has the main goal of localising the known 3D CAD model of the wind turbine in a global coordinate frame (Figure 6). At the beginning of the inspection process, the operator positions the UAV in front of the wind turbine tower at an approximate distance of 20–25 metres. Since neither the precise distance between the tower and the UAV nor the orientation of the vehicle is known, the 3D model has to be matched to the real wind turbine so that the pre-planned inspection trajectory corresponds to the actual pose of the wind turbine.
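Such model-to-scan matching is commonly done with iterative closest point (ICP) registration. The sketch below is a basic point-to-point ICP loop on synthetic data; it illustrates the general technique only and is not the project’s model-matching module.

```python
# Minimal sketch: align a known 3D model to a LIDAR scan with point-to-point ICP.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(model, scan, iters=50):
    """Align model points to scan points; returns the transformed model."""
    aligned = model + (scan.mean(axis=0) - model.mean(axis=0))  # coarse init: match centroids
    tree = cKDTree(scan)
    for _ in range(iters):
        _, idx = tree.query(aligned)                  # nearest scan point per model point
        R, t = best_rigid_transform(aligned, scan[idx])
        aligned = aligned @ R.T + t
    return aligned

# Synthetic example: the "scan" is the model rotated by a small unknown yaw
# and shifted roughly 20 metres away, mimicking the initial pose uncertainty.
rng = np.random.default_rng(0)
model = rng.random((500, 3)) * [2.0, 2.0, 20.0]       # rough tower-like extent (metres)
yaw = 0.1
Rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
               [np.sin(yaw),  np.cos(yaw), 0.0],
               [0.0,          0.0,         1.0]])
scan = model @ Rz.T + np.array([22.0, 3.0, 0.0])
print(np.abs(icp(model, scan) - scan).mean())         # residual should shrink toward zero
```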
Aerial robot reconfigurability
Small-scale robots that operate in different fluids face particular limitations. The main challenge is to design a vehicle that meets the design requirements of both aerial and aquatic operation: a balance is needed between minimised weight for take-off, adequate waterproofing and bulky buoyancy elements.
SailMAV, a sailing micro aerial vehicle, is a novel design for a hybrid robot capable of both aerial and water-surface locomotion (Figure 7). The robot has a three-part folding wing design, making it capable of both flying and sailing. To make the system more efficient, the robot employs the same control surfaces for flying and sailing, and shares the same structural elements and control systems for both modes of mobility.
Project Summary
AeRoTwin is a twinning coordination action for spreading excellence in aerial robotics. The project’s overall goal is to reduce the networking gaps and deficiencies between UNIZG-FER and internationally leading counterparts in the EU by significantly enhancing the S&T capacity of the Laboratory for Robotics and Intelligent Control Systems (LARICS) at UNIZG-FER. The strategic research domains are cooperative aerial robotic missions, aerial robot navigation and aerial robot reconfigurability.
Project Partners
University of Seville (USE), Spain
Imperial College London, UK
Technological Corporation of Andalusia, Spain
Project Lead Profile
Stjepan Bogdan, PhD, is a full professor at the Laboratory for Robotics and Intelligent Control Systems (LARICS) at the University of Zagreb, where he teaches several courses in robotics and automation. His research interests include autonomous systems, aerial robotics, multi-agent systems, intelligent control systems, bio-inspired systems and discrete event systems. He spent one year as a Fulbright researcher at the Automation and Robotics Research Institute, Arlington, USA, in Prof. Frank Lewis’ lab.
Bogdan has published more than 180 conference and journal papers, is the co-author of four books and has been a PI and researcher on 24 national and international scientific projects.
Project Contacts
Stjepan Bogdan
Faculty of Electrical Engineering and Computing
Unska 3, HR-10000 Zagreb, Croatia.
+385 1 6129 795
Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 810321.
Figure 1: An artistic view of a heterogeneous robotic system (a UAV, a ground robot, and a compliant multi-degree-of-freedom manipulator) working together on indoor organic agriculture tasks (project Specularia, Croatian Science Foundation).
Figure 2: A point cloud of a residential area obtained by a group of UAVs with on-board LIDAR sensors during a joint mission (project ENCORE, H2020 programme).
Figure 3: A Multi-Environment Dual-robot for Underwater Sample Acquisition (project MEDUSA).
Figure 4: LIDAR-based map of an industrial setting created autonomously by the DARIUS aerial robot (project AEROARMS, H2020 programme).
Figure 5: Autonomous aerial robot performing contact bridge inspection (project RESIST, H2020 programme).
Figure 6: The process of matching the wind turbine’s 3D model during an autonomous inspection (project ESMERA, H2020 programme).
Figure 7: Robot designed to execute a repeated sail-fly-sail cycle and reconfigure from sailing morphology to flying morphology (project SailMAV).