MULTIDRONE: MULTIple DRONE platform for media production

Funded under Grant Agreement number: 731667 – MULTIDRONE – H2020-ICT-2016-1

MULTIDRONE aims to develop an innovative, intelligent multi-drone platform for media production covering outdoor events, which are typically held over wide areas (at stadium/city level). The 4-10 drone team, managed by the production director and crew, will have: a) increased decisional autonomy, minimizing production crew load and interventions, and b) improved robustness, security and safety mechanisms (e.g., embedded flight regulation compliance, enhanced crowd avoidance, autonomous emergency landing, communications security), enabling it to carry out its mission even under adverse conditions or crew inaction and to handle emergencies. Such robustness is particularly important, as the drone team will operate close to crowds and/or may face environmental hazards (e.g., wind). Therefore, it must be contextually aware and adaptive, with increased perception of crowds and individual humans.

Furthermore, as this multi-actor system will be heterogeneous, consisting of a) drones and b) the production director/crew, critical human-in-the-loop issues will be addressed to avoid decision errors or operator overload, maximizing shooting creativity and productivity while minimizing production costs.

The overall multiple-drone system will be built to serve identified production (end-user, i.e., broadcaster) needs. Namely, its innovative, safe and fast multiple-drone audiovisual (AV) shooting will provide novel media production functionalities (e.g., production creativity towards rich media output, global event coverage, adaptation to event dynamics, high reaction speed to unexpected events). Both live (real-time) AV shooting and off-line productions will be considered.

MULTIDRONE involves 9 partners, including 3 leading broadcasters: DEUTSCHE WELLE (DE), RAI (IT) and BBC (UK), as well as several research and commercial partners: THALES (FR), AUTH (GR), UNIVERSITY OF SEVILLE (ES), ALERION (FR) and IST-ID (PT).


ARCOW: Aerial Robot Co-Worker in Plant Servicing

Funded under Grant Agreement number: 608849 – EUROC: European Robotics Challenge – FP7-ICT

ARCOW is the framework project for the participation of the GRVC group in EUROC, a cascade-funding project that aims at boosting robotics development for European industries. EUROC is divided into three challenges: Challenge 1, Reconfigurable Interactive Manufacturing Cell; Challenge 2, Shop Floor Logistics and Manipulation; and Challenge 3, Plant Servicing and Inspection. GRVC participates in EUROC Challenge 3 with ARCOW.

The objective of ARCOW is to introduce aerial robots that collaborate with humans in manufacturing processes, in order to reduce costs and make these processes more efficient. The project falls within EUROC Challenge 3 and is coordinated by GRVC-USE.

Two main objectives are pursued. First, functionalities will be developed for aerial robots delivering light goods (small tools, bags of rivets, seals, etc.) to the different working stations while navigating within an aircraft manufacturing plant, a GPS-denied environment. The second objective is the development of a low-cost localization system for costly tools and portable machinery within the plant, in order to build an improved system for the identification and monitoring of goods that can cause Foreign Object Damage.

New functionalities for the navigation of aerial robots in complex indoor environments will be validated, including new techniques for the mapping and localization of aerial robots, global/local planning schemes for safe navigation in the presence of humans, and obstacle detection and avoidance, among others.
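As an illustration of the kind of global planning scheme mentioned above, the minimal sketch below (not project code; the grid representation, unit step costs and heuristic are assumptions) runs A* search over a 2D occupancy grid, a textbook approach for global path planning in a known indoor map:

```python
import heapq

def astar(grid, start, goal):
    """A* over a 2D occupancy grid (0 = free, 1 = obstacle).

    Returns the shortest 4-connected path as a list of (row, col)
    cells from start to goal, or None if no path exists."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start, None)]  # (f, g, cell, parent)
    came_from = {}
    best_g = {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:          # already expanded via a better route
            continue
        came_from[cell] = parent
        if cell == goal:               # reconstruct path by walking parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                if g + 1 < best_g.get(nxt, float("inf")):
                    best_g[nxt] = g + 1
                    heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cell))
    return None
```

A real system would plan over a 3D map built by the SLAM back-end and inflate obstacles to keep a safety margin around humans; the grid here stands in for that map.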

ARCOW involves 3 partners: GRVC-USE (ES), FADA-CATEC (ES) and AIRBUS D&S (ES).


AEROARMS: AErial RObotic system integrating multiple ARMS and advanced manipulation capabilities for inspection and maintenance

Funded under H2020-ICT-2014 call 1, Topic ICT-23-2014 Robotics

AEROARMS proposes the development of the first aerial robotic system with multiple arms and advanced manipulation capabilities to be applied in industrial inspection and maintenance (I&M). The objectives are:

  • 1. R&D on aerial manipulation to perform I&M. This includes:
    • 1.1 Building on previous partner results, the development of systems able to grab and dock with one or more arms and perform dexterous, accurate manipulation with another arm. It also includes helicopter-based aerial manipulators with greater payload and flight endurance, equipped with a dexterous arm that provides advanced manipulation capabilities through force interactions and hand-eye coordination, using a movable camera on another light arm.
    • 1.2 New methods and technologies for platforms that can fly and manipulate through the coordinated motion of their arms, addressing constrained scenarios in which it is dangerous to use a helicopter and where it is not possible to grab a structure to perform the I&M operation.
  • 2. Validation of 1.1 in two applications:
    • 1) Installation and maintenance of permanent NDT sensors on remote components;
    • 2) Deployment and maintenance of a mobile robotic system permanently installed on a remote structure.

To achieve the above objectives, AEROARMS will develop the first aerial telemanipulation system with advanced haptic capabilities, able to exert significant forces with an industrial robotic arm, as well as autonomous control, perception and planning capabilities. Special attention will be paid to design and system development with a view to future certification, taking into account ATEX and RPAS regulations. AEROARMS is strongly related to ICT 23-2014: Robotics, enabling the emergence of aerial robots with manipulation capabilities to operate in industrial I&M, which will be validated in oil and gas plants to reach TRL 5.

The consortium combines excellent capabilities in aerial robotics with leadership in aerial manipulation and key partners for the successful application of I&M.


AEROBI: AErial RObotic System for In-Depth Bridge Inspection by Contact

Funded under: H2020-EU.2.1.1; Call for proposal: H2020-ICT-2015; Topic(s): ICT-24-2015 – Robotics 

The latest developments in low-flying unmanned robots with arms, and in the associated fields of intelligent control, computer vision and sensors, open the floor for robotic solutions, exploitable in the near term, for the inspection of difficult-to-access areas of civil infrastructure in general and bridges in particular. This infrastructure is ageing, requiring inspection and assessment. Presently, bridge inspection is primarily done through visual observations by inspectors. It relies upon the inspector having access to bridge components via access equipment (ladders, rigging and scaffolds) and vehicular lifts (manlifts, bucket trucks and under-bridge inspection vehicles). This is uncomfortable and potentially dangerous for the inspectors, while it interferes with traffic, contributing to bottlenecks and congestion. The results of the inspection are used to structurally assess the bridge in a subsequent step.

[Figure: Bridge Elements and Terminology]

AEROBI, driven by the bridge inspection industry, adapts and integrates recent research results in low-flying unmanned robots with arms, intelligent control in robotics, computer vision and sensing, into an innovative, integrated, low-flying robotic system with a specialised multi-joint arm that will scan concrete beams and piers of a bridge for potential surface cracks, concrete swelling or spalling. If the width of these cracks exceeds given limits, it will measure the distance between parallel cracks, and it will contact the bridge to non-destructively measure crack depth and deformation. In the case of concrete swelling or spalling, it will also contact the bridge to non-destructively measure delamination and the diameter of the reinforcing steel bars. These measurements will provide input for a structural bridge assessment that will be performed automatically by the proposed robotic system. The system, which is expected to be exploitable in the short term, will be field-evaluated and demonstrated on two actual bridges.
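As a hedged illustration of how image-based crack measurements relate to camera geometry (this is the generic pinhole-camera ground sampling distance formula, not AEROBI's actual measurement pipeline), the sketch below converts a crack width measured in pixels into millimetres; all parameter values are assumptions:

```python
def crack_width_mm(pixels, distance_m, focal_length_mm, pixel_pitch_um):
    """Physical width of a feature spanning `pixels` image pixels.

    Uses the ground sampling distance (GSD) of an ideal pinhole camera
    viewing the surface frontally: GSD = pixel pitch * distance / focal length."""
    gsd_mm_per_px = (pixel_pitch_um * 1e-3) * (distance_m * 1e3) / focal_length_mm
    return pixels * gsd_mm_per_px
```

For example, with an assumed 3.45 µm pixel pitch, an 8 mm lens and the robot hovering 2 m from the surface, each pixel covers about 0.86 mm, so a crack spanning 4 pixels is about 3.45 mm wide. Sub-millimetre crack widths therefore require either a longer lens or a closer, contact-based measurement, which motivates the contact sensing described above.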


ARCAS: AErial RObotics Cooperative Assembly System

Large Scale Integrating Project funded under FP7-ICT-2011 call 7, Topic ICT-2011.2.1 Cognitive Systems and Robotics

ARCAS (Aerial Robotics Cooperative Assembly System) is an R&D project funded under the FP7-ICT-2011 call. The project proposes the development and experimental validation of the first cooperative free-flying robot system for assembly and structure construction. The detailed scientific and technological objectives are:

  1. New methods for the motion control of a free-flying robot with a mounted manipulator in contact with a grasped object, as well as for the coordinated control of multiple cooperating flying robots with manipulators in contact with the same object (e.g. for precise placement or joint manipulation);
  2. New flying robot perception methods to model, identify and recognize the scenario and to be used for guidance in the assembly operation, including fast generation of 3D models, aerial 3D SLAM, 3D tracking and cooperative perception;
  3. New methods for cooperative assembly planning and structure construction by means of multiple flying robots, with application to inspection and maintenance activities;
  4. Strategies for operator assistance, including visual and force feedback, in manipulation tasks involving multiple cooperating flying robots.

ARCAS will pave the way for a large number of applications, including the building of platforms for the evacuation of people or the landing of aircraft, the inspection and maintenance of facilities, the construction of structures in inaccessible sites, and space applications.

The project will be implemented by a high-quality consortium whose partners have already demonstrated cooperative transportation by aerial robots as well as high-performance cooperative ground manipulation. The team has the ability to produce, for the first time, challenging technological demonstrations with a high potential for the generation of industrial products upon project completion.


EC-SAFEMOBIL: Estimation and Control for SAFE wireless high MOBILity cooperative industrial systems

FP7-ICT-288082, IP, Call FP7-ICT-2011-7, Challenge: 3, ICT 2011.3.3, New paradigms for embedded systems, monitoring and control towards complex systems engineering

Autonomous systems and unmanned aerial vehicles (UAVs) can play an important role in many applications, including disaster management and the monitoring and measurement of events such as the volcanic ash cloud of April 2010. Currently, many missions cannot be accomplished, or involve a high level of risk for the people involved (pilots and drivers), as unmanned vehicles are either not available or not permitted. This also applies to search and rescue missions, particularly in stormy conditions, where pilots need to risk their lives. These missions could be performed or facilitated by autonomous helicopters with accurate positioning and the ability to land on mobile platforms such as ship decks. Such applications strongly depend on the reliability of the UAV to react in a predictable and controllable manner in spite of perturbations such as wind gusts. On the other hand, the cooperation, coordination and traffic control of many mobile entities are relevant issues for applications such as the automation of industrial warehousing, surveillance using aerial and ground vehicles, and transportation systems.

EC-SAFEMOBIL is devoted to the development of sufficiently accurate common motion estimation and control methods and technologies to reach the levels of reliability and safety that facilitate unmanned vehicle deployment in a broad range of applications. It also includes the development of a secure architecture and the middleware to support the implementation. Two different kinds of applications are included in the project:

  1. Very accurate coupled motion control of two mobile entities. The technologies will be demonstrated in two challenging applications dealing with the landing on mobile platforms and launching of unmanned aerial vehicles from a manned vehicle.
  2. Distributed safe reliable cooperation and coordination of many high-mobility entities. The aim is to precisely control hundreds of entities efficiently and reliably, and to certify the developed techniques to support the exploitation of unmanned platforms in non-restricted areas. This development will be validated in two scenarios: industrial warehousing involving a large number of autonomous vehicles, and surveillance, also involving many mobile entities.
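Landing on a moving deck requires predicting the platform's motion from noisy measurements. As a minimal sketch of this idea (an alpha-beta filter, a fixed-gain simplification of the Kalman filter; not the project's actual estimator, and the gains are assumptions), the code below tracks the position and velocity of a heaving deck from noisy position readings:

```python
def alpha_beta_track(measurements, dt, alpha=0.5, beta=0.1):
    """Alpha-beta filter: a fixed-gain simplification of the Kalman filter.

    Tracks position and velocity of a moving platform (e.g. a ship deck)
    from a sequence of noisy position measurements taken every `dt` seconds.
    Returns a list of (position, velocity) estimates, one per update."""
    x, v = measurements[0], 0.0      # initialise state at the first measurement
    estimates = []
    for z in measurements[1:]:
        x_pred = x + v * dt          # constant-velocity prediction
        r = z - x_pred               # innovation (measurement residual)
        x = x_pred + alpha * r       # correct position estimate
        v = v + (beta / dt) * r      # correct velocity estimate
        estimates.append((x, v))
    return estimates
```

A full system would use a proper Kalman filter with a model of the deck's oscillatory motion and fuse several sensors; this sketch only shows the predict-correct structure shared by that family of estimators.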

MONIF: Forest fire monitoring and measurement employing helicopters

Funded by FAASA Aviación and Corporación Tecnológica de Andalucía.

This project deals with the development of new techniques and methods for the efficient extinguishing of forest fires using aerial means. In particular, the final objective of the project is to support forest fire fighting with new and precise information about fires, obtained by means of aerial sensing and measurement. Thus, aeronautic technologies are integrated with information gathering and processing in the project.

MONIF will provide real-time information about fire fronts and contours; these data will significantly contribute to improving the estimation of the technical and human resources required for efficient forest fire fighting. This new method for quantifying and evaluating forest fire threats is a clear step forward for environmental protection and for the safety of the personnel involved in this work.

The project started in the last trimester of 2009. During these three months, work focused on designing the software architecture and the hardware of the system on board the helicopter.

ROBMAN: Intelligent robotic system for handling large aircraft structures

FEDER-INNTERCONECTA programme from Spanish CDTI, funded by MC2. PI-0985/2012.

The ROBMAN project aims to design and develop an automated lifting system suspended from an overhead crane, which will allow the handling and transport of various large aircraft components.

AICIA's participation in the project focuses on the design and development of the control algorithms, the static and dynamic study of all the existing parameters, and the modeling and simulation of a system for automating the manipulation and transportation of aircraft structure parts. This year, AICIA developed the simulation of the control system and of the manipulation of 3D parts.


ATICA: All-Terrain Intelligent Compact and Autonomous Vehicle

PI-0991/2012. Program FEDER-INNTERCONECTA from CDTI, funded by Grupo Iturri.

The main goal of this project is the design and development of an off-road autonomous vehicle for transport applications. In the last year, we designed the hardware and software. The sensor modules (laser, GPS, IMU) and several software modules have been implemented and tested in the laboratory and in outdoor experiments. The communication system has also been developed and validated, including the hardware as well as the software and a protocol based on the JAUS architecture with compliance level 1. Moreover, several modules, such as the error management module and the actuators interface module, have been developed. The actuators interface module sends references to, and receives status information from, the vehicle actuators through a CAN J1939 communication network.
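The CAN J1939 network mentioned above addresses messages with 29-bit extended CAN identifiers. As a rough illustration of that identifier layout (not the project's actual code; shown for broadcast PDU2 messages, where the PS byte is part of the 18-bit parameter group number), the sketch below packs and unpacks a J1939 identifier:

```python
def j1939_can_id(priority, pgn, source_addr):
    """Pack a 29-bit extended CAN identifier as used by SAE J1939.

    Valid for broadcast (PDU2) messages, where the PS byte belongs to
    the 18-bit PGN. Layout: priority (3 bits) | PGN (18 bits) | SA (8 bits)."""
    assert 0 <= priority <= 7 and 0 <= pgn <= 0x3FFFF and 0 <= source_addr <= 0xFF
    return (priority << 26) | (pgn << 8) | source_addr

def j1939_decode(can_id):
    """Inverse of j1939_can_id: returns (priority, pgn, source_addr)."""
    return (can_id >> 26) & 0x7, (can_id >> 8) & 0x3FFFF, can_id & 0xFF
```

For instance, the standard Cruise Control/Vehicle Speed broadcast (PGN 0xFEF1) sent at priority 6 from source address 0x00 packs to identifier 0x18FEF100.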


GRIFFIN: General compliant aerial Robotic manipulation system Integrating Fixed and Flapping wings to Increase range and safety

ERC Advanced Grant 2017

The goal of GRIFFIN is the derivation of a unified framework with methods, tools and technologies for the development of flying robots with dexterous manipulation capabilities. The robots will be able to fly while minimizing energy consumption, to perch on curved surfaces and to perform dexterous manipulation. Flight will be based on foldable wings with flapping capabilities. The robots will be able to operate safely in sites where rotorcraft cannot, and to physically interact with people. Dexterous manipulation will be performed while maintaining fixed contact with a surface, such as a pole or a pipe, by means of one or more limbs and manipulating with the others, overcoming the limitations that existing aerial manipulators face in free-flight dexterous manipulation. Compliance will play an important role in these robots and in their flight and manipulation control methods.

The control systems will be based on appropriate kinematic, dynamic and aerodynamic models. The GRIFFIN robots will have autonomous perception, reactivity and planning based on these models. They will also be able to associate with others to perform cooperative manipulation tasks. New software tools will be developed to facilitate the design and implementation of these complex robotic systems. Thus, configurations of different complexity can be derived depending on the requirements of flight endurance and of the manipulation tasks, from simple grasping to more complex dexterous manipulation. The implementation will be based on additive and shape deposition manufacturing to fabricate multi-material parts and parts with embedded electronics and sensors.

In GRIFFIN we will develop a small flapping-wing proof-of-concept prototype that will be able to land autonomously on a small surface using computer vision, a manipulation system with the body attached to a pole, and finally full-size prototypes that will demonstrate flying, landing and manipulation, including cooperative manipulation, while maintaining equilibrium.