Architecture for autonomous multi-drone cinematography planning
Capturing video footage autonomously with aerial vehicles is challenging. It poses a series of difficulties related to robot positioning and navigation, smooth camera control, collision avoidance, etc., especially when the application is outdoors.
The MultiDrone architecture has the following requirements:
➤The system should be able to autonomously reproduce typical shots from cinematography rules, filming both static and moving targets.
➤It should also ensure smooth transitions between shots while implementing collision avoidance and staying aware of no-fly zones, security constraints and emergency situations.
➤Finally, it should account for the drones' limited resources, such as battery life.
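As a concrete illustration of the last requirement, a planner could run a battery-feasibility check before assigning a shot to a drone. The function below is only a minimal sketch under simplified assumptions (remaining flight time as a single number, constant discharge); the names and the safety margin are hypothetical, not part of the MultiDrone system.

```python
def action_feasible(battery_s: float, action_duration_s: float,
                    return_time_s: float, margin_s: float = 60.0) -> bool:
    """Return True if the remaining flight time (in seconds) covers the shot,
    the trip back to a landing spot, and a safety margin."""
    return battery_s >= action_duration_s + return_time_s + margin_s


# A drone with 10 minutes of flight left can take a 5-minute shot
# that ends 2 minutes from home, but not with only ~6.5 minutes left.
print(action_feasible(600.0, 300.0, 120.0))  # True
print(action_feasible(400.0, 300.0, 120.0))  # False
```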
We propose an architecture (see image below) where planning takes place at different phases and in different modules. For instance, computing a safe path to a landing spot or to a specific shooting position is a planning problem, but so is distributing shooting tasks among the team members and coordinating them. Moreover, planning will also run online. Given the commands from the director, an initial plan can be computed. During execution, however, circumstances may change, rendering that initial plan no longer useful. Imagine, for instance, that some parts of the plan were not successfully accomplished (due to uncertainties in the drones' actions or the targets' movements), that the director issued new commands, or that unexpected emergency events occurred. Any of these situations would trigger a new planning phase at the higher level.
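The replanning triggers just described can be sketched as a small event loop: any event that invalidates the current plan (a failed action, a new director command, an emergency) causes the planner to compute a fresh plan. This is an illustrative toy, not the MultiDrone implementation; all class, event and field names are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Event(Enum):
    """Events that make the current plan stale (illustrative names)."""
    ACTION_FAILED = auto()          # a drone could not complete its task
    NEW_DIRECTOR_COMMAND = auto()   # the director changed the mission
    EMERGENCY = auto()              # e.g., intruder in a no-fly zone


@dataclass
class HighLevelPlanner:
    """Toy planner that re-plans whenever a triggering event arrives."""
    plan_version: int = 0
    log: list = field(default_factory=list)

    def compute_plan(self, reason: str) -> int:
        # Stand-in for the real (expensive) planning computation.
        self.plan_version += 1
        self.log.append((self.plan_version, reason))
        return self.plan_version

    def on_event(self, event: Event) -> int:
        # Any of these events invalidates the plan, so re-plan.
        return self.compute_plan(reason=event.name)


planner = HighLevelPlanner()
planner.compute_plan("initial director commands")  # pre-flight plan, version 1
planner.on_event(Event.ACTION_FAILED)              # online re-plan, version 2
```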

MultiDrone architecture for autonomous cinematography
The system will include a High-level Planner in charge of interpreting high-level commands from the cinematography director (specified through the Dashboard), translating them into different tasks (e.g., positions to visit and specific shots to be taken) and distributing them among the drones of the team. The whole set of high-level commands specified by the director will be denoted as the Shooting Mission, which will be split into sequential or parallel Shooting Actions to be performed by the different drones in the team. Each drone will thus incorporate the functionalities needed to perform its assigned Shooting Actions. The Scheduler will manage the assigned tasks and send low-level commands to follow planned trajectories or track targets, making use of additional path planners depending on each action. As stated above, the High-level Planner will be used for pre-planning before the flight, but also for re-planning after unexpected events or new director's commands.
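The decomposition of a Shooting Mission into Shooting Actions, and their distribution among the drones, could be represented as below. This is a hypothetical sketch: the field names and the naive round-robin allocator merely stand in for whatever task-allocation method the real High-level Planner uses.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class ShootingAction:
    """One shot to be performed by a single drone (hypothetical fields)."""
    shot_type: str      # e.g. "orbit", "chase", "static"
    target: str         # the target to keep framed
    start_time: float   # seconds from mission start


@dataclass
class ShootingMission:
    """The director's full set of high-level commands."""
    actions: List[ShootingAction]


def distribute(mission: ShootingMission,
               drones: List[str]) -> Dict[str, List[ShootingAction]]:
    """Naive round-robin assignment of actions to drones."""
    assignment: Dict[str, List[ShootingAction]] = {d: [] for d in drones}
    for i, action in enumerate(mission.actions):
        assignment[drones[i % len(drones)]].append(action)
    return assignment


mission = ShootingMission(actions=[
    ShootingAction("orbit", "boat", 0.0),
    ShootingAction("chase", "cyclist", 10.0),
    ShootingAction("static", "finish line", 20.0),
])
plan = distribute(mission, ["drone_a", "drone_b"])
```

Each drone would then hand its list of Shooting Actions to its on-board Scheduler, which issues the corresponding low-level commands.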
In the coming years, we will develop the different modules proposed in the architecture and test them in real scenarios.