Use of autonomous uninhabited aerial vehicles safely within mixed air traffic

Kuru, Kaya ORCID: 0000-0002-4279-4166 (2023) Use of autonomous uninhabited aerial vehicles safely within mixed air traffic. In: Global Conference on Electronics, Communications and Networks (GCECN2024), 22-24 April 2024, Tokyo, Japan. (Unpublished)

Official URL: https://avouchconferences.com/2024/electronics-com...

Abstract

The safe and efficient integration of uninhabited aerial vehicle (UAV) traffic management (UTM) systems with air traffic management (ATM) systems, using intelligent autonomous approaches, is an emerging requirement as the number of diverse UAV applications grows rapidly in dense air traffic environments, where swarms must complete multiple complex missions flexibly and simultaneously [1]. Vehicles are becoming increasingly automated, taking on more and more tasks as improving intelligent control systems, enhanced sensor technologies and Artificial Intelligence (AI) techniques move them from one automation level to the next, with full autonomy as the target [2]. Fully autonomous systems are human-out-of-the-loop systems that single-handedly determine the right course of action when given an autonomous task [3]. Autonomous UAVs (A-UAVs) are flying autonomous robots with self-learning and self-decision-making abilities: they execute non-trivial sequences of events with decimetre-level accuracy based on a set of rules, control loops and constraints, following dynamic flight plans that include autonomous take-off and landing, and they are becoming indispensable for accomplishing a wide range of tasks with little or no human in the loop [2] ([4], [5], [6], [7], [8], [9], [10]).

Sensors are the main components of Autonomous Vehicles (AVs), paving the way for autonomous navigation by providing AVs with the ability to perceive the environment through continuous vehicle-environment interaction [11]. Vehicle sensors, through multi-sensor data fusion, feed the main phases of self-driving, i.e., vehicle learning and decision-making, which are instilled with advanced AI [11], [12], [13]. Due to limited sensing capabilities, each autonomous vehicle has only partial information about the larger, partially known and highly dynamic airspace around it. For drones, as safety-critical systems, there is an increasing need for onboard detect & avoid (DAA) technology i) to see, sense or detect conflicting traffic or imminent non-cooperative threats, given their high mobility with multiple degrees of freedom and the complexity of the unstructured environments in which they are deployed, and subsequently ii) to take the appropriate actions to avoid collisions depending upon the level of autonomy [1]. This research analyses how A-UAVs can operate safely within mixed air traffic alongside non-autonomous UAVs and other manned aircraft, using electronic conspicuity (EC) information (e.g., [1]) and human telemanipulators who intervene remotely (e.g., [14]) in the rare difficulties that the new driver, i.e., the AI agent, cannot tackle under exceptional conditions.
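
To illustrate the DAA decision logic described above, the following is a minimal sketch and not the method proposed in the paper: it assumes each cooperative aircraft broadcasts EC state (position and velocity), estimates the closest point of approach (CPA) for every track, and then either continues the mission, performs an autonomous avoidance manoeuvre, or escalates to a remote human operator. All class, function and threshold names are hypothetical.

```python
# Hypothetical sketch of an onboard detect-and-avoid (DAA) decision loop
# driven by electronic conspicuity (EC) broadcasts. Names and thresholds
# are illustrative assumptions, not the paper's implementation.
from dataclasses import dataclass
from enum import Enum
import numpy as np


@dataclass
class Track:
    """State of one aircraft derived from an EC broadcast (local ENU frame, m and m/s)."""
    position: np.ndarray   # shape (3,)
    velocity: np.ndarray   # shape (3,)


class Action(Enum):
    CONTINUE = "continue mission"
    AVOID = "autonomous avoidance manoeuvre"
    HANDOVER = "escalate to remote human operator"


def time_and_distance_at_cpa(own: Track, intruder: Track):
    """Closest point of approach between two straight-line trajectories."""
    rel_p = intruder.position - own.position
    rel_v = intruder.velocity - own.velocity
    speed_sq = float(np.dot(rel_v, rel_v))
    if speed_sq < 1e-9:                       # effectively co-moving
        return 0.0, float(np.linalg.norm(rel_p))
    t_cpa = max(0.0, -float(np.dot(rel_p, rel_v)) / speed_sq)
    d_cpa = float(np.linalg.norm(rel_p + rel_v * t_cpa))
    return t_cpa, d_cpa


def decide(own: Track, ec_tracks: list[Track],
           sep_min: float = 150.0, horizon: float = 30.0,
           handover_dist: float = 50.0) -> Action:
    """Return the most conservative action over all cooperative tracks."""
    action = Action.CONTINUE
    for intruder in ec_tracks:
        t_cpa, d_cpa = time_and_distance_at_cpa(own, intruder)
        if t_cpa <= horizon and d_cpa < handover_dist:
            return Action.HANDOVER            # too close to resolve autonomously
        if t_cpa <= horizon and d_cpa < sep_min:
            action = Action.AVOID             # predicted loss of separation
    return action


if __name__ == "__main__":
    own = Track(np.array([0.0, 0.0, 120.0]), np.array([15.0, 0.0, 0.0]))
    intruder = Track(np.array([400.0, 80.0, 120.0]), np.array([-15.0, 0.0, 0.0]))
    print(decide(own, [intruder]))            # -> Action.AVOID
```

In this sketch the handover branch stands in for the paper's remote human intervention: when the predicted miss distance falls below a level the autonomous agent is assumed able to resolve, control is escalated rather than resolved onboard.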

