Chandrakanth, V., Murthy, V. S. N. and Channappayya, Sumohana S. (2022) UAV-based autonomous detection and tracking of beyond visual range (BVR) non-stationary targets using deep learning. Journal of Real-Time Image Processing, 19 (2), pp. 345-361. ISSN 1861-8200.
Full text not available from this repository.
Abstract
Aerial surveillance and tracking have gained significant traction in recent years for both civilian applications and military reconnaissance. Applications such as disaster analysis, emergency medical response, and pandemic spread analysis have improved significantly with the availability of aerial data. The next big step is to push the system toward autonomous detection and tracking of targets beyond visual range (BVR). Presently, this is done using GPS-based techniques in which the target information is assumed to be precisely known. In situations where such information is unavailable, or if the target of interest is non-stationary, this method is not applicable and currently no alternative exists. In this work, we aim to address this limitation and propose a deep learning-based algorithm for terminal guidance of an aerial vehicle BVR using only bearing information about the target of interest. The algorithm operates in search and track modes. We describe both modes and discuss the challenges associated with this kind of deployment in real time. Since the weight and power requirements of the payload directly translate to the cost of deployment and endurance of aerial vehicles, we have configured a custom lightweight convolutional neural network (CNN) with minimal layers and successfully deployed the system on the Jetson Nano, the smallest GPU available from NVIDIA as of this writing. We evaluated the performance of the proposed algorithm on proprietary and open-source datasets and achieved detection accuracy greater than 98.6% on custom datasets. © 2021, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.
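Note: the abstract does not describe the network architecture itself. As an illustration only, the sketch below shows a minimal lightweight CNN detector of the kind that could fit on a Jetson Nano, written in PyTorch. The layer sizes, the 256x256 input resolution, and the single-box head are assumptions, not the authors' configuration.

# Illustrative sketch only: a minimal lightweight CNN for single-target
# detection, small enough to deploy on a Jetson Nano. All layer sizes and the
# input resolution are assumptions; the paper's actual architecture is not
# given in this abstract.
import torch
import torch.nn as nn


class TinyDetector(nn.Module):
    """Small convolutional backbone followed by a confidence + bounding-box head."""

    def __init__(self, num_classes: int = 1):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.BatchNorm2d(128), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global average pooling keeps the head tiny
        )
        # Head predicts: objectness score, class scores, and a normalized box (cx, cy, w, h).
        self.head = nn.Linear(128, 1 + num_classes + 4)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.backbone(x).flatten(1)
        return self.head(features)


if __name__ == "__main__":
    model = TinyDetector(num_classes=1).eval()
    dummy = torch.randn(1, 3, 256, 256)  # one 256x256 RGB frame (assumed size)
    with torch.no_grad():
        out = model(dummy)
    print(out.shape)  # torch.Size([1, 6])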