Objectives To address the problem of matching unmanned aerial vehicle (UAV) thermal infrared images with optical satellite images, a deep local feature matching method that learns from a heterogeneous landmark dataset is proposed.
Methods First, a generative adversarial network learns the grayscale distributions of thermal infrared and visible images, and a landmark dataset of synthesized thermal infrared images is generated for training the feature extraction model. Second, deep invariant features are learned from the multi-modal landmark dataset using a residual network and an attention mechanism. Finally, correct matching points between image pairs are obtained by matching and purifying the invariant features.
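The final matching-and-purification step can be illustrated with a minimal sketch. It assumes the keypoints and deep descriptors produced by the feature extraction model are already available as arrays; the ratio test and RANSAC homography check below are common stand-ins for the purification strategy and are not taken from the paper itself.

```python
import numpy as np
import cv2

def match_and_purify(kpts_a, desc_a, kpts_b, desc_b, ratio=0.8, ransac_thresh=3.0):
    """Match deep local descriptors between a thermal infrared image and an
    optical satellite image, then purify the putative matches geometrically.

    kpts_*: (N, 2) arrays of keypoint coordinates.
    desc_*: (N, D) float32 descriptor arrays from the feature extraction model.
    Returns the inlier point correspondences (pts_a, pts_b).
    """
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(desc_a.astype(np.float32), desc_b.astype(np.float32), k=2)

    # Ratio test: reject correspondences whose nearest neighbor is ambiguous.
    good = [m for m, n in knn if m.distance < ratio * n.distance]
    if len(good) < 4:
        return np.empty((0, 2)), np.empty((0, 2))

    pts_a = np.float32([kpts_a[m.queryIdx] for m in good])
    pts_b = np.float32([kpts_b[m.trainIdx] for m in good])

    # RANSAC homography estimation removes the remaining outliers,
    # leaving the correct matching points of the image pair.
    H, mask = cv2.findHomography(pts_a, pts_b, cv2.RANSAC, ransac_thresh)
    inliers = mask.ravel().astype(bool)
    return pts_a[inliers], pts_b[inliers]
```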
Results The performance of the proposed method was evaluated experimentally and compared with KAZE, the detect-and-describe network, and deep local features. The results show that the proposed method is more robust to variations in grayscale, texture, overlap rate, and geometry, and achieves higher matching efficiency.
Conclusions The effectiveness of the proposed method is demonstrated through multiple sets of experiments, providing support for UAV visual navigation.