Abstract:
Objectives Multi-source remote sensing images exhibit significant differences arising from sensor type, acquisition time, and illumination conditions, which makes their registration difficult. To address this problem, this paper proposes a multi-source image registration method guided by first-order Gaussian steerable filters.
Methods First, based on the geometric reference information of the images, the overlap between the reference image and the image to be registered is computed in object space. The overlap is partitioned into uniform blocks on the reference image, the corresponding blocks of the image to be registered are located through the rational function model (RFM) and a digital elevation model (DEM), and an affine transformation model is estimated to geometrically correct each block, achieving coarse registration between the local images. Second, for feature detection, an anti-clustering detector based on features from accelerated segment test (FAST), improved with a block-wise uniformity checking strategy, is constructed to obtain a large number of uniformly distributed feature points. For feature description, a set of multi-scale, multi-orientation first-order Gaussian steerable filters is convolved with the image, and the convolution results are pooled to reduce the feature dimensionality, yielding a feature description that is consistent across sources. Finally, feature matching is performed under the nearest-neighbor principle, and high-precision correspondences are obtained by eliminating mismatches; bundle adjustment based on the RFM then refines the rational polynomial coefficients of the image to be registered and geometrically corrects it, achieving fine registration. Illustrative sketches of the main steps follow.
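For the coarse registration step, grid points of a reference block are projected through the RFM and DEM into the image to be registered, and an affine model is fitted to the resulting correspondences. The RFM/DEM projection itself is omitted here; a minimal sketch of the affine fit in Python (function name and point-array layout are illustrative assumptions, not the paper's implementation) is:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src -> dst points.

    src, dst: (N, 2) arrays of corresponding pixel coordinates, N >= 3
    (e.g., reference-block grid points and their RFM/DEM projections).
    Solves [x y 1] @ A = [x' y'] for the 3x2 parameter matrix A, which
    can then be used to resample the block to be registered onto the
    reference grid for coarse registration.
    """
    ones = np.ones((len(src), 1))
    A, *_ = np.linalg.lstsq(np.hstack([src, ones]), dst, rcond=None)
    return A  # shape (3, 2)
```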
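The abstract does not detail the block-wise uniformity checking strategy for feature detection. One common realization, assumed here purely for illustration, detects FAST keypoints with OpenCV's stock detector and caps the number retained per image block so that points do not cluster on strongly textured regions:

```python
import cv2

def uniform_fast(gray, grid=8, per_block=20, threshold=20):
    """Detect FAST keypoints on a uint8 image and enforce a per-block
    quota so the retained points are spread uniformly."""
    detector = cv2.FastFeatureDetector_create(threshold=threshold)
    keypoints = detector.detect(gray, None)
    h, w = gray.shape
    buckets = {}
    for kp in keypoints:
        bx = min(int(kp.pt[0] * grid / w), grid - 1)
        by = min(int(kp.pt[1] * grid / h), grid - 1)
        buckets.setdefault((by, bx), []).append(kp)
    kept = []
    for block in buckets.values():
        block.sort(key=lambda kp: kp.response, reverse=True)
        kept.extend(block[:per_block])  # keep strongest responses
    return kept
```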
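The core of the feature description is the first-order Gaussian steerable filter bank. First-order Gaussian derivatives are steerable: the response at any orientation θ is the linear combination cos(θ)·Gx + sin(θ)·Gy of the two basis derivatives, so only two convolutions per scale are needed. The scale set, orientation count, patch radius, and mean-pooling grid in the sketch below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def steerable_responses(image, sigmas=(1.0, 2.0, 4.0), n_orient=6):
    """Convolve an image with multi-scale, multi-orientation
    first-order Gaussian steerable filters via the two basis
    derivatives per scale."""
    image = np.asarray(image, dtype=float)
    responses = []
    for sigma in sigmas:
        gx = gaussian_filter(image, sigma, order=(0, 1))  # d/dx basis
        gy = gaussian_filter(image, sigma, order=(1, 0))  # d/dy basis
        for k in range(n_orient):
            theta = k * np.pi / n_orient
            responses.append(np.cos(theta) * gx + np.sin(theta) * gy)
    return np.stack(responses)  # (len(sigmas) * n_orient, H, W)

def describe(responses, y, x, radius=16, grid=4):
    """Pool filter responses around (y, x), at least `radius` pixels
    from the border, into a descriptor vector: the patch is split into
    grid x grid cells and the mean absolute response per cell is taken,
    reducing each channel from (2*radius)^2 values to grid^2 values."""
    patch = responses[:, y - radius:y + radius, x - radius:x + radius]
    c, h, w = patch.shape
    cells = patch.reshape(c, grid, h // grid, grid, w // grid)
    desc = np.abs(cells).mean(axis=(2, 4)).ravel()
    return desc / (np.linalg.norm(desc) + 1e-12)  # unit-normalize
```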
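For the matching step, the abstract specifies only the nearest-neighbor principle followed by mismatch elimination; the ratio test below is one commonly used first filter, assumed here for illustration (a geometric consistency check such as RANSAC would typically follow before the RFM-based bundle adjustment):

```python
import numpy as np
from scipy.spatial import cKDTree

def nn_match(desc_ref, desc_tgt, ratio=0.8):
    """Nearest-neighbor descriptor matching with a ratio test.

    desc_ref: (N, D) descriptors from the reference image.
    desc_tgt: (M, D) descriptors from the image to be registered.
    Returns index arrays (into ref and tgt) of unambiguous matches.
    """
    tree = cKDTree(desc_tgt)
    d, idx = tree.query(desc_ref, k=2)   # two nearest neighbors each
    keep = d[:, 0] < ratio * d[:, 1]     # reject ambiguous matches
    return np.flatnonzero(keep), idx[keep, 0]
```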
Results and Conclusions Experiments on multiple pairs of multi-source satellite images show that the registration accuracy of the proposed method is better than 1 pixel on multi-temporal optical and optical-infrared data and better than 1.5 pixels on optical-synthetic aperture radar (SAR) data, while its computational efficiency is more than twice that of existing comparable methods.