Abstract:
Objective: Normalization methods play an important role in the feature preprocessing phase, not only in conventional machine learning but also in contemporary deep learning. Batch Normalization (BN) is highly successful, but its performance depends strongly on the sample size. Consequently, many researchers have tried to improve BN when the sample size is inadequate, yet only by enlarging the sample size within the sample information space itself.
Methods: This paper applies Bayes theory to integrate general information, prior information, and sample information in order to offset the inadequate sample information. In this way, the sample mean and sample variance can be estimated more precisely and more robustly, especially when the sample size is small, so that the normalized features fall more reliably into the non-saturating region of the activation function, which enables the deep learning model to better describe the original feature space.
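The following minimal sketch illustrates the general idea of such a Bayes-style estimate: the batch statistics are shrunk toward prior statistics with a pseudo-sample-size weight, so that small batches still yield stable normalization. The function name, the pseudo-count `n0`, and the prior values are illustrative assumptions, not the paper's actual formulation or notation.

```python
import numpy as np

def bayes_adjusted_batchnorm(x, prior_mean, prior_var, n0=32, eps=1e-5):
    """Normalize a batch using shrinkage estimates of mean and variance.

    Batch evidence is weighted by the batch size n and prior evidence by a
    pseudo-sample-size n0; with large n the estimates reduce to plain BN,
    while small batches lean on the prior (e.g. running or dataset-level
    statistics). All hyperparameters here are illustrative.
    """
    n = x.shape[0]
    batch_mean = x.mean(axis=0)
    batch_var = x.var(axis=0)

    # Posterior-style (shrinkage) estimates of the normalization statistics.
    w = n / (n + n0)
    mean_hat = w * batch_mean + (1.0 - w) * prior_mean
    var_hat = w * batch_var + (1.0 - w) * prior_var

    return (x - mean_hat) / np.sqrt(var_hat + eps)

# Example: even a very small batch (n=4) is normalized stably.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=(4, 8))
out = bayes_adjusted_batchnorm(x, prior_mean=2.0, prior_var=9.0)
print(out.mean(), out.std())
```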
Results: On the NWPU-RESISC45 dataset, the proposed method improves top-1 test classification accuracy by 5.64% over BN. Moreover, with the help of general information and prior information, the proposed method (BABN) is not sensitive to the sample size.
Conclusions: The experimental results show that the proposed method, Bayes Adjoint Batch Normalization (BABN), is feasible and effective, and that it outperforms the Batch Normalization (BN) method and its other variants in remotely sensed image scene recognition.