Selecting Pseudo Supervision for Unsupervised Domain Adaptive SAR Target Classification (preprint)
Research Square; 2022.
Preprint in English | PREPRINT-RESEARCHSQUARE | ID: ppzbmed-10.21203.rs.3.rs-1504089.v1
ABSTRACT
In recent years, deep learning has brought significant progress to the problem of Synthetic Aperture Radar (SAR) target classification. However, SAR image characteristics are highly sensitive to changes in imaging conditions. Inconsistency in imaging parameters (especially the depression angle) leads to a distribution shift between the training and test data and severely degrades classification performance. To address this problem, we propose an unsupervised domain adaptation method based on selective pseudo-labelling for SAR target classification. Our method trains a deep model directly on target-domain data by generating pseudo labels in the target domain. The key idea is to iteratively select valuable samples from the target domain and optimize the classifier. In each iteration, the breaking ties (BT) criterion is adopted to select the samples with the highest relative-confidence scores. In addition, to avoid error accumulation across iterations, class confusion regularization is used to improve the accuracy of the pseudo labels. Our method is compared with state-of-the-art methods, including supervised classification and unsupervised domain adaptation methods, on the moving and stationary target acquisition and recognition (MSTAR) dataset. The experimental results demonstrate that the proposed method achieves better classification performance, especially when the depression angles of the source- and target-domain images differ greatly. Our method also shows its superiority under limited-sample conditions.
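The breaking ties (BT) criterion mentioned in the abstract scores each unlabelled sample by the gap between its top two predicted class probabilities; samples with the largest gap are the most "relatively confident" and are selected for pseudo-labelling. A minimal sketch of that selection step (the function name and the NumPy implementation are illustrative assumptions, not the authors' code):

```python
import numpy as np

def breaking_ties_select(probs, k):
    """Select the k target-domain samples with the highest BT scores.

    probs: (n_samples, n_classes) array of predicted class probabilities.
    The BT score of a sample is p(best class) - p(second-best class);
    a large gap means the classifier is relatively confident, so such
    samples are the safest candidates for pseudo-labelling.
    """
    # Sort each row's probabilities ascending; BT = last minus second-to-last.
    sorted_probs = np.sort(probs, axis=1)
    bt = sorted_probs[:, -1] - sorted_probs[:, -2]
    # Return indices of the k highest BT scores, most confident first.
    return np.argsort(bt)[::-1][:k]

# Toy example: three samples, three classes.
probs = np.array([
    [0.90, 0.05, 0.05],  # large gap -> confidently classified
    [0.40, 0.35, 0.25],  # small gap -> ambiguous, skip this round
    [0.60, 0.30, 0.10],  # moderate gap
])
selected = breaking_ties_select(probs, 2)  # -> indices [0, 2]
```

In the iterative scheme the abstract describes, the samples returned by such a selection would receive their predicted labels as pseudo labels and be added to the training pool for the next round of classifier optimization.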

Full text: Available Collection: Preprints Database: PREPRINT-RESEARCHSQUARE Language: English Year: 2022 Document Type: Preprint

