1.
IEEE Trans Pattern Anal Mach Intell ; 45(5): 6460-6479, 2023 May.
Article in English | MEDLINE | ID: mdl-36251911

ABSTRACT

In many non-stationary environments, machine learning algorithms confront distribution shift. Previous domain adaptation methods have achieved great success, but they lose robustness in noisy environments where source-domain examples are corrupted by label noise, feature noise, or open-set noise. In this paper, we report our attempt toward achieving noise-robust domain adaptation. We first give a theoretical analysis and find that different noises have disparate impacts on the expected target risk. To eliminate the effect of source noises, we propose offline curriculum learning that minimizes a newly defined empirical source risk. We introduce a proxy distribution-based margin discrepancy that gradually decreases the noisy distribution distance to reduce the impact of source noises. We propose an energy estimator that assesses the outlier degree of open-set-noise examples to counteract their harmful influence. We further propose robust parameter learning to mitigate the remaining negative effects and learn domain-invariant feature representations. Finally, we seamlessly combine these components into an adversarial network that jointly optimizes them efficiently. Empirical studies on benchmark datasets and the COVID-19 screening task show that our algorithm substantially outperforms the state of the art, with accuracy improvements of over 10% on some transfer tasks.
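The abstract does not specify the exact form of the energy estimator, but energy-based outlier scoring is commonly computed as the negative log-sum-exp of a classifier's logits: confident in-distribution inputs produce one dominant logit and hence low energy, while flat, uncertain predictions (typical of open-set-noise examples) produce higher energy. A minimal sketch under that assumption, with a hypothetical `energy_score` helper and a temperature parameter T:

```python
import numpy as np

def energy_score(logits, temperature=1.0):
    """Energy-based outlier score: E(x) = -T * logsumexp(f(x) / T).

    Higher energy indicates a more outlier-like (open-set) example;
    this is an illustrative sketch, not the paper's exact estimator.
    """
    z = np.asarray(logits, dtype=float) / temperature
    # Numerically stable log-sum-exp: subtract the max before exponentiating.
    m = z.max(axis=-1, keepdims=True)
    lse = m.squeeze(-1) + np.log(np.exp(z - m).sum(axis=-1))
    return -temperature * lse

# A confident in-distribution prediction yields low (very negative) energy;
# a flat, uncertain prediction yields higher energy.
confident = energy_score([10.0, 0.0, 0.0])
uncertain = energy_score([0.1, 0.0, 0.05])
```

Examples scoring above a chosen energy threshold could then be down-weighted or excluded from the source risk during adaptation.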
