Negative Regularization: Preventing Overfitting to Clean Data in Noisy Datasets

Abstract

Supervised learning requires both input data and labels. Labeling, however, is an expensive task, and when it is automated there is no guarantee that every label is correct. Various methods address this noisy-label problem. Previous works reinforce the gradient direction of clean data while neutralizing the gradient direction of noisy labels. However, if the gradients of clean data are continuously strengthened, the model overfits them and generalization performance degrades. We refine the model's prediction so that the gradient direction of noisy data converges toward that of clean data, and we add a decay term to the regularization to prevent convergence to the noisy labels. In this paper, we show experimentally that strengthening the gradients of clean data while neutralizing those of noisy labels overfits the clean data, and that applying our proposed method prevents this overfitting. We also show improved performance over other state-of-the-art methods. In summary, we propose negative regularization (NR), a regularization scheme for noisy-label environments that prevents overfitting to clean data and improves performance by strengthening the gradients of noisy samples in the direction of their true labels.
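The abstract only summarizes the method; the exact formulation is in the thesis body. As one plausible reading, refining the target probability can be sketched as blending the given (possibly noisy) one-hot label with the model's own prediction, with a decaying blend weight so the model does not lock onto either the noisy label or its own early mistakes. The function names, the blending rule, and the exponential decay schedule below are all illustrative assumptions, not the thesis's actual equations.

```python
import numpy as np

def refine_target(one_hot, pred, alpha):
    """Blend the given (possibly noisy) label with the model's prediction.

    A soft target of this bootstrapping style pulls the gradient for a
    noisy sample toward the direction the model itself predicts; the
    exact refinement rule in the thesis may differ.
    """
    return (1.0 - alpha) * one_hot + alpha * pred

def alpha_with_decay(epoch, alpha_max=0.8, decay=0.05):
    """Hypothetical decay schedule: shrink the blend weight over time
    so the refined target is trusted less in later epochs."""
    return alpha_max * np.exp(-decay * epoch)

# Toy example: a 3-class sample whose given label (class 0) may be noisy,
# while the model strongly favors class 1.
one_hot = np.array([1.0, 0.0, 0.0])
pred    = np.array([0.1, 0.8, 0.1])

target_early = refine_target(one_hot, pred, alpha_with_decay(epoch=0))
target_late  = refine_target(one_hot, pred, alpha_with_decay(epoch=50))
```

Early in training the refined target follows the model's prediction; as the decayed weight shrinks, the target moves back toward the given label, which is one way a decay term could keep the refinement regularized.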

Table of Contents

Ⅰ. Introduction
Ⅱ. Related Work
Ⅲ. Proposed Method
A. Refining Target Probability
B. Refining Target Probability with Decay
C. Negative Regularization
Ⅳ. Experimental Results
A. Datasets
B. Refining Target Probability
C. Refining Target Probability with Decay
D. Overfitting Clean Data
E. Comparison with State-of-the-art Methods
Ⅴ. Conclusion
References
