Boosting Crowdsourced Annotation Accuracy: Small Loss Filtering and Augmentation-Driven Training
Blog Article
Crowdsourcing platforms provide an efficient and cost-effective means of acquiring the extensive labeled data required for supervised learning. However, the labels provided by untrained crowd workers often contain a considerable amount of noise. Although applying ground truth inference algorithms to deduce integrated labels effectively improves label quality, a certain level of noise persists.
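As a point of reference, the simplest ground truth inference algorithm is majority voting over each instance's crowd labels. The article does not say which inference algorithm is used, so the sketch below is only an illustrative baseline:

```python
from collections import Counter

def majority_vote(worker_labels):
    """Infer one integrated label per instance by majority vote.

    worker_labels: a list with one inner list of crowd labels per instance.
    (Hypothetical helper for illustration; real inference algorithms such
    as EM-based methods weight workers by estimated reliability.)
    """
    integrated = []
    for labels in worker_labels:
        label, _count = Counter(labels).most_common(1)[0]
        integrated.append(label)
    return integrated

# Three instances, each labeled by several workers:
print(majority_vote([[1, 1, 0], [0, 0, 0, 1], [1, 0, 1]]))  # -> [1, 0, 1]
```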
To further reduce the noise in crowdsourced labels, this paper introduces a novel Small Loss-based Noise Correction algorithm (SLNC). SLNC first filters the crowdsourced data, exploiting the tendency of neural networks to fit clean samples first, thereby splitting the data into a relatively clean set and a noisy set. It then applies data augmentation techniques to enlarge the clean set and trains a corrector on this augmented set to rectify the labels in the noisy set.
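The small-loss filtering step can be sketched as follows: samples are ranked by their per-sample training loss, and the fraction with the smallest losses is treated as clean. The `clean_fraction` threshold and function name below are assumptions for illustration; the article does not specify how SLNC sets the split point:

```python
def small_loss_split(losses, clean_fraction=0.7):
    """Split sample indices into a relatively clean set and a noisy set.

    losses: per-sample training losses from the network (list of floats).
    clean_fraction: hypothetical hyperparameter giving the share of samples
    treated as clean; samples with the smallest losses are kept as clean,
    since networks tend to fit clean samples before noisy ones.
    """
    order = sorted(range(len(losses)), key=lambda i: losses[i])
    n_clean = int(len(losses) * clean_fraction)
    return order[:n_clean], order[n_clean:]

# Five samples: the three with the smallest losses form the clean set.
losses = [0.1, 2.0, 0.3, 1.5, 0.2]
clean_idx, noisy_idx = small_loss_split(losses, clean_fraction=0.6)
print(clean_idx, noisy_idx)  # -> [0, 4, 2] [3, 1]
```

The clean set would then be augmented and used to train the corrector that relabels the noisy indices.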
SLNC was evaluated on 16 simulated datasets and two real-world datasets. The results indicate that SLNC surpasses the comparative algorithms in the quality of the final labels.