
File(s) not publicly available

Dual-scale correlation analysis for robust multi-label classification

Journal contribution posted on 2024-11-17, 13:58, authored by Kaixiang Wang, Ming Yang, Wanqi Yang, Lei Wang
Noise in the label space is a major challenge in multi-label classification, as noise (both false-negative and false-positive) distorts the distribution of the label space and seriously interferes with the performance of the learned model. Existing methods have begun to address the settings where false-negative noise and false-positive noise appear separately, but classification when the two kinds of noise appear simultaneously remains an unsolved and challenging problem. This paper proposes a novel method, Dual-Scale Correlation Analysis for Robust Multi-label Classification (DCAMC), to address this challenge by effectively handling the simultaneous occurrence of both kinds of noise. The proposed method is based on a dual-scale correlation analysis of samples and consists of two main parts: an anti-noise module and a classification module. In the anti-noise module, we define novel 'leader-labels' and 'rare-labels' based on the manifold assumption under fine-grained and coarse-grained data division, respectively. This anti-noise module handles false-negative noise and false-positive noise simultaneously without the two corrections interfering with each other. In the classification module, we use the training data obtained after the anti-noise process to train the multi-label classifiers: coarse-grained data division for classifier training guarantees the generalization performance of the model, while fine-grained data division ensures effective mining of label correlations. Together, the two modules based on dual-scale data division improve overall classification performance. Our method has been evaluated on existing benchmark datasets, and the experiments demonstrate that it improves over existing methods.
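The abstract describes the anti-noise step only at a high level. The following is a minimal, hypothetical sketch of how a dual-scale (fine-grained and coarse-grained) data division could be used to correct false-negative and false-positive label noise; the choice of KMeans clustering, the cluster counts, the majority-vote 'leader-label' rule, and the 'rare-label' threshold are all assumptions made for illustration and are not taken from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def dual_scale_denoise(X, Y, fine_k=20, coarse_k=4,
                       leader_ratio=0.6, rare_ratio=0.05):
    """Illustrative dual-scale correction of a noisy 0/1 label matrix Y
    (n_samples x n_labels) using features X (n_samples x n_features)."""
    Y_clean = Y.astype(float).copy()

    # Fine-grained division: many small clusters. A label carried by most
    # members of a fine cluster is treated as a "leader-label" and is
    # propagated to members missing it (targets false-negative noise).
    fine = KMeans(n_clusters=fine_k, n_init=10, random_state=0).fit_predict(X)
    for c in np.unique(fine):
        idx = np.where(fine == c)[0]
        freq = Y[idx].mean(axis=0)
        leader_cols = np.where(freq >= leader_ratio)[0]
        Y_clean[np.ix_(idx, leader_cols)] = 1.0

    # Coarse-grained division: a few large clusters. A label that is almost
    # absent in a coarse cluster is treated as a "rare-label" and is removed
    # from its members (targets false-positive noise).
    coarse = KMeans(n_clusters=coarse_k, n_init=10, random_state=0).fit_predict(X)
    for c in np.unique(coarse):
        idx = np.where(coarse == c)[0]
        freq = Y[idx].mean(axis=0)
        rare_cols = np.where(freq <= rare_ratio)[0]
        Y_clean[np.ix_(idx, rare_cols)] = 0.0

    return Y_clean

# Example with synthetic data: 100 samples, 5 features, 8 noisy labels.
X = np.random.rand(100, 5)
Y = (np.random.rand(100, 8) > 0.7).astype(int)
Y_denoised = dual_scale_denoise(X, Y)
```

The denoised label matrix would then be used, as the abstract outlines, to train the downstream multi-label classifiers.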

Funding

National Natural Science Foundation of China (61876087)

Journal title

Applied Intelligence

Language

English
