University of Wollongong

Improving adaboost for classification on small training sample sets with active learning

conference contribution
posted on 2024-11-13, 15:39 authored by Lei Wang, Xuchun Li, Eric Sung
Recently, AdaBoost has been widely used in many computer vision applications and has shown promising results. However, its classification performance is often poor when the training sample set is small. In many situations there are abundant unlabelled samples, but labelling them is costly and time-consuming, so it is desirable to pick a few good samples to be labelled; the key question is how. In this paper, we integrate active learning with AdaBoost to address this problem. The principal idea is to select, as the next sample to label, the unlabelled sample at the minimum distance from the optimal AdaBoost hyperplane derived from the current set of labelled samples. Using the version space concept, we prove that this selection strategy yields the fastest expected learning rate. Experimental results on both artificial and standard databases demonstrate the effectiveness of the proposed method.
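The selection strategy in the abstract can be sketched as margin-based active learning: train AdaBoost on the labelled pool, then query the unlabelled sample whose ensemble score is closest to zero (i.e. nearest the decision boundary). The following is a minimal illustrative sketch, not the authors' implementation; the 1-D decision stumps, the fixed round count, and the use of |score| as a proxy for distance to the AdaBoost hyperplane are assumptions made for brevity.

```python
import math

def train_stump(X, y, w):
    """Pick the weighted-error-minimizing 1-D threshold stump (illustrative weak learner)."""
    best = None
    for thr in sorted(set(X)):
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if (pol if xi >= thr else -pol) != yi)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(X, y, rounds=5):
    """Standard discrete AdaBoost over threshold stumps; returns [(alpha, thr, pol), ...]."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-10)          # avoid log(0) on a perfect stump
        if err >= 0.5:
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thr, pol))
        for i in range(n):             # reweight: boost misclassified samples
            h = pol if X[i] >= thr else -pol
            w[i] *= math.exp(-alpha * y[i] * h)
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def score(ensemble, x):
    """Real-valued ensemble output f(x) = sum_t alpha_t h_t(x); sign gives the class."""
    return sum(a * (p if x >= t else -p) for a, t, p in ensemble)

def pick_next(ensemble, unlabelled):
    """Active-learning query: the sample with minimum |f(x)|, i.e. closest to the boundary."""
    return min(unlabelled, key=lambda x: abs(score(ensemble, x)))
```

In an active-learning loop, `pick_next` would be called after each retraining round, its chosen sample labelled by an oracle and moved into the labelled pool, and the ensemble refit; the paper's contribution is the proof that this minimum-distance query is the fastest-shrinking choice in version-space terms.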

History

Citation

Wang, L., Li, X. & Sung, E. (2004). Improving adaboost for classification on small training sample sets with active learning. Proceedings of Asian Conference on Computer Vision (ACCV) (pp. 1-6).

Pagination

1-6

Language

English

RIS ID

54421
