Degree Name

Doctor of Philosophy


School of Computing and Information Technology


Computer vision aims to produce numerical or symbolic information, e.g., decisions, by acquiring, processing, analyzing and understanding images or other high-dimensional data. In many computer vision contexts, the data are represented by or converted to covariance-based representations, including the covariance descriptor and the sparse inverse covariance estimation (SICE) matrix, because of their desirable properties. While enjoying these beneficial properties, covariance representations also bring challenges. Both the covariance descriptor and the SICE matrix belong to the set of symmetric positive-definite (SPD) matrices, which forms a Riemannian manifold rather than a Euclidean space. As a consequence of this special geometric structure, many learning algorithms developed in Euclidean spaces cannot be directly applied to covariance representations, because they do not take this structure into consideration. At the same time, the increasingly wide application of covariance representations in computer vision tasks creates a pressing need for advanced methods to process and analyze them.
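To make the preceding ideas concrete, the following is a minimal sketch (not the thesis's own implementation) of how a covariance descriptor can be computed from a set of local feature vectors, and how two SPD matrices can be compared with a manifold-aware metric; the log-Euclidean distance is used here purely as one standard choice, and the regularisation constant `eps` is an illustrative assumption.

```python
import numpy as np

def covariance_descriptor(features, eps=1e-6):
    """features: (n, d) array of n local feature vectors.
    Returns a (d, d) SPD covariance descriptor."""
    cov = np.cov(features, rowvar=False)
    # A small ridge keeps the matrix strictly positive-definite.
    return cov + eps * np.eye(features.shape[1])

def log_euclidean_distance(A, B):
    """Distance between two SPD matrices under the log-Euclidean metric:
    the Frobenius norm of the difference of their matrix logarithms."""
    def logm_spd(M):
        w, V = np.linalg.eigh(M)          # eigendecomposition of SPD M
        return (V * np.log(w)) @ V.T      # V diag(log w) V^T
    return np.linalg.norm(logm_spd(A) - logm_spd(B), 'fro')

rng = np.random.default_rng(0)
X = covariance_descriptor(rng.normal(size=(200, 5)))
Y = covariance_descriptor(rng.normal(size=(200, 5)) * 2.0)

# All eigenvalues positive: X lies on the SPD manifold.
assert np.all(np.linalg.eigvalsh(X) > 0)
print(log_euclidean_distance(X, Y))
```

A Euclidean operation such as direct averaging or subtraction of SPD matrices ignores this curved geometry, which is precisely why the manifold-aware methods discussed in this thesis are needed.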

This thesis aims to develop advanced learning methods for covariance representations in computer vision. This goal is pursued from four perspectives. 1) The thesis first proposes a novel kernel function for the covariance descriptor, the discriminative Stein kernel (DSK), which is learned in a supervised manner through eigenvalue adjustment. 2) The thesis then extends the application of covariance representations, finding that the high dimensionality of SICE matrices can adversely affect classification performance; to address this issue, it applies SPD-kernel PCA to extract principal components and obtain a compact, informative representation for classification. 3) To fully exploit the complementary information in SICE matrices computed at multiple sparsity levels, the thesis develops a subject-adaptive integration of SICE matrices for joint representation and classification. 4) Finally, considering the difficulties the covariance descriptor encounters with high feature dimensionality and small sample size, the thesis generalizes the covariance descriptor with a kernel matrix over feature dimensions, thereby extending its fixed form to an open framework of kernel-matrix-based representations.
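The first contribution can be sketched as follows: the Stein (log-det) kernel between SPD matrices, preceded by an eigenvalue adjustment in the spirit of DSK. In the thesis the adjustment is learned from labelled data; here the exponent `alpha` and the kernel width `theta` are fixed purely for illustration, and a uniform power adjustment is one simple assumed form.

```python
import numpy as np

def adjust_eigenvalues(X, alpha):
    """Raise the eigenvalues of an SPD matrix X to the power alpha,
    keeping the eigenvectors fixed (a simple eigenvalue adjustment)."""
    w, V = np.linalg.eigh(X)
    return (V * w**alpha) @ V.T

def stein_kernel(X, Y, theta=1.0):
    """Stein kernel k(X, Y) = exp(-theta * S(X, Y)), where
    S(X, Y) = log det((X + Y)/2) - (1/2) log det(XY)
    is the symmetric Stein divergence between SPD matrices."""
    s = (np.linalg.slogdet((X + Y) / 2)[1]
         - 0.5 * (np.linalg.slogdet(X)[1] + np.linalg.slogdet(Y)[1]))
    return np.exp(-theta * s)

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4)); X = A @ A.T + np.eye(4)   # SPD by construction
B = rng.normal(size=(4, 4)); Y = B @ B.T + np.eye(4)

alpha = 0.8  # assumed adjustment parameter; learned with supervision in DSK
k = stein_kernel(adjust_eigenvalues(X, alpha), adjust_eigenvalues(Y, alpha))
print(k)  # divergence is non-negative, so k lies in (0, 1]
```

Because the Stein divergence of a matrix with itself is zero, the kernel of any SPD matrix with itself equals one, and the adjusted matrices remain SPD since their eigenvalues stay positive.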