University of Wollongong

File(s) not publicly available

Sparse Gaussian conditional random fields on top of recurrent neural networks

conference contribution
posted on 2024-11-16, 04:11, authored by Xishun Wang, Minjie Zhang, Fenghui Ren
Time-series prediction is widely used across disciplines. We propose CoR, a Sparse Gaussian Conditional Random Field (SGCRF) on top of a Recurrent Neural Network (RNN), for problems of this kind. CoR inherits the advantages of both components: it can effectively represent the temporal correlations in the observed data, and it can also learn the structured information of the output. CoR is challenging to train because it is a hybrid of a deep neural network and a densely connected graphical model. Alternating training offers a tractable way to train CoR, and furthermore, an end-to-end training method is proposed to train CoR more efficiently. CoR is evaluated on both synthetic and real-world data, and it shows a significant improvement in performance over state-of-the-art methods.
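To make the described architecture concrete, below is a minimal sketch, assuming a PyTorch-style implementation: an LSTM encodes the observed series into a feature vector h, which parameterises a Gaussian CRF over the output vector via p(y|x) proportional to exp(-y'Lam y - 2(Theta'h)'y), with an L1 penalty inducing the sparse precision structure. All names here (CoRSketch, theta, lambda_raw, sgcrf_loss) and the exact parameterisation are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class CoRSketch(nn.Module):
    # Illustrative sketch, not the authors' code.
    def __init__(self, input_dim, hidden_dim, output_dim):
        super().__init__()
        # RNN encoder captures temporal correlations in the observed series.
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        # SGCRF output layer: Theta couples RNN features to outputs;
        # Lambda is the output precision matrix whose sparsity pattern
        # encodes the structured dependencies among output variables.
        self.theta = nn.Parameter(0.01 * torch.randn(hidden_dim, output_dim))
        self.lambda_raw = nn.Parameter(torch.eye(output_dim))

    def precision(self):
        # Symmetrise and add a small ridge to keep Lambda positive definite.
        lam = 0.5 * (self.lambda_raw + self.lambda_raw.T)
        return lam + 1e-3 * torch.eye(lam.size(0), device=lam.device)

    def forward(self, x):
        # x: (batch, time, input_dim); use the final hidden state as features.
        _, (h, _) = self.rnn(x)
        h = h[-1]                                # (batch, hidden_dim)
        lam = self.precision()
        b = h @ self.theta                       # (batch, output_dim)
        # Mean of the Gaussian CRF p(y|x) ~ exp(-y'Lam y - 2b'y):
        # y* = -Lam^{-1} b, computed with a linear solve.
        y_hat = torch.linalg.solve(lam, -b.T).T
        return y_hat, lam, b

def sgcrf_loss(y, y_hat, lam, b, l1_weight=1e-3):
    # Gaussian-CRF negative log-likelihood up to an additive constant:
    #   y'Lam y + 2 b'y + b'Lam^{-1} b - (1/2) log det(Lam),
    # plus an L1 penalty on Lam that induces the sparse precision structure.
    quad = torch.einsum('bi,ij,bj->b', y, lam, y)
    lin = 2.0 * (b * y).sum(dim=1)
    part = -(b * y_hat).sum(dim=1)   # b'Lam^{-1} b, since y_hat = -Lam^{-1} b
    nll = (quad + lin + part).mean() - 0.5 * torch.logdet(lam)
    return nll + l1_weight * lam.abs().sum()
```

Under this parameterisation, the end-to-end training mentioned in the abstract would correspond to backpropagating sgcrf_loss through the CRF parameters and the RNN jointly, whereas alternating training would update the RNN and the CRF parameters in separate phases.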

Funding

Multi-Agent Solutions for the Development of Self-Organised and Self-Adapted Distributed Energy Generation Systems

Australian Research Council



Citation

Wang, X., Zhang, M. & Ren, F. (2018). Sparse Gaussian conditional random fields on top of recurrent neural networks. 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 4219-4226). Palo Alto, United States: Association for the Advancement of Artificial Intelligence.

Parent title

32nd AAAI Conference on Artificial Intelligence, AAAI 2018

Pagination

4219-4226

Language

English

RIS ID

133049
