HTML: Hierarchical Transformer-based Multi-task Learning for Volatility Prediction
Conference contribution
Posted on 2024-11-14, 11:38; authored by Linyi Yang, Tin Lok James Ng, Barry Smyth, Ruihai Dong

© 2020 ACM. The volatility forecasting task refers to predicting the amount of variability in the price of a financial asset over a given period. It is an important mechanism for evaluating the risk associated with an asset and, as such, is of significant theoretical and practical importance in financial analysis. While classical approaches have framed this task as a time-series prediction problem, using historical pricing as a guide to future risk, recent advances in natural language processing have seen researchers turn to complementary sources of data, such as analyst reports, social media, and even the audio of earnings calls. This paper proposes a novel hierarchical transformer-based multi-task architecture designed to harness the text and audio data from quarterly earnings conference calls to predict future price volatility in the short and long term. We present a comprehensive comparison to a variety of baselines, which demonstrates very significant improvements in prediction accuracy, in the range of 17%-49% relative to the current state of the art. In addition, we describe the results of an ablation study that evaluates the relative contribution of each component of our approach, and of the text and audio data, to prediction accuracy.
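To make the setup concrete, below is a minimal sketch (in PyTorch; not the authors' implementation) of the kind of architecture the abstract describes: per-sentence text embeddings and aligned audio features from an earnings call are fused, contextualised by a call-level Transformer encoder, and passed to one regression head per prediction horizon, with each head trained as a separate task. In this literature the target is typically the average log volatility of daily returns over the following n trading days. All dimensions, horizon values, and the mean-pooling choice here are illustrative assumptions.

```python
import torch
import torch.nn as nn

class HierarchicalMultiTaskModel(nn.Module):
    """Sketch of a hierarchical multi-task volatility model (illustrative only)."""

    def __init__(self, text_dim=768, audio_dim=29, d_model=256,
                 n_heads=4, n_layers=2, horizons=(3, 7, 15, 30)):
        super().__init__()
        # Project concatenated per-sentence text and audio features
        # into a shared model space.
        self.proj = nn.Linear(text_dim + audio_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        # Call-level encoder: contextualises sentences across the whole call.
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # One regression head per horizon: the multi-task component.
        self.heads = nn.ModuleDict(
            {f"vol_{h}d": nn.Linear(d_model, 1) for h in horizons})

    def forward(self, text_emb, audio_feat):
        # text_emb:   (batch, n_sentences, text_dim)  sentence embeddings
        # audio_feat: (batch, n_sentences, audio_dim) aligned audio features
        x = self.proj(torch.cat([text_emb, audio_feat], dim=-1))
        x = self.encoder(x)
        pooled = x.mean(dim=1)  # pool sentences into one call-level vector
        return {name: head(pooled).squeeze(-1)
                for name, head in self.heads.items()}

# Example: a batch of 2 calls, each with 50 sentence-level segments.
model = HierarchicalMultiTaskModel()
out = model(torch.randn(2, 50, 768), torch.randn(2, 50, 29))
# out["vol_3d"] has shape (2,): one short-term volatility estimate per call.
```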
Citation
Yang, L., Ng, T. L. J., Smyth, B., & Dong, R. (2020). HTML: Hierarchical Transformer-based Multi-task Learning for Volatility Prediction. In The Web Conference 2020 - Proceedings of the World Wide Web Conference, WWW 2020 (pp. 441-451).

Parent title
The Web Conference 2020 - Proceedings of the World Wide Web Conference, WWW 2020

Pagination
441-451
Language
English

RIS ID
144002