OALib Journal
ISSN: 2333-9721
A Prediction Method for Institutional Reserve Based on LSTM, Transformer, and LightGBM

DOI: 10.12677/CSA.2024.142025, PP. 249-259

Keywords: Deep Learning, Time Series Forecasting, LSTM, Transformer, LightGBM


Abstract:

Institutional reserve is one of the key indicators for evaluating the stability and solvency of financial institutions. In the context of centralized custody of reserves for third-party payment institutions, accurately predicting changes in payment institution reserves is of significant value for risk management by regulatory authorities. This paper proposes an institutional reserve prediction model based on LSTM, Transformer, and LightGBM. LightGBM, a tree-based model that is fast and accurate on tabular data, is used to select key features from transaction logs; the Transformer's global-context modeling capability is used to capture local features of financial documents; finally, an LSTM captures the long-term dependencies in the combined data. Experimental results show that the proposed model predicts institutional reserves more accurately than the ARMA algorithm, a plain LSTM, and a time-series Transformer model.
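The abstract's three-stage pipeline (tree-based feature selection → global-context attention → recurrent modeling of long-term dependencies) can be sketched minimally in NumPy. All names, shapes, and random weights below are illustrative assumptions, not the authors' implementation: a real system would use a trained LightGBM booster for the importance scores and trained Transformer/LSTM layers instead of the toy untrained ones here.

```python
import numpy as np

rng = np.random.default_rng(0)

def select_features(X, importances, k):
    """Keep the k columns with the highest importance scores
    (stands in for LightGBM's gain-based feature selection)."""
    top = np.argsort(importances)[::-1][:k]
    return X[:, :, top]

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product attention over the time axis
    (the global-context step attributed to the Transformer)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(Q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def lstm_last_hidden(X, Wx, Wh, b):
    """Run a single-layer LSTM over the sequence; return the final hidden state."""
    batch, T, _ = X.shape
    H = Wh.shape[0]
    h = np.zeros((batch, H))
    c = np.zeros((batch, H))
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for t in range(T):
        z = X[:, t] @ Wx + h @ Wh + b              # (batch, 4H) gate pre-activations
        i, f, g, o = np.split(z, 4, axis=1)        # input, forget, cell, output gates
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
    return h

# Toy shapes: 8 sequences, 30 trading days, 12 raw features, keep 6.
batch, T, F, k, H = 8, 30, 12, 6, 16
X = rng.standard_normal((batch, T, F))
importances = rng.random(F)                        # placeholder for LightGBM gains

Xs = select_features(X, importances, k)            # step 1: feature selection
Wq, Wk_, Wv = [rng.standard_normal((k, k)) * 0.1 for _ in range(3)]
Xa = self_attention(Xs, Wq, Wk_, Wv)               # step 2: global context
Wx = rng.standard_normal((k, 4 * H)) * 0.1
Wh = rng.standard_normal((H, 4 * H)) * 0.1
b = np.zeros(4 * H)
h_last = lstm_last_hidden(Xa, Wx, Wh, b)           # step 3: long-term dependencies
w_out = rng.standard_normal(H) * 0.1
y_hat = h_last @ w_out                             # one reserve prediction per sequence
print(y_hat.shape)  # (8,)
```

The sketch only demonstrates the data flow between the three components; the prediction head (`w_out`) and every weight matrix are random stand-ins for trained parameters.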

