基于Transformer的时间序列插补技术研究
Research on Transformer-Based Time Series Imputation Technique

DOI: 10.12677/jisp.2024.132014, PP. 151-162

Keywords: Time Series, Multivariate Time Series, Missing Value Imputation, Transformer Model, Time Series Modeling, Data Completeness, Self-Attention, Neural Network


Abstract:

This article addresses missing value imputation in multivariate time series data, with the goal of improving imputation quality. Time series data record the outcomes of random variables that change over time and are widely used in Internet of Things (IoT) applications. Missing data, however, is a major challenge in time series processing, since most downstream algorithms require complete data for training. This paper reviews imputation methods previously used in time series modeling, improves a Transformer-based imputation model, and validates the effectiveness of the proposed model on multiple datasets. The results can improve the accuracy and practicality of time series prediction and offer practical value for time series analysis in IoT applications and other domains.
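
The abstract gives no implementation details, so the sketch below is only a minimal illustration of the general idea, a Transformer encoder with self-attention applied to imputation of a multivariate time series, not the paper's actual architecture. The class name TransformerImputer, the layer sizes, and the mask-concatenation input scheme are assumptions made for this example.

```python
# Illustrative sketch (not the paper's architecture): a Transformer encoder imputes
# missing entries of a multivariate time series. Missing positions are zero-filled and
# flagged by a binary mask; observed values are kept, model predictions fill the gaps.
import torch
import torch.nn as nn

class TransformerImputer(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # Project [values ; mask] into the model dimension so the network
        # can tell which entries are observed and which are placeholders.
        self.input_proj = nn.Linear(2 * n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.output_proj = nn.Linear(d_model, n_features)

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # x:    (batch, time, n_features), missing entries zero-filled
        # mask: (batch, time, n_features), 1 where observed, 0 where missing
        h = self.input_proj(torch.cat([x * mask, mask], dim=-1))
        h = self.encoder(h)
        x_hat = self.output_proj(h)
        # Keep observed values; use predictions only at missing positions.
        return mask * x + (1 - mask) * x_hat

# Usage: train with a reconstruction loss on artificially masked observed values.
model = TransformerImputer(n_features=8)
x = torch.randn(16, 24, 8)                  # 16 series, 24 time steps, 8 variables
mask = (torch.rand_like(x) > 0.2).float()   # ~20% simulated missingness
imputed = model(x * mask, mask)
loss = ((imputed - x) ** 2 * (1 - mask)).mean()  # error on the held-out (masked) entries
```

Concatenating the observation mask with the zero-filled values lets self-attention distinguish genuinely observed entries from placeholders, and masking out a fraction of observed values during training yields a self-supervised reconstruction objective, a strategy commonly used in attention-based imputation models.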

