This page was last updated on 2024-09-16 06:05:43 UTC
Recommendations for the article *Unified Training of Universal Time Series Forecasting Transformers*
| Title | Authors | Publication Date | Journal/Conference | Citation Count | Highest h-index |
|---|---|---|---|---|---|
| Chronos: Learning the Language of Time Series | Abdul Fatir Ansari, Lorenzo Stella, Caner Turkmen, Xiyuan Zhang, Pedro Mercado, Huibin Shen, Oleksandr Shchur, Syama Sundar Rangapuram, Sebastian Pineda Arango, Shubham Kapoor, Jasper Zschiegner, Danielle C. Maddix, Michael W. Mahoney, Kari Torkkola, Andrew Gordon Wilson, Michael Bohlke-Schneider, Yuyang Wang | 2024-03-12 | ArXiv | 28 | 18 |
| Timer: Generative Pre-trained Transformers Are Large Time Series Models | Yong Liu, Haoran Zhang, Chenyu Li, Xiangdong Huang, Jianmin Wang, Mingsheng Long | 2024-02-04 | DBLP, ArXiv | 4 | 66 |
| A Time Series is Worth 64 Words: Long-term Forecasting with Transformers | Yuqi Nie, Nam H. Nguyen, Phanwadee Sinthong, J. Kalagnanam | 2022-11-27 | ArXiv | 546 | 34 |
| HiMTM: Hierarchical Multi-Scale Masked Time Series Modeling with Self-Distillation for Long-Term Forecasting | Shubao Zhao, Ming Jin, Zhaoxiang Hou, Che-Sheng Yang, Zengxiang Li, Qingsong Wen, Yi Wang | 2024-01-10 | ArXiv | 0 | 6 |
| Two Steps Forward and One Behind: Rethinking Time Series Forecasting with Deep Learning | Riccardo Ughi, Eugenio Lomurno, Matteo Matteucci | 2023-04-10 | DBLP, ArXiv | 1 | 6 |
| Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain | Gerald Woo, Chenghao Liu, Akshat Kumar, Doyen Sahoo | 2023-10-08 | ArXiv | 7 | 22 |
| FAITH: Frequency-domain Attention In Two Horizons for Time Series Forecasting | Ruiqi Li, Maowei Jiang, Kai Wang, Kaiduo Feng, Quangao Liu, Yue Sun, Xiufang Zhou | 2024-05-22 | ArXiv | 0 | 2 |