PatchTST-Fusion

Key designs

🌟 Patching: segmentation of time series into subseries-level patches, which are used as input tokens for Transformer forecasting.

🌟 Cross-channel fusion: dynamic interaction across multiple feature channels to overcome the limitation of channel independence.

🌟 Additive attention: adaptive weighting of channel-wise predictive signals for informative feature fusion.

🌟 Residual refinement: residual correction of the target-channel prediction for stable and accurate forecasting.
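The patching idea above can be illustrated with a minimal, framework-free sketch (the `make_patches` name is hypothetical; the actual implementation operates on batched PyTorch tensors, e.g. via `Tensor.unfold`):

```python
def make_patches(series, patch_len, stride):
    """Split a 1-D series into subseries-level patches.

    Hypothetical helper for illustration only: each returned patch
    corresponds to one input token for the Transformer.
    """
    n = len(series)
    if n < patch_len:
        raise ValueError("series shorter than patch length")
    # slide a window of length patch_len over the series with the given stride
    return [series[i:i + patch_len] for i in range(0, n - patch_len + 1, stride)]

# a 12-step series with patch length 4 and stride 2 yields 5 overlapping patches
patches = make_patches(list(range(12)), patch_len=4, stride=2)
```

With overlapping patches (stride < patch length), adjacent tokens share context, which is the standard PatchTST tokenization trade-off between sequence length and local detail.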

Model Structure

Results

Baselines Comparison

We compare against two representative baseline models: (1) GARCH(1,1) and (2) LSTM.

Statistical Results

Compared with representative baselines, PatchTST-Fusion achieves the best overall statistical performance, demonstrating stronger predictive accuracy and robustness in high-frequency realized volatility forecasting.

[figure: statistic_result]

Economic Results

PatchTST-Fusion also shows superior economic performance, indicating that its forecasts are effective in practical finance-oriented evaluation settings.

[figure: economic_result]

Ablation Study

To verify the effectiveness of PatchTST-Fusion, we perform ablation studies on both model design and feature selection, along with an attribution analysis of the learned attention weights.

Model-Level Ablation

We remove key modules to assess their contributions.

[figure: model-level ablation results]

Feature-Level Ablation

We retrain the model using only the top ten features selected by additive attention scores. The strong performance of this reduced feature set validates the effectiveness of the learned feature selection mechanism.
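The top-ten selection described above amounts to ranking feature channels by their learned additive-attention weight. A minimal sketch (hypothetical `top_k_features` helper; plain Python scores stand in for the learned weight tensor):

```python
def top_k_features(attn_weights, k=10):
    """Return indices of the k highest-weighted feature channels.

    attn_weights: per-channel attention scores (hypothetical; in
    practice these would be averaged over the evaluation set).
    """
    ranked = sorted(range(len(attn_weights)),
                    key=lambda i: attn_weights[i],
                    reverse=True)
    return ranked[:k]
```

The returned indices identify the reduced feature subset used when retraining the model.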

[figure: top10-feature]

Attribution Analysis

We analyze the attention weights to investigate the model’s feature preferences and assess whether they are aligned with economic theory.

[figure: attribution analysis]

Getting Started

  1. Install requirements: pip install -r requirements.txt
  2. Download the data. You can download the dataset from the original source and place it under the ./data directory before training or evaluation.
  3. All training-related scripts are provided under ./scripts/PatchTST-Fusion/. Before running a script, please make sure that:
    • the task setting (M, S, MS, or MD) is correctly specified;
    • the input feature dimension equals the ENC_IN param;
    • the first 4 columns are non-numeric features;
    • the DATA_PATH param is correctly configured.
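As a quick sanity check for the ENC_IN and non-numeric-column points above, you can count the feature columns in your dataset header. A minimal sketch (hypothetical helper; assumes the CSV layout described in the checklist):

```python
import csv
from io import StringIO

def count_feature_columns(csv_text, non_numeric_cols=4):
    """Count numeric feature columns in a CSV header.

    Assumes, per the checklist, that the first `non_numeric_cols`
    columns are non-numeric; the remainder should equal ENC_IN.
    """
    header = next(csv.reader(StringIO(csv_text)))
    return len(header) - non_numeric_cols
```

In practice you would pass the contents of your dataset file, e.g. `count_feature_columns(open("./data/your_dataset.csv").read())` (path hypothetical), and compare the result against the ENC_IN param in the script.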

You can launch training with:

sh ./scripts/PatchTST-Fusion/run_PatchTST-Fusion.sh

Once training is done, open ./result.txt to see the results.

Acknowledgement

Data Source

The data that support the findings of this study were obtained from the RESSET Database (www.resset.com). The authors gratefully acknowledge the RESSET Database for data support.

Code Base

We greatly appreciate the following GitHub repositories for their valuable code bases:

https://github.com/yuqinie98/PatchTST

https://github.com/ts-kim/RevIN

https://github.com/thuml/Time-Series-Library

https://github.com/Thinklab-SJTU/Crossformer

https://github.com/vivva/DLinear

https://github.com/thuml/iTransformer

Contact

If you have any questions or concerns, feel free to contact us at isyaozong.zhang@gmail.com or submit an issue.
