Rundong Xue, Xiangmin Han*, Hao Hu, Zeyu Zhang, Shaoyi Du*, Yue Gao
Accepted by MICCAI 2025
Figure 1. The framework of the proposed LHDFormer.
Abstract - Dynamic functional brain network analysis using rs-fMRI has emerged as a powerful approach to understanding brain disorders. However, current methods predominantly focus on pairwise brain region interactions, neglecting critical high-order dependencies and time-varying communication mechanisms. To address these limitations, we propose the Long-Range High-Order Dependency Transformer (LHDFormer), a neurophysiologically inspired framework that integrates multiscale long-range dependencies with time-varying connectivity patterns. Specifically, we present a biased random walk sampling strategy with NeuroWalk kernel-guided transfer probabilities that dynamically simulate multi-step information loss through a
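To make the biased random walk sampling mentioned in the abstract concrete, here is a minimal sketch. The exponential kernel, its decay parameter, and the walk shapes are illustrative assumptions, not the paper's NeuroWalk kernel or its learned transfer probabilities:

```python
# Illustrative biased random-walk sampling over a functional connectivity
# matrix. The exponential kernel below is a stand-in assumption, NOT the
# paper's NeuroWalk kernel.
import numpy as np

def transition_probs(conn, decay=0.5):
    """Map a non-negative connectivity matrix to row-stochastic transition
    probabilities using a simple exponential kernel bias (assumed form)."""
    kernel = np.exp(decay * conn)
    np.fill_diagonal(kernel, 0.0)  # forbid self-loops
    return kernel / kernel.sum(axis=1, keepdims=True)

def sample_walks(conn, walk_len=8, n_walks=4, seed=0):
    """Sample `n_walks` biased random walks of length `walk_len` starting
    from every node (brain region)."""
    rng = np.random.default_rng(seed)
    probs = transition_probs(conn)
    n = conn.shape[0]
    walks = np.empty((n, n_walks, walk_len), dtype=np.int64)
    for start in range(n):
        for w in range(n_walks):
            node = start
            for t in range(walk_len):
                node = rng.choice(n, p=probs[node])  # biased one-step move
                walks[start, w, t] = node
    return walks

# Toy example: 10 regions with a random symmetric non-negative connectivity.
rng = np.random.default_rng(42)
conn = np.abs(rng.standard_normal((10, 10)))
conn = (conn + conn.T) / 2.0
print(sample_walks(conn).shape)  # -> (10, 4, 8)
```

Multi-step walks like these give each region access to nodes several hops away, which is the intuition behind modeling long-range, high-order dependencies rather than only pairwise edges.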
- python=3.9
- cudatoolkit=11.3
- torchvision=0.13.1
- pytorch=1.12.1
- torchaudio=0.12.1
- wandb=0.13.1
- scikit-learn=1.1.1
- pandas=1.4.3
- hydra-core=1.2.0
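The pins above read like the dependencies section of a conda environment file. A minimal sketch of such a file is shown below; the environment name, channels, and the conda/pip split are assumptions (the repo may ship its own file). Create the environment with `conda env create -f environment.yml`.

```yaml
# environment.yml -- hypothetical assembly of the pins listed above;
# the environment name and channel choices are assumptions.
name: lhdformer
channels:
  - pytorch
  - conda-forge
dependencies:
  - python=3.9
  - cudatoolkit=11.3
  - pytorch=1.12.1
  - torchvision=0.13.1
  - torchaudio=0.12.1
  - scikit-learn=1.1.1
  - pandas=1.4.3
  - pip
  - pip:
      - wandb==0.13.1
      - hydra-core==1.2.0
```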
Download the ABIDE dataset from here.
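If the link is unavailable, preprocessed ABIDE rs-fMRI data can also be fetched programmatically via nilearn. This is an alternative route, and the pipeline/derivative choices below are assumptions that may differ from the preprocessing this repo expects:

```python
# Hedged alternative for obtaining preprocessed ABIDE data via nilearn;
# pipeline and derivative choices are assumptions.
from nilearn.datasets import fetch_abide_pcp

abide = fetch_abide_pcp(
    data_dir="./data/abide",     # hypothetical download directory
    n_subjects=5,                # small subset for a quick smoke test
    pipeline="cpac",
    derivatives=["rois_cc200"],  # CC200 atlas ROI time series
)
print(abide.keys())
```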
Run the following command to train the model.
sh main.sh
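Since hydra-core is pinned in the dependencies, main.sh presumably wraps a Hydra-configured Python entry point, so hyperparameters can likely be overridden with Hydra's key=value syntax. The entry-point name and config keys below are hypothetical, not the repo's actual interface:

```sh
# Hypothetical Hydra-style overrides (entry point and keys are assumptions):
python main.py dataset=ABIDE training.lr=1e-4
```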
If you find this work useful, please cite:

@inproceedings{xue2025adaptive,
  title={Adaptive Embedding for Long-Range High-Order Dependencies via Time-Varying Transformer on fMRI},
  author={Xue, Rundong and Han, Xiangmin and Hu, Hao and Zhang, Zeyu and Du, Shaoyi and Gao, Yue},
  booktitle={International Conference on Medical Image Computing and Computer-Assisted Intervention},
  pages={46--55},
  year={2025},
  organization={Springer}
}

The source code is free for research and educational use only. Any commercial use requires formal permission in advance.
This repo benefits from BNT and ALTER. We thank the authors for their wonderful work.
