I am testing this on a Nova Carter.
In the setup step, vila_setup.sh installs FlashAttention2 using a wheel built for the x86_64 platform:
# Install FlashAttention2
python -m pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.8/flash_attn-2.5.8+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
How can I install it on Jetson?
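For context, I assume the wheel fails on Jetson because its filename encodes the target platform (cp310 / linux_x86_64), while Jetson is an aarch64 machine. This quick check (my own diagnostic, not part of vila_setup.sh) shows the mismatch:

```shell
# The wheel name ends in linux_x86_64, so pip will refuse it on any
# other architecture. On a Jetson, uname reports aarch64; on a desktop
# workstation it reports x86_64.
uname -m
# Python's own platform tag must also match the wheel's tag:
python3 -c "import sysconfig; print(sysconfig.get_platform())"
```

Since no official aarch64 wheel is published in that release, I assume the alternative is building from source on the Jetson itself, e.g. `pip install flash-attn --no-build-isolation` (possibly with `MAX_JOBS` set low to avoid running out of memory during compilation), but I am not sure this is the recommended path.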