This repository is the project page for the paper *Towards Grand Unification of Object Tracking*.
- Unicorn is accepted to ECCV 2022 as an oral presentation!
- Unicorn is the first to demonstrate grand unification across four object-tracking tasks.
- Unicorn achieves strong performance on eight tracking benchmarks.
 
The object-tracking field mainly consists of four sub-tasks: Single Object Tracking (SOT), Multiple Object Tracking (MOT), Video Object Segmentation (VOS), and Multi-Object Tracking and Segmentation (MOTS). Most previous approaches are developed for only one of these sub-tasks, or a subset of them.
For the first time, Unicorn accomplishes the grand unification of the network architecture and the learning paradigm across all four tracking tasks. Moreover, Unicorn sets new state-of-the-art performance on many challenging tracking benchmarks while using the same model parameters.
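To make the "same network, same parameters" idea concrete, here is a minimal, purely hypothetical PyTorch sketch. `UnifiedTracker` and its layers are illustrative placeholders, not the repository's actual classes (see train.md and test.md for the real model and commands); the point is only that the task determines how reference targets are specified and how outputs are decoded, never which weights are loaded.

```python
# Illustrative sketch only -- NOT the repository's API.
import torch
import torch.nn as nn


class UnifiedTracker(nn.Module):
    """Hypothetical stand-in for a tracker shared across SOT/MOT/VOS/MOTS."""

    def __init__(self, feat_dim: int = 256):
        super().__init__()
        # One backbone/embedding shared by every task.
        self.backbone = nn.Conv2d(3, feat_dim, kernel_size=3, padding=1)
        # One head producing a per-pixel response map.
        self.head = nn.Conv2d(feat_dim, 1, kernel_size=1)

    def forward(self, reference: torch.Tensor, current: torch.Tensor) -> torch.Tensor:
        # Both frames pass through the *same* backbone; whether the reference
        # targets are boxes or masks, one object or many, is decided outside
        # the network and does not change the parameters.
        ref_feat = self.backbone(reference)
        cur_feat = self.backbone(current)
        return self.head(cur_feat + ref_feat)


# One set of weights (e.g. loaded once from a single checkpoint) serves all tasks.
model = UnifiedTracker()
response = model(torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64))
print(response.shape)  # torch.Size([1, 1, 64, 64])
```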
 
This repository supports the following tasks:
Image-level
- Object Detection
- Instance Segmentation

Video-level
- Single Object Tracking (SOT)
- Multiple Object Tracking (MOT)
- Video Object Segmentation (VOS)
- Multi-Object Tracking and Segmentation (MOTS)
 
Unicorn conquers four tracking tasks (SOT, MOT, VOS, MOTS) using the same network with the same parameters.
Demo video: video_demo_unicorn.mp4
- Installation: Please refer to install.md for more details.
- Data preparation: Please refer to data.md for more details.
- Training: Please refer to train.md for more details.
- Testing: Please refer to test.md for more details.
- Model zoo: Please refer to model_zoo.md for more details.
 
If you find Unicorn useful in your research, please consider citing:
@inproceedings{unicorn,
  title={Towards Grand Unification of Object Tracking},
  author={Yan, Bin and Jiang, Yi and Sun, Peize and Wang, Dong and Yuan, Zehuan and Luo, Ping and Lu, Huchuan},
  booktitle={ECCV},
  year={2022}
}

- Thanks to YOLOX and CondInst for providing strong baselines for object detection and instance segmentation.
- Thanks to STARK and PyTracking for providing useful inference and evaluation toolkits for SOT and VOS.
- Thanks to ByteTrack, QDTrack, and PCAN for providing useful data-processing scripts and evaluation code for MOT and MOTS.
 






