ATI: Any Trajectory Instruction for Controllable Video Generation

Angtian Wang, Haibin Huang, Jacob Zhiyuan Fang, Yiding Yang, Chongyang Ma
Intelligent Creation Team, ByteDance

This repository is for Wan2.1 ATI (Any Trajectory Instruction for Controllable Video Generation), a trajectory-based motion control framework that unifies object, local, and camera movements in video generation. It is based on the Wan2.1 official implementation. Code: https://github.com/bytedance/ATI
Quick Start

Features
- A trajectory-based motion control framework for video generation.
- Unifies object, local, and camera movements.
- Based on the Wan2.1 official implementation.
Installation

ATI requires the same environment as the official Wan2.1. Follow the instructions in INSTALL.md (Wan2.1).

Clone the repository:

```sh
git clone https://github.com/bytedance/ATI.git
cd ATI
```

Install packages:

```sh
pip install .
```
First, download the original 14B Wan2.1 model:

```sh
huggingface-cli download Wan-AI/Wan2.1-I2V-14B-480P --local-dir ./Wan2.1-I2V-14B-480P
```

Then, download the ATI-Wan model from our Hugging Face repo:

```sh
huggingface-cli download bytedance-research/ATI --local-dir ./Wan2.1-ATI-14B-480P
```

Finally, copy the VAE, T5, and other miscellaneous checkpoints from the original Wan2.1 folder to the ATI checkpoint location:

```sh
cp ./Wan2.1-I2V-14B-480P/Wan2.1_VAE.pth ./Wan2.1-ATI-14B-480P/
cp ./Wan2.1-I2V-14B-480P/models_t5_umt5-xxl-enc-bf16.pth ./Wan2.1-ATI-14B-480P/
cp ./Wan2.1-I2V-14B-480P/models_clip_open-clip-xlm-roberta-large-vit-huge-14.pth ./Wan2.1-ATI-14B-480P/
cp -r ./Wan2.1-I2V-14B-480P/xlm-roberta-large ./Wan2.1-ATI-14B-480P/
cp -r ./Wan2.1-I2V-14B-480P/google ./Wan2.1-ATI-14B-480P/
```
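As a quick sanity check (not part of the repository), a short Python sketch can confirm that the copied files are in place before running inference; the expected names are taken directly from the copy commands above:

```python
import os

# Entries the ATI checkpoint folder should contain after the copy step
# (file and directory names taken from the cp commands above).
EXPECTED = [
    "Wan2.1_VAE.pth",
    "models_t5_umt5-xxl-enc-bf16.pth",
    "models_clip_open-clip-xlm-roberta-large-vit-huge-14.pth",
    "xlm-roberta-large",
    "google",
]

def missing_checkpoint_files(ckpt_dir):
    """Return the expected entries that are absent from ckpt_dir."""
    return [name for name in EXPECTED
            if not os.path.exists(os.path.join(ckpt_dir, name))]

if __name__ == "__main__":
    missing = missing_checkpoint_files("./Wan2.1-ATI-14B-480P")
    if missing:
        print("Missing:", missing)
    else:
        print("All required checkpoint files are in place.")
```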
Usage Examples

Basic Usage

We provide a demo script to run ATI:

```sh
bash run_example.sh -p examples/test.yaml -c ./Wan2.1-ATI-14B-480P -o samples
```
where:
- `-p` is the path to the config file,
- `-c` is the path to the checkpoint,
- `-o` is the path to the output directory,
- `-g` sets the number of GPUs to use (if unspecified, all available GPUs are used; if `1` is given, the script runs in single-process mode).
Once finished, you can expect to find:
- `samples/outputs`: the raw output videos.
- `samples/images_tracks`: the input image together with the user-specified trajectories.
- `samples/outputs_vis`: the output videos with the user-specified trajectories superimposed.
Expected results: each input image and trajectory is paired with a generated video that has the trajectory superimposed (see the example media in the repository).
Advanced Usage

Create Your Own Trajectory

We provide an interactive tool that allows users to draw and edit trajectories on their images:

- First, run:

```sh
cd tools/trajectory_editor
python3 app.py
```

  Then open localhost:5000 in your browser. Note that if you run the editor on a remote server, you need to replace localhost with the server's IP address.
- In the interface that opens, click Choose File to open a local image.
- Available trajectory functions:

  a. Free Trajectory: click, then drag the mouse directly on the image.

  b. Circular (Camera Control):
     - Place a circle on the image, then drag to set its size for frame 0.
     - Place a few (3–4 recommended) track points on the circle.
     - Drag the radius control to achieve zoom-in/zoom-out effects.

  c. Static Point: a point that remains stationary over time.

  Note: pay attention to the progress bar in the box to control motion speed.
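To make the circular camera control concrete, here is a sketch of how such a trajectory could be represented: each track is a per-frame (x, y) position, and points placed on a circle whose radius grows or shrinks over time yield a zoom-in or zoom-out effect. The function name, the frame count, and the linear radius schedule are illustrative assumptions, not the editor's actual format:

```python
import math

def circular_zoom_tracks(cx, cy, r0, r1, num_points=4, num_frames=81):
    """Per-frame (x, y) positions for points placed on a circle centered
    at (cx, cy), with the radius interpolating linearly from r0 (frame 0)
    to r1 (last frame): r1 > r0 zooms out the scene, r1 < r0 zooms in."""
    tracks = []
    for i in range(num_points):
        angle = 2 * math.pi * i / num_points  # spread points evenly
        track = []
        for f in range(num_frames):
            t = f / (num_frames - 1)          # 0 at frame 0, 1 at the end
            r = r0 + (r1 - r0) * t            # linear radius schedule
            track.append((cx + r * math.cos(angle),
                          cy + r * math.sin(angle)))
        tracks.append(track)
    return tracks
```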

- Trajectory Editing: select a trajectory in the editor panel, then delete, edit, or copy it. In edit mode, drag the trajectory directly on the image. The selected trajectory is highlighted in color.

- Camera Pan Control: enter a horizontal (X) or vertical (Y) speed in pixels per frame. Positive X moves right; negative X moves left. Positive Y moves down; negative Y moves up. Click Add to Selected to apply the pan to the current trajectory, or Add to All to apply it to all trajectories. The selected points gain a constant pan motion on top of their existing movement.
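The pan behavior described above amounts to adding a constant per-frame offset on top of each point's existing movement. A minimal Python sketch, where `apply_camera_pan` is a hypothetical helper and not part of the editor's code:

```python
def apply_camera_pan(track, vx, vy):
    """Overlay a constant pan on an existing trajectory.

    track: list of (x, y) positions, one per frame.
    vx, vy: pan speed in pixels per frame (positive vx moves right,
    positive vy moves down). Frame 0 is unchanged; by frame f the
    accumulated offset is (vx * f, vy * f)."""
    return [(x + vx * f, y + vy * f) for f, (x, y) in enumerate(track)]
```

For example, applying a pan of 2 px/frame rightward and 1 px/frame upward to a diagonal track shifts each frame's point by the accumulated offset while preserving the original motion underneath.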

- Important: after editing, click Store Tracks to save. Each image (not each trajectory) must be saved separately after drawing all trajectories.

- Once all edits are complete, locate the videos_example folder in the Trajectory Editor.
License

This project is licensed under the Apache-2.0 license.
Citation

Please cite our paper if you find our work useful:

```bibtex
@article{wang2025ati,
  title={{ATI}: Any Trajectory Instruction for Controllable Video Generation},
  author={Wang, Angtian and Huang, Haibin and Fang, Jacob Zhiyuan and Yang, Yiding and Ma, Chongyang},
  journal={arXiv preprint},
  volume={arXiv:2505.22944},
  year={2025}
}
```
Information Table

| Property | Details |
|---|---|
| Model Type | Wan2.1 ATI (Any Trajectory Instruction for Controllable Video Generation) |
| Training Data | Not provided in the original README |