Our implementation is based on the PointFlow implementation published here. It requires the following dependencies:
- Python 3.6
- CUDA 10.0
- G++ or GCC 5
- PyTorch (the code is tested with version 1.0.1)
- torchdiffeq
The following is the suggested way to install these dependencies:
# Create a new conda environment
conda create -n HyperFlow python=3.6
conda activate HyperFlow
# Install pytorch (please refer to the commands on the official website)
conda install pytorch=1.0.1 torchvision cudatoolkit=10.0 -c pytorch -y
# Install other dependencies such as torchdiffeq, structural losses, etc.
./install.sh
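Optionally, you can verify from the activated environment that the expected PyTorch and CUDA versions are visible:
# Optional check: print the PyTorch version, the CUDA version it was built with, and whether a GPU is available
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"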
The point clouds are uniformly sampled from meshes in the ShapeNetCore dataset (version 2) and use the official split. Please use this link to download the ShapeNet point clouds.
The point clouds should be placed in the data directory.
mv ShapeNetCore.v2.PC15k.zip data/
cd data
unzip ShapeNetCore.v2.PC15k.zip
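As an optional sanity check, you can list the extracted contents; the folder name below is only assumed from the archive name and may differ:
# Optional check: the extracted point clouds are expected under data/ShapeNetCore.v2.PC15k/
ls ShapeNetCore.v2.PC15k | head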
Example of a training script:
# Training settings for the single airplane class. For other classes, change the --cates parameter (see the sketch after this block).
./train.sh
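To train on another category, the --cates parameter mentioned in the comment above has to be changed inside train.sh. The following is only a sketch: it assumes train.sh contains a literal "--cates airplane" argument, which may not match the actual script contents.
# Hypothetical sketch: switch the training category from airplane to car by editing train.sh
sed -i 's/--cates airplane/--cates car/' train.sh
./train.sh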
Pretrained models are located in the pretrained_model folder.
# Evaluate the generative performance of HyperFlow trained on the airplane, car, and chair categories for various variances (a variance of 0 refers to the mesh representation).
CUDA_VISIBLE_DEVICES=0 ./run_test_airplane.sh
CUDA_VISIBLE_DEVICES=0 ./run_test_car.sh
CUDA_VISIBLE_DEVICES=0 ./run_test_chair.sh
The demo relies on Open3D. The following is the suggested way to install it:
conda install -c open3d-admin open3d
The demo will sample shapes from a pre-trained model, save those shapes under the supplementary folder, and visualize meshes, point clouds, and raw data for point clouds. Meshes are generated with various radius sizes. There are 6 demonstration options, controlled by the --demo_mode argument:
0 - visualizing the steps for transforming a surface into a mesh
1 - visualizing interpolations between two meshes
2 - visualizing generated meshes
3 - visualizing the steps for transforming samples from the logNormal distribution into a point cloud
4 - visualizing interpolations between two point clouds
5 - visualizing point clouds generated from the logNormal distribution
Once this dependency is in place, you can run the demo for the pre-trained airplane model with the following script:
CUDA_VISIBLE_DEVICES=0 ./run_demo_airplane.sh
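The demo mode is selected through the --demo_mode argument described above. The following is only a sketch: it assumes run_demo_airplane.sh contains a literal "--demo_mode <number>" argument, which may not match the actual script contents.
# Hypothetical sketch: switch the airplane demo to mesh interpolation (mode 1) by editing the script
sed -i 's/--demo_mode [0-9]/--demo_mode 1/' run_demo_airplane.sh
CUDA_VISIBLE_DEVICES=0 ./run_demo_airplane.sh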