maciejzieba/HyperFlow

HyperFlow

Base code

Our implementation builds on the PointFlow implementation published here.

Dependencies

  • Python 3.6
  • CUDA 10.0
  • G++ or GCC 5
  • PyTorch (code tested with version 1.0.1)
  • torchdiffeq

Following is the suggested way to install these dependencies:

# Create a new conda environment
conda create -n HyperFlow python=3.6
conda activate HyperFlow

# Install PyTorch (please refer to the command on the official website)
conda install pytorch=1.0.1 torchvision cudatoolkit=10.0 -c pytorch -y

# Install other dependencies such as torchdiffeq, structural losses, etc.
./install.sh

Dataset

The point clouds are uniformly sampled from meshes in the ShapeNetCore dataset (version 2) and follow the official split. Please use this link to download the ShapeNet point clouds. The point clouds should be placed in the data directory.

mv ShapeNetCore.v2.PC15k.zip data/
cd data
unzip ShapeNetCore.v2.PC15k.zip
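For reference, "uniformly sampled from meshes" means area-weighted sampling over the mesh triangles. The following numpy sketch illustrates the standard technique (area-weighted triangle selection plus uniform barycentric coordinates); it is not the script used to build the dataset, and the tetrahedron at the end is only illustrative:

```python
import numpy as np

def sample_mesh_uniform(vertices, faces, n_points, rng=None):
    """Uniformly sample n_points from a triangle mesh surface.

    vertices: (V, 3) float array, faces: (F, 3) int array.
    """
    rng = np.random.default_rng(rng)
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    # Triangle areas via the cross product, used as sampling weights
    areas = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1)
    tri = rng.choice(len(faces), size=n_points, p=areas / areas.sum())
    # Uniform barycentric coordinates via the square-root trick
    u, v = rng.random(n_points), rng.random(n_points)
    su = np.sqrt(u)
    b0, b1, b2 = 1.0 - su, su * (1.0 - v), su * v
    return (b0[:, None] * v0[tri] + b1[:, None] * v1[tri]
            + b2[:, None] * v2[tri])

# Example: sample 15k points from a unit tetrahedron
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
faces = np.array([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])
pc = sample_mesh_uniform(verts, faces, 15000, rng=0)
print(pc.shape)  # (15000, 3)
```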

Training

Example of training script:

# Training setting for the single airplane class. For other classes, just change the --cates parameter
./train.sh 

Pre-trained model and test

Pretrained models are located in the pretrained_model folder.

# Evaluate the generative performance of HyperFlow trained on the airplane, car, and chair categories for various variances (variance equal to 0 refers to the mesh representation).
CUDA_VISIBLE_DEVICES=0 ./run_test_airplane.sh
CUDA_VISIBLE_DEVICES=0 ./run_test_car.sh
CUDA_VISIBLE_DEVICES=0 ./run_test_chair.sh
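Generative evaluation of point clouds in the PointFlow-based codebase is built on compiled structural losses such as Chamfer distance. As a rough illustration of what that metric computes (this numpy sketch is not the repository's CUDA implementation):

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric Chamfer distance between point sets a (N, 3) and b (M, 3)."""
    # Pairwise squared distances, shape (N, M)
    d = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    # Average nearest-neighbor distance in both directions
    return d.min(axis=1).mean() + d.min(axis=0).mean()

x = np.zeros((4, 3))
print(chamfer_distance(x, x))  # 0.0
```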

Demo

The demo relies on Open3D. The following is the suggested way to install it:

conda install -c open3d-admin open3d 

The demo will sample shapes from a pre-trained model, save those shapes under the supplementary folder, and visualize meshes, point clouds, and raw data for point clouds. Meshes are generated with various radius sizes. There 6 demonstration options controlled by --demo_mode argument:

0 - visualizing the steps for transforming a surface into a mesh

1 - visualizing interpolations between two meshes

2 - visualizing generated meshes

3 - visualizing the steps for transforming samples from the logNormal distribution into a point cloud

4 - visualizing interpolations between two point clouds

5 - visualizing point clouds generated from the logNormal distribution
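Modes 3 and 5 start from samples drawn from a log-normal distribution over sphere directions. A hedged numpy sketch of what such sampling can look like (the parameter values and function name are illustrative, not taken from the repository):

```python
import numpy as np

def sample_lognormal_sphere(n, mu=0.0, sigma=0.1, rng=None):
    """Sample n 3D points: uniform unit directions scaled by log-normal radii."""
    rng = np.random.default_rng(rng)
    # Uniform directions on the unit sphere via normalized Gaussians
    d = rng.normal(size=(n, 3))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    # Log-normal radii concentrate the samples around a spherical shell
    r = rng.lognormal(mean=mu, sigma=sigma, size=n)
    return d * r[:, None]

pts = sample_lognormal_sphere(2048, rng=0)
print(pts.shape)  # (2048, 3)
```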

Once this dependency is in place, you can use the following script to run the demo for the pre-trained airplane model:

CUDA_VISIBLE_DEVICES=0 ./run_demo_airplane.sh
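The interpolation modes (1 and 4) blend two shapes. In latent-variable generative models this is commonly done by interpolating the shape latent codes and decoding each intermediate code; a generic sketch of the linear variant (the latent size and decoding step are placeholders, not the repository's API):

```python
import numpy as np

def interpolate_latents(z1, z2, steps=5):
    """Linearly interpolate between two latent codes z1 and z2."""
    ts = np.linspace(0.0, 1.0, steps)
    return np.stack([(1.0 - t) * z1 + t * z2 for t in ts])

z1, z2 = np.zeros(128), np.ones(128)  # 128 is an illustrative latent size
path = interpolate_latents(z1, z2, steps=5)
print(path.shape)  # (5, 128)
# Each code along the path would then be decoded into a point cloud or mesh.
```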
