PyTorch Implementation of FusionGAN

This repository contains a PyTorch implementation of the FusionGAN model described in the paper Generating a Fusion Image: One's Identity and Another's Shape.
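For a quick sense of what the model does, the sketch below illustrates the input/output contract described in the paper: a generator takes an identity image x and a shape image y and produces an image that keeps x's identity in y's shape. The TinyGenerator here is a hypothetical stand-in for illustration only, not the architecture implemented in this repository.

```python
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Hypothetical stand-in generator: concatenates the identity and shape
    images along the channel axis and maps them back to an RGB image.
    It only illustrates the input/output contract, not FusionGAN's
    actual encoder-decoder architecture."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
            nn.Tanh(),
        )

    def forward(self, identity_img, shape_img):
        return self.net(torch.cat([identity_img, shape_img], dim=1))

# x supplies the identity, y supplies the shape/pose.
x = torch.randn(1, 3, 128, 128)  # identity image
y = torch.randn(1, 3, 128, 128)  # shape image
fused = TinyGenerator()(x, y)    # image with x's identity in y's shape
print(fused.shape)               # torch.Size([1, 3, 128, 128])
```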

Dependencies

The repository requires the following dependencies (an optional import check is sketched after the list):

  • PyTorch v0.4
  • NumPy
  • SciPy
  • Pickle
  • PIL
  • Matplotlib
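As an optional sanity check (not part of the repository), the snippet below simply imports the packages listed above and prints their versions; pickle ships with the Python standard library.

```python
# Quick check that the dependencies listed above are importable.
import pickle          # standard library
import numpy
import scipy
import matplotlib
import torch
from PIL import Image

print("PyTorch:", torch.__version__)   # the code targets v0.4.x
print("NumPy:", numpy.__version__)
print("SciPy:", scipy.__version__)
```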

Setup Instructions

First, get the repository onto your local machine either by downloading it directly or by cloning it with the following command:

```
git clone https://github.com/aarushgupta/FusionGAN.git
```

Next, follow the instructions in the Dataset Preparation section below.

Dataset Preparation

As the data is not publicly available in the desired form, the frames of the required YouTube videos have been saved at this Google Drive link.

The link contains a compressed train folder with the following three subfolders:

  1. class1_cropped
  2. class2_cropped
  3. class3_cropped

Download the data from the link and place the folders according to the following directory structure:

```
/FusionGAN_root_directory/Dataset/train/class1_cropped
/FusionGAN_root_directory/Dataset/train/class2_cropped
/FusionGAN_root_directory/Dataset/train/class3_cropped
```
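Once the folders are in place, a quick way to verify the layout is to list and open a few frames with PIL. This is only an illustrative check that assumes the paths above; it is not the data loader used by train.py.

```python
import os
from PIL import Image

# Sanity-check sketch: confirm the expected folders exist and that the
# frames inside them open as images. Adjust DATASET_ROOT if needed.
DATASET_ROOT = "Dataset/train"
CLASSES = ["class1_cropped", "class2_cropped", "class3_cropped"]

for cls in CLASSES:
    folder = os.path.join(DATASET_ROOT, cls)
    frames = [f for f in os.listdir(folder)
              if f.lower().endswith((".png", ".jpg", ".jpeg"))]
    print(cls + ":", len(frames), "frames")
    if frames:
        with Image.open(os.path.join(folder, frames[0])) as img:
            print("  example frame size:", img.size)
```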

Training Instructions

The hyperparameters of the model have been preset. To start training, simply run train.py with the following command:

```
python train.py
```

The code can also be run interactively using the train.ipynb Jupyter Notebook provided in the repository.

To-Do

  1. Train the model and add checkpoints.
  2. Polish and add code for preparing the dataset.
  3. Add test script for the model.
  4. Add keypoint estimation for quantitative evaluation.
  5. Remove the unused images in the dataset.