This planar robotic-arm grasping algorithm is GR-ConvNet (GRCNN). This version adds some configuration and operations of my own on top of the original author's work; only the contents of the plane_robotic_grasping folder are newly added. For details, see plane_robotic_grasping/README.md.

The original project's README.md follows:

Antipodal Robotic Grasping

We present a novel generative residual convolutional neural network based model architecture which detects objects in the camera’s field of view and predicts a suitable antipodal grasp configuration for the objects in the image.
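
The network outputs a per-pixel grasp quality map, the grasp angle encoded as cos 2θ and sin 2θ, and a gripper width map; the best grasp is read off at the pixel with the highest quality. Below is a minimal illustration of that read-out, using random arrays in place of real network output (the repository's own post-processing utilities do this properly; the 224x224 size is the Cornell default assumed here):

import numpy as np

# Random stand-ins for the four per-pixel output maps.
H = W = 224
quality = np.random.rand(H, W)                 # grasp quality
cos2t = np.random.uniform(-1, 1, (H, W))       # cos(2*theta)
sin2t = np.random.uniform(-1, 1, (H, W))       # sin(2*theta)
width = np.random.rand(H, W)                   # normalised gripper opening

y, x = np.unravel_index(np.argmax(quality), quality.shape)  # best-quality pixel
angle = 0.5 * np.arctan2(sin2t[y, x], cos2t[y, x])          # recover grasp angle
print(f"grasp centre=({x}, {y}), angle={angle:.2f} rad, width={width[y, x]:.2f}")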

This repository contains the implementation of the Generative Residual Convolutional Neural Network (GR-ConvNet) from the paper:

Antipodal Robotic Grasping using Generative Residual Convolutional Neural Network

Sulabh Kumra, Shirin Joshi, Ferat Sahin

arxiv | video


If you use this project in your research or wish to refer to the baseline results published in the paper, please use the following BibTeX entry:

@inproceedings{kumra2020antipodal,
  author={Kumra, Sulabh and Joshi, Shirin and Sahin, Ferat},
  booktitle={2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)}, 
  title={Antipodal Robotic Grasping using Generative Residual Convolutional Neural Network}, 
  year={2020},
  pages={9626-9633},
  doi={10.1109/IROS45743.2020.9340777}
}

Requirements

  • numpy
  • opencv-python
  • matplotlib
  • scikit-image
  • imageio
  • torch
  • torchvision
  • torchsummary
  • tensorboardX
  • pyrealsense2
  • Pillow

Installation

  • Check out the robotic grasping package
$ git clone https://github.com/skumra/robotic-grasping.git
  • Create a virtual environment
$ python3.6 -m venv --system-site-packages venv
  • Source the virtual environment
$ source venv/bin/activate
  • Install the requirements
$ cd robotic-grasping
$ pip install -r requirements.txt

Datasets

This repository supports both the Cornell Grasping Dataset and Jacquard Dataset.

Cornell Grasping Dataset

  1. Download and extract the Cornell Grasping Dataset.
  2. Convert the PCD files to depth images by running python -m utils.dataset_processing.generate_cornell_depth <Path To Dataset>
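
As a quick sanity check, a generated depth image can be inspected with packages already listed in the requirements. The folder layout and file name below are assumptions (Cornell images sit in numbered subfolders, and the conversion script writes one depth image per pcdXXXX point cloud); adjust them to what the script actually produced:

import imageio
import matplotlib.pyplot as plt

# Hypothetical example file; substitute a real path from your dataset.
depth = imageio.imread('<Path To Dataset>/01/pcd0100d.tiff')
print(depth.shape, depth.dtype, float(depth.min()), float(depth.max()))
plt.imshow(depth, cmap='gray')
plt.title('Converted Cornell depth image')
plt.show()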

Jacquard Dataset

  1. Download and extract the Jacquard Dataset.

Model Training

A model can be trained using the train_network.py script. Run train_network.py --help to see a full list of options.

Example for Cornell dataset:

python train_network.py --dataset cornell --dataset-path <Path To Dataset> --description training_cornell

Example for Jacquard dataset:

python train_network.py --dataset jacquard --dataset-path <Path To Dataset> --description training_jacquard --use-dropout 0 --input-size 300

Model Evaluation

The trained network can be evaluated using the evaluate.py script. Run evaluate.py --help for a full set of options.

Example for Cornell dataset:

python evaluate.py --network <Path to Trained Network> --dataset cornell --dataset-path <Path to Dataset> --iou-eval

Example for Jacquard dataset:

python evaluate.py --network <Path to Trained Network> --dataset jacquard --dataset-path <Path to Dataset> --iou-eval --use-dropout 0 --input-size 300
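
For programmatic use, a trained checkpoint can also be loaded directly with PyTorch. This is a rough sketch, assuming the checkpoint stores the whole model object (run it from the repository root so the model class can be unpickled) and that the forward pass returns the quality, cos 2θ, sin 2θ and width maps in that order:

import torch

net = torch.load('<Path to Trained Network>', map_location='cpu')
net.eval()

# Stand-in for a normalised 4-channel (RGB + depth) input crop.
rgbd = torch.zeros(1, 4, 224, 224)
with torch.no_grad():
    quality, cos2t, sin2t, width = net(rgbd)
print(quality.shape)  # expected: torch.Size([1, 1, 224, 224])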

Run Tasks

A task can be executed using the relevant run script. All task scripts are named run_<task name>.py. For example, to run the grasp generator, run:

python run_grasp_generator.py
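
The script wires a trained model to a live camera stream (pyrealsense2 is in the requirements, so an Intel RealSense is the typical source). Roughly, it amounts to the following; the GraspGenerator class name, module path, constructor arguments, and methods are assumptions here, so check the actual script for the exact interface:

from inference.grasp_generator import GraspGenerator  # assumed module path

if __name__ == '__main__':
    generator = GraspGenerator(
        cam_id=0,                                      # placeholder camera/device id
        saved_model_path='<Path to Trained Network>',
        visualize=True,
    )
    generator.load_model()
    generator.run()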

Run on a Robot

To run the grasp generator with a robot, please use our ROS implementation for the Baxter robot. It is available at: https://github.com/skumra/baxter-pnp