
ME5413_Final_Project

NUS ME5413 Autonomous Mobile Robotics Final Project

Authors: Christina Lee, Dongen Li, Yuhang Han, and Shuo Sun


cover_image

Dependencies

  • System Requirements:
    • Ubuntu 20.04 (18.04 not yet tested)
    • ROS Noetic (Melodic not yet tested)
    • C++11 and above
    • CMake: 3.0.2 and above
  • This repo depends on the following standard ROS pkgs:
    • roscpp
    • rospy
    • rviz
    • std_msgs
    • nav_msgs
    • geometry_msgs
    • visualization_msgs
    • tf2
    • tf2_ros
    • tf2_geometry_msgs
    • pluginlib
    • map_server
    • gazebo_ros
    • jsk_rviz_plugins
    • jackal_gazebo
    • jackal_navigation
    • velodyne_simulator
    • teleop_twist_keyboard
  • And the gazebo_models repository (cloned in the Installation steps below)

Installation

This repo is a ROS workspace containing three ROS packages:

  • interactive_tools: customized tools for interacting with Gazebo and your robot
  • jackal_description: the modified Jackal robot model descriptions
  • me5413_world: the main package, containing the Gazebo world and the launch files

Note: If you are working on this project, you are encouraged to fork this repository and work on your own fork!

After forking this repo to your own GitHub account:

# Clone your own fork of this repo (assuming home here `~/`)
cd
git clone https://github.com/<YOUR_GITHUB_USERNAME>/ME5413_Final_Project.git
cd ME5413_Final_Project

# Install all dependencies
rosdep install --from-paths src --ignore-src -r -y

# Build
catkin_make
# Source 
source devel/setup.bash
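
Optionally, you can confirm that the workspace built correctly and that its packages are visible to ROS (a quick sanity check using standard ROS tooling):

# Verify the three packages from this workspace are on the ROS package path
rospack find me5413_world
rospack find interactive_tools
rospack find jackal_description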

To properly load the gazebo world, you will need to have the necessary model files in the ~/.gazebo/models/ directory.

There are two sources of models needed:

  • Gazebo official models

    # Create the destination directory
    cd
    mkdir -p .gazebo/models
    
    # Clone the official gazebo models repo (assuming home here `~/`)
    git clone https://github.com/osrf/gazebo_models.git
    
    # Copy the models into the `~/.gazebo/models` directory
    cp -r ~/gazebo_models/* ~/.gazebo/models
  • Our customized models

    # Copy the customized models into the `~/.gazebo/models` directory
    cp -r ~/ME5413_Final_Project/src/me5413_world/models/* ~/.gazebo/models

Usage

0. Gazebo World

This command launches Gazebo with the project world:

# Launch Gazebo World together with our robot
roslaunch me5413_world world.launch

1. Manual Control

If you wish to explore the Gazebo world a bit, we provide a way to manually drive the robot around:

# Only launch the robot keyboard teleop control
roslaunch me5413_world manual.launch

Note: This robot keyboard teleop control is also included in all other launch files, so you don't need to launch this when you do mapping or navigation.

rviz_manual_image

2. Mapping

After launching Step 0, run the following in a second terminal:

# Launch GMapping
roslaunch me5413_world mapping.launch
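
While driving the robot around to build the map, you can check that GMapping is actually producing an occupancy grid before saving it. A minimal sanity check, assuming the standard GMapping output topics /map and /map_metadata:

# Confirm the occupancy grid is being published and updated
rostopic hz /map

# Inspect the current map metadata (resolution, size, origin)
rostopic echo -n 1 /map_metadata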

After finishing mapping, run the following commands in a third terminal to save the map:

# Save the map as `my_map` in the `maps/` folder
roscd me5413_world/maps/
rosrun map_server map_saver -f my_map map:=/map

rviz_nmapping_image

3. Navigation

Once you have completed the mapping in Step 2 and saved your map, quit the mapping process.

Then, in the second terminal:

# Load a map and launch AMCL localizer
roslaunch me5413_world navigation.launch

rviz_navigation_image
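
Once navigation.launch is up, you can check that AMCL is localizing the robot and, if you prefer, send a goal from the command line instead of clicking in RViz. This is a minimal sketch: /amcl_pose is the standard AMCL output topic (assumed here), /move_base_simple/goal is the goal topic used by the RViz panel described below, and the pose values are placeholders.

# Check that AMCL is publishing a pose estimate in the map frame
rostopic echo -n 1 /amcl_pose

# Publish a single goal pose (placeholder coordinates) to the navigation stack
rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped \
  '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 2.0, z: 0.0}, orientation: {w: 1.0}}}'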

Student Tasks

1. Map the environment

  • You may use any SLAM algorithm you like, any type:
    • 2D LiDAR
    • 3D LiDAR
    • Vision
    • Multi-sensor
  • Verify your SLAM accuracy by comparing your odometry with the published /gazebo/ground_truth/state topic (nav_msgs::Odometry), which contains the ground truth odometry of the robot.
  • You may want to use tools like EVO to quantitatively evaluate the performance of your SLAM algorithm (see the example after this list).
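
For example, one common workflow is to record both trajectories to a bag file and compare them with EVO. A minimal sketch, assuming your estimated odometry is published on /odom (substitute whatever topic your SLAM node actually publishes) and that EVO is installed (pip install evo):

# Record the ground truth and your estimated odometry while driving around
rosbag record -O slam_eval.bag /gazebo/ground_truth/state /odom

# Compute the Absolute Pose Error of your estimate against the ground truth
evo_ape bag slam_eval.bag /gazebo/ground_truth/state /odom --align --plot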

2. Using your own map, navigate your robot

  • From the starting point, move to the given pose within each area in sequence

    • Assembly Line 1, 2
    • Packaging Area 1, 2, 3, 4
    • Delivery Vehicle 1, 2, 3
  • We have provided a GUI panel in RViz that allows you to click and publish these goal poses to the /move_base_simple/goal topic:

    rviz_panel_image

  • We also provide four topics (also visualized in RViz) that report the real-time pose error between your robot and the selected goal pose (see the commands after this list):

    • /me5413_world/absolute/heading_error (in degrees, wrt world frame, std_msgs::Float32)
    • /me5413_world/absolute/position_error (in meters, wrt world frame, std_msgs::Float32)
    • /me5413_world/relative/heading_error (in degrees, wrt map frame, std_msgs::Float32)
    • /me5413_world/relative/position_error (in meters, wrt map frame, std_msgs::Float32)
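
To keep an eye on these metrics while tuning your navigation stack, you can echo or record them with standard ROS tools, for example:

# Watch the live error values while approaching a goal
rostopic echo /me5413_world/absolute/position_error
rostopic echo /me5413_world/absolute/heading_error

# Or record all four error topics for offline analysis of each run
rosbag record -O nav_errors.bag \
  /me5413_world/absolute/position_error \
  /me5413_world/absolute/heading_error \
  /me5413_world/relative/position_error \
  /me5413_world/relative/heading_error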

Contribution

You are welcome to contribute to this repo by opening a pull request.

We are following:

License

The ME5413_Final_Project is released under the MIT License
