Logo

A.J.G.A.R: The Robotic ARM

Demo video: output.webm
Table of Contents
  1. About The Project
  2. Getting Started
  3. Package Description
  4. Contributing
  5. Contact
  6. Acknowledgments

About The Project

This repository is dedicated to the A.J.G.A.R project, a 6 Degrees of Freedom (DoF) robotic arm designed to handle objects autonomously using computer vision. It also supports remote operation via teleoperation.

Hardware Demo video

Built With

ROS OpenCV Blender Raspberry Pi Arduino Ubuntu Python

Report Bug · Request Feature

Getting Started

To get a local copy up and running, follow the steps below.

There are two ways to execute this project locally: directly on Ubuntu 20.04 or through Docker.

1. Installation

Ubuntu 20.04

    The following dependencies are required before building the workspace.

    1. Prerequisites

    ROS 1 Noetic

    • Refer to our ROS installation guide

    • Installing ROS Controller dependencies

      sudo apt-get install ros-noetic-ros-control ros-noetic-ros-controllers
    • Installing Freenect dependencies

      sudo apt install libfreenect-dev
      sudo apt-get install ros-noetic-rgbd-launch
    • OpenCV

      sudo apt install libopencv-dev python3-opencv
    • MoveIt!

      sudo apt install ros-noetic-moveit

    2. Installation

    1. Create ROS Workspace - robotic_arm_ws

      cd ~
      mkdir -p robotic_arm_ws/src
    2. Clone the repo inside your ROS Workspace

      cd ~/robotic_arm_ws/src
      git clone git@github.com:atom-robotics-lab/robotic-arm-atom.git
    3. Install the Python dependencies

       cd ~/robotic_arm_ws/src/robotic-arm-atom
       pip install -r requirements.txt
    4. Build the package

      cd ~/robotic_arm_ws
      catkin_make
    5. Source the workspace and launch the desired launch file

      source ~/robotic_arm_ws/devel/setup.bash
      roslaunch <package_name> <launch_file>
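
    For example, the simulation stack described under Run Simulation below can be brought up with:

      roslaunch ajgar_core ajgar_moveit.launch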
Docker

    Docker Installation

    1. Create ROS Workspace - robotic_arm_ws

      cd ~
      mkdir -p robotic_arm_ws/src
    2. Clone the repo inside your ROS Workspace

      cd ~/robotic_arm_ws/src
      git clone git@github.com:atom-robotics-lab/robotic-arm-atom.git
    3. Install Docker from here

    4. Execute the following commands to enable the Docker service and allow running Docker without sudo

      sudo systemctl enable docker.service
      sudo systemctl enable containerd.service
      
      sudo groupadd docker
      sudo usermod -aG docker $USER
      
      docker context use default
      
      newgrp docker
    5. Install the nvidia-container-toolkit from here

    6. Build the Docker image with the script provided in the repository

      ./build_image.sh
    7. Start a container from the built image using this command

      ./run_image.sh

2. Run Simulation

  1. Launch the AJGAR model in RViz, Gazebo and MoveGroup:

      roslaunch ajgar_core ajgar_moveit.launch

  2. Initialize and start all the services:

      roslaunch ajgar_core ajgar_rosservice.launch

  3. Start the pick-and-place procedure:

      rosrun ajgar_perception server.py
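
The pick-and-place behaviour is exposed through ROS services started by ajgar_rosservice.launch and implemented in ajgar_perception's server.py. For readers new to ROS 1 services, here is a minimal, hypothetical rospy sketch of such a service node; the service name, message type, and callback are illustrative assumptions, not the actual contents of server.py:

      #!/usr/bin/env python3
      # Hypothetical sketch only -- not the actual ajgar_perception/server.py.
      import rospy
      from std_srvs.srv import Trigger, TriggerResponse  # assumed service type

      def handle_pick_and_place(req):
          # In the real node, object detection and motion planning would run
          # here before the result is reported back to the caller.
          return TriggerResponse(success=True, message="pick-and-place finished")

      if __name__ == "__main__":
          rospy.init_node("pick_and_place_server")
          rospy.Service("pick_and_place", Trigger, handle_pick_and_place)
          rospy.spin()  # keep the node alive to serve incoming requests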

Package Description

    Package              Description
    ajgar_core           Core functionalities of the robotic arm, including the main control algorithms and launch files.
    ajgar_description    URDF (Unified Robot Description Format) files describing the robot's physical configuration.
    ajgar_hardware       Interfaces with the physical hardware of the robotic arm, including drivers and communication protocols.
    ajgar_moveit_config  Configuration files for MoveIt, a ROS-based framework for motion planning, kinematics, and robot interaction.
    ajgar_perception     Perception tasks such as object recognition and environment mapping.
    ajgar_sim            Simulation environment for the robotic arm, including models and simulation worlds.
    ajgar_sim_plugins    Plugins for the simulation environment, currently providing suction functionality.
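
To give a feel for how the MoveIt configuration is consumed at runtime, here is a minimal sketch using MoveIt's Python interface (moveit_commander). The planning group name "arm" and the named target "home" are assumptions for illustration; the real names are defined in ajgar_moveit_config, and ajgar_moveit.launch (or an equivalent MoveGroup) must already be running:

      #!/usr/bin/env python3
      # Minimal MoveIt sketch; group and target names are assumptions, not taken from this repo.
      import sys
      import rospy
      import moveit_commander

      moveit_commander.roscpp_initialize(sys.argv)
      rospy.init_node("ajgar_moveit_demo", anonymous=True)

      arm = moveit_commander.MoveGroupCommander("arm")  # assumed planning group name
      arm.set_named_target("home")                      # assumed named pose from the SRDF
      arm.go(wait=True)                                 # plan and execute the motion
      arm.stop()

      moveit_commander.roscpp_shutdown()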

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

For more info, refer to contributing.md

Contact

Our Socials - Linktree - [email protected]

Acknowledgments
