I have a DB21M assembled, but I'm confused about how to run neural networks on the bot itself. I tried SSHing into the bot, but I couldn't find any NVIDIA drivers or CUDA installed. Is there an easy way to set that up? I can't find anything useful about this in the docs.
According to the NVIDIA docs, there should be a way to install JetPack and run some of the example neural networks presented here. I want to use that together with ROS on the Duckiebot.
After a lot of research and pain, I was able to get something working here as part of my thesis project. Feel free to use the Dockerfile in your own project.
It uses dustynv/jetson-inference as the base container, which already ships with CUDA, PyTorch, and several ready-to-use DNN models. It also has ROS Melodic on top, so you can set ROS_MASTER_URI to the Duckiebot's IP for communication to work.
If you just want PyTorch, you can instead derive from l4t-pytorch. More details about that here.
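As a rough sketch of what that looks like in practice (the image tag and the Duckiebot hostname below are placeholders, not values from the original post; check Docker Hub for a tag matching your JetPack/L4T version):

```shell
# Run the jetson-inference container on the Jetson with GPU access,
# sharing the host network so ROS can reach the Duckiebot's roscore.
# Replace the tag and hostname with values matching your setup.
docker run --runtime nvidia -it --rm \
    --network host \
    -e ROS_MASTER_URI=http://<duckiebot-hostname>.local:11311/ \
    -e ROS_IP=$(hostname -I | awk '{print $1}') \
    dustynv/jetson-inference:<tag-for-your-l4t-version>
```

`--runtime nvidia` exposes the Jetson's GPU inside the container, and `--network host` lets ROS nodes in the container talk to the roscore running on the Duckiebot without extra port mapping.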
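A minimal Dockerfile for that route might look like the following; the base-image tag is an assumption (pick the one matching your L4T/JetPack release from NVIDIA's NGC catalog):

```dockerfile
# Base image with CUDA + PyTorch prebuilt for Jetson (L4T).
# The tag below is an example; match it to your JetPack version.
FROM nvcr.io/nvidia/l4t-pytorch:r32.7.1-pth1.10-py3

# Install your project's extra Python dependencies on top.
RUN pip3 install --no-cache-dir numpy

COPY . /workspace
WORKDIR /workspace

CMD ["python3", "main.py"]
```

Note that l4t-pytorch does not include ROS, so if you need ROS Melodic as well, you would have to install it yourself or stick with the jetson-inference base above.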