Getting started with Jetson Nano and Donkey car aka Autonomous Car

Jetson Nano

I started this project with two of my brilliant friends, Felix and Marco, who have rich experience in machine learning. We participated in an autonomous car competition and tried a lot of different approaches, from pixel-value thresholding and style transfer to sequential models. You can check out our work and this wonderful article written by Felix.

One of my friends, Jonathan, has set up a store selling a full Donkey car kit. If you want to save the time of gathering the individual parts, this is perfect for you.

Update: I have fixed a few bugs/typos and uploaded an SD card image. You can find the image at the bottom of the article.

Update (18/10/2019):
Several commands have been updated to accommodate the Jetson Nano SD card image JP4.2.2. Please also check the official Donkey Car documentation for more information.

  • Disabling the buggy rtl8192cu driver does not work well with the new image, and the bit rate is limited to 1 Mb/s. Please follow the link to reinstall the driver instead.
  • Nvidia has released builds of recent TensorFlow and PyTorch versions specifically for Jetson. Some installation procedures have been updated accordingly.

Jetson Nano is a powerful and efficient single-board computer made for (buzzword alert) AI on the edge. It costs just USD 99 and gives the maker community every opportunity to harness the power of machine learning.

I have been playing around with Donkey car for some time using a Raspberry Pi. I absolutely love it and appreciate the effort from the community. I am able to train it with a simple CNN, but the computational power soon falls short when I add more sensors, for example an IMU or a lidar. A computationally intensive model will not have a good frame rate, or may not run on the Pi at all. I need something more powerful but not too expensive 😛, say below USD 100.

And here it is: the Jetson Nano. Unlike the Raspberry Pi, the Jetson Nano was released just a few weeks ago, and there are few tutorials and projects about it. I had a difficult time setting up the Donkey car and decided to write a (my first 😆) tutorial on how to set things up. Let’s get started.

For the Jetson Nano part, you will need a Jetson Nano, a micro SD card, and a wifi USB dongle.

I am pretty surprised that the dev kit doesn’t come with onboard wifi and Bluetooth. I followed a tutorial from Nvidia to write the image to the SD card and bought the suggested wifi USB dongle: the Edimax EW-7811Un.

All you need to do is follow the Nvidia tutorial and boot up the device. Once you see the welcome screen, congratulations!

However… while I was testing it, the wifi kept disconnecting every few minutes and I could not download and install the packages that I needed. I spent two days 😕 trying to find a solution, and the following command made life easier.
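The original command block is missing here; a minimal sketch of the workaround (assuming the usual approach of blacklisting the rtl8192cu kernel module so it never loads) looks like this:

```shell
# Blacklist the buggy rtl8192cu module so it is not loaded at boot.
# Assumption: standard Ubuntu modprobe blacklist path.
echo "blacklist rtl8192cu" | sudo tee -a /etc/modprobe.d/blacklist.conf
sudo reboot
```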

This disables the buggy driver and the wifi seems to return to normal, but the bit rate is limited to only 1 Mb/s. Another (better) solution is to recompile the driver. Please follow the instructions from this git repo and reinstall the driver. You may need to look into the troubleshooting section and disable the power-management function to make the wifi work properly.

Start installing the package

This is an embedded system dedicated to machine learning. It won’t be complete without a machine learning framework! You can find the following information in the Nvidia forums too.

Let’s start with TensorFlow first!
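A sketch of the installation, based on Nvidia’s forum instructions for JetPack 4.2 (the index URL suffix `v42` and the package name may differ for your JetPack version):

```shell
# Install system dependencies, then the Jetson-specific TensorFlow wheel
# from Nvidia's package index (v42 = JetPack 4.2; adjust for your version).
sudo apt-get update
sudo apt-get install -y python3-pip libhdf5-serial-dev hdf5-tools
pip3 install --user --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v42 tensorflow-gpu
```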

It is going to take a long time and things will seem frozen. It took around 45 min on my machine to set things up.

Remember to test things to ensure it is properly installed. Make sure there are no error messages.
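A quick sanity check could be as simple as importing the package and printing its version:

```shell
# Import TensorFlow and print its version; no error messages means it installed cleanly.
python3 -c "import tensorflow as tf; print(tf.__version__)"
```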

Why not install PyTorch too?
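A sketch of the PyTorch install; `<pytorch-wheel-url>` is a placeholder, so take the actual link for your JetPack and Python version from Nvidia’s “PyTorch for Jetson” forum thread:

```shell
# Download the aarch64 PyTorch wheel linked from Nvidia's forum thread,
# then install it with pip. <pytorch-wheel-url> is a placeholder.
wget <pytorch-wheel-url> -O torch-aarch64.whl
pip3 install --user numpy torch-aarch64.whl
```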

Again, test things before moving forward.
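For example, a check that PyTorch imports and can see the Nano’s GPU:

```shell
# Verify PyTorch imports and that CUDA is available on the Nano.
python3 -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```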

Last but not least, Keras, which we will need for the Donkey car.

pip install doesn’t work for me, so I use the following method:
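A sketch of one such fallback, installing Keras from source instead of from PyPI (assuming git and the Python toolchain from the earlier steps are already in place):

```shell
# Build and install Keras from its source repository.
git clone https://github.com/keras-team/keras.git
cd keras
sudo python3 setup.py install
```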

Test:
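Again, an import check is enough to confirm the install:

```shell
# Import Keras and print its version; it should also report the TensorFlow backend.
python3 -c "import keras; print(keras.__version__)"
```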

The software part is mostly finished. Let’s move on to the hardware part.

I am not going to go through the Donkey car installation procedure step by step. Newcomers, please visit here for more information. I am going to highlight some key points to make things work on the Jetson Nano.

  1. PCA9685 PWM driver
  2. Camera

First, update the GPIO library

Nvidia has already provided a GPIO library, and what is amazing is that it has the same API as RPi.GPIO, so almost nothing needs to be changed to port an RPi library to the Jetson Nano. Follow the instructions from the Nvidia GitHub repo to install the library, and you can test the GPIO too. Remember to set the gpio group for the user as well.
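A sketch of those steps, following the README of Nvidia’s jetson-gpio repository (file paths inside the repo may shift between releases):

```shell
# Install the Jetson.GPIO library from source.
git clone https://github.com/NVIDIA/jetson-gpio.git
cd jetson-gpio
sudo python3 setup.py install

# Give the current user GPIO access via the gpio group and udev rules.
sudo groupadd -f -r gpio
sudo usermod -a -G gpio $USER
sudo cp lib/python/Jetson/GPIO/99-gpio.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules && sudo udevadm trigger
```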

Second, install the PCA9685 servo library for controlling the steering and throttle
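Donkey car’s actuator part is built on Adafruit’s legacy PCA9685 driver, so installing it should be a one-liner:

```shell
# Install Adafruit's PCA9685 servo driver library (used by Donkey car's actuator part).
pip3 install --user Adafruit_PCA9685
```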

Connect the PCA9685 to the Jetson Nano. You should be able to see the pin numbers from the silkscreen markings.

And as usual, test the connection.
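A standard way to do this is scanning the bus with `i2cdetect` from i2c-tools (bus 1 here is an assumption; the Nano exposes more than one user-accessible I2C bus):

```shell
# Scan I2C bus 1 for connected devices; the PCA9685 should appear as "40" in the grid.
sudo apt-get install -y i2c-tools
sudo i2cdetect -y -r 1
```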

Look at address 0x40: it is our PCA9685. For those who want to know more about the I2C protocol, you can visit here.

To access the I2C channel, the user needs to be added to the i2c group. You will need to reboot for the change to take effect.
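For example:

```shell
# Add the current user to the i2c group, then reboot to apply the change.
sudo usermod -aG i2c $USER
sudo reboot
```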

Check the group setting for the user:
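For example:

```shell
# List the groups the current user belongs to; look for "i2c" (and "gpio").
groups $USER
```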

There it is.

Still with me? The next step is to set up the camera. The bad news is that the Pi camera v1 doesn’t work with the Jetson Nano, and neither does the wide-angle camera I had been using for a long time.

The purchasing guideline is to avoid any camera with the OV5647 chip and use one with the IMX219 chip. The IMX219 driver is pre-installed in the image.

Open the lock, put the cable in the slot, close the lock. Done. Just be careful with the cable orientation: look into the connector to see which side the pins are facing. Power things up and everything should be fine. You can check this tutorial to see how to play with the camera.
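One quick way to test the camera is a GStreamer preview pipeline using the Nano’s `nvarguscamerasrc` element (the resolution and frame rate below are arbitrary choices):

```shell
# Open a live 1280x720 preview window from the CSI camera.
gst-launch-1.0 nvarguscamerasrc ! \
  'video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1' ! \
  nvvidconv ! nvegltransform ! nveglglessink
```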

The last step: install the Donkeycar module

I have forked the original Donkey car repo and made the necessary changes to make things work. You can download the Donkey car library from my repo and start installing the package.
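A sketch of the install; `<my-fork-url>` is a placeholder for the forked repo linked above:

```shell
# Clone the forked Donkey car repo and install it in editable mode.
git clone <my-fork-url> donkeycar
cd donkeycar
pip3 install --user -e .
```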

For those who are interested, I edited the following things:

  1. Added a new camera class
  2. Added a default bus to the actuator parts
  3. Added int typecasting for a variable in Keras.py to make training work

After a long wait (~45 min) for the necessary packages to install, you can create your own car folder with the following command.
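This uses the standard Donkey car CLI (the folder name `~/mycar` is an example):

```shell
# Create a new car application folder with a default config and manage.py.
donkey createcar --path ~/mycar
```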

You need to make a few changes to manage.py to use the new camera.

And also add the new camera parts to the vehicle.
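A hypothetical sketch of that change in manage.py; the class name `JetsonCamera` and its arguments are assumptions, so use whatever the forked repo actually names its new camera part:

```python
# Hypothetical: replace the default PiCamera part with the Jetson CSI camera part.
from donkeycar.parts.camera import JetsonCamera  # assumed class name

cam = JetsonCamera(resolution=cfg.CAMERA_RESOLUTION)
V.add(cam, outputs=['cam/image_array'], threaded=True)
```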

Everything should work up to this step, and you can start driving and creating your own dataset!
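Driving uses the standard Donkey car entry point (assuming the car folder from earlier is `~/mycar`):

```shell
# Start the vehicle loop and the built-in web controller.
cd ~/mycar
python3 manage.py drive
```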

You can then log into the web server (by default at http://&lt;jetson-ip&gt;:8887) and control the car.

And the most exciting part! You can train your car locally on Jetson Nano!
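Training also goes through manage.py; the tub and model paths below are examples:

```shell
# Train a pilot model from recorded data on the Nano itself.
cd ~/mycar
python3 manage.py train --tub ./data/tub_1 --model ./models/mypilot
```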

Have fun!

Edit:

  1. While I was writing this article, I found a USB Bluetooth dongle in my junk box. I plugged it into the Jetson Nano and it worked! And I didn’t need any extra settings to connect it to my PS4 controller. Yay! You can use the controller by following this tutorial. Getting a good dataset is critical, and a proper controller helps a lot.
  2. The steering and throttle seem to jam each other. The reason is that the Jetson Nano and the motor are power-hungry: the momentary current draw is so large that a significant voltage drop disturbs the ESC signal. There are two solutions: use a separate power supply for the servo driver, or add a large capacitor to prevent the voltage drop. I prefer the latter, but I didn’t have any capacitors at the moment, so you can see from the photo that I cut a USB cable, soldered two jumper wires onto it, and connected it to the servo driver.

  3. If you want the whole SD card image, you can find it here. The ID and password are both “donkey”.

  4. You may also want to create a swap partition/file for the Jetson Nano, since it only has 4 GB of memory.
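A minimal sketch of a 4 GB swap file (the size is a matter of taste):

```shell
# Create, secure, format, and enable a 4 GB swap file, then persist it in fstab.
sudo fallocate -l 4G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```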

Hongkonger, maker, teacher. Interested in all kinds of stuff, from physics to machine learning. Lead Engineer of a 2019 CES Innovation Award honoree.