The final project of the System Integration module in the Udacity Self-Driving Car Nanodegree (SDCND) program is a good opportunity to peek into a complete autonomous driving system. Although the project covers only the traffic light detection and control subsystems, it could be expanded to other subsystems, given access to a simulator that responds to them. The Udacity simulators in the earlier projects were each tied to a single subsystem, such as path planning among other vehicles, traffic light detection, or PID control. A simulator that combines other vehicles and traffic lights with additional obstacles such as pedestrians and cyclists would be useful for exercising more complex relations and dependencies between subsystems.
Recently, I had the chance to see and ride with Carla, Udacity's self-driving car, as it ran the code we had written. I was a bit nervous but also excited before Carla started to drive. Luckily, Carla recognized the red light and stopped, but it was fast and furious once the light turned green, so the safety driver had to take over. Although the run was very short, seeing our code drive Carla was exciting.
To solve the traffic light detection problem, we used the TensorFlow Object Detection API, more specifically an SSD Inception V2 network pre-trained on COCO images, to train and test the system. The SSD paper can be found here. We transferred the original network weights and continued training on simulated images for simulator operation and on real images for site operation. The annotated real traffic light images were provided by Anthony Sarkis, one of our peers in the Udacity Self-Driving Car Nanodegree program; we thank him for making his data available to the community. We needed to modify the configuration file provided for SSD Inception V2 to set the number of classes (4 in our case: red, yellow, green, and unknown), the batch size, and the number of training steps. Once the TensorFlow record files were generated, the system was ready to train and test.
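As an illustration, the sketch below shows how a frozen graph exported by the Object Detection API can be used to classify the light state in a single camera image. It uses the TensorFlow 1.x API that was current at the time; the graph path, the score threshold, and the class-id-to-color mapping are assumptions that should be adapted to your own export and label map.

```python
import numpy as np
import tensorflow as tf

# Hypothetical path to the exported model; adjust to your own export.
FROZEN_GRAPH_PATH = 'frozen_inference_graph.pb'

# Class ids as defined in our (assumed) label map.
LABELS = {1: 'red', 2: 'yellow', 3: 'green', 4: 'unknown'}

def load_graph(path):
    """Load a frozen TensorFlow graph exported by the Object Detection API."""
    graph = tf.Graph()
    with graph.as_default():
        graph_def = tf.GraphDef()
        with tf.gfile.GFile(path, 'rb') as f:
            graph_def.ParseFromString(f.read())
        tf.import_graph_def(graph_def, name='')
    return graph

def classify_light(graph, image):
    """Return the light color for a single RGB image (HxWx3 uint8 numpy array)."""
    with tf.Session(graph=graph) as sess:
        image_tensor = graph.get_tensor_by_name('image_tensor:0')
        score_tensor = graph.get_tensor_by_name('detection_scores:0')
        class_tensor = graph.get_tensor_by_name('detection_classes:0')
        scores, classes = sess.run(
            [score_tensor, class_tensor],
            feed_dict={image_tensor: np.expand_dims(image, axis=0)})
        # Detections come back sorted by score; take the top one if it is confident enough.
        if scores[0][0] > 0.5:
            return LABELS.get(int(classes[0][0]), 'unknown')
        return 'unknown'
```

In the ROS node, the graph is loaded once at start-up and only the per-frame classification runs afterwards, so the session would normally be kept open rather than re-created on every call as in this simplified sketch.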
For the vehicle speed and position control subsystem, we used the existing yaw controller to control the steering angle and a PID controller for throttle and brake. The PID output is scaled down by the maximum speed when acceleration is needed and scaled up by a constant to decelerate. The deceleration constant is obtained from the vehicle mass, wheel radius, current speed, and required speed. The update rate for the control subsystem was kept at 50 Hz, as originally set by the framework.
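A simplified sketch of this throttle/brake split is shown below. The class and parameter names are illustrative rather than the project's exact code: the PID object is assumed to expose a `step(error, dt)` method, and the brake command is expressed as a torque computed from vehicle mass and wheel radius.

```python
class SpeedController(object):
    """Illustrative throttle/brake split driven by a PID velocity controller."""

    def __init__(self, pid, vehicle_mass, wheel_radius, max_speed, decel_limit):
        self.pid = pid                      # assumed to provide step(error, dt)
        self.vehicle_mass = vehicle_mass    # kg
        self.wheel_radius = wheel_radius    # m
        self.max_speed = max_speed          # m/s
        self.decel_limit = decel_limit      # negative, e.g. -5.0 m/s^2

    def control(self, target_speed, current_speed, dt):
        error = target_speed - current_speed
        accel = self.pid.step(error, dt)

        if accel > 0.0:
            # Accelerating: scale the PID output down by the maximum speed
            # to obtain a throttle command in [0, 1].
            throttle = min(accel / self.max_speed, 1.0)
            brake = 0.0
        else:
            # Decelerating: convert the desired deceleration into a brake
            # torque (N*m) using vehicle mass and wheel radius.
            throttle = 0.0
            decel = max(accel, self.decel_limit)
            brake = abs(decel) * self.vehicle_mass * self.wheel_radius
        return throttle, brake
```

With the 50 Hz update rate, `control` would be called with a `dt` of 0.02 s.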
This is the project repo for the final project of the Udacity Self-Driving Car Nanodegree: Programming a Real Self-Driving Car. For more information about the project, see the project introduction here.
- Be sure that your workstation is running Ubuntu 16.04 Xenial Xerus or Ubuntu 14.04 Trusty Tahr. Ubuntu downloads can be found here.
- If using a Virtual Machine to install Ubuntu, use the following configuration as a minimum:
- 2 CPU
- 2 GB system memory
- 25 GB of free hard drive space
The Udacity provided virtual machine has ROS and Dataspeed DBW already installed, so you can skip the next two steps if you are using this.
- Follow these instructions to install ROS:
- ROS Kinetic if you have Ubuntu 16.04.
- ROS Indigo if you have Ubuntu 14.04.
- Install the Dataspeed DBW SDK. Use this option to install the SDK on a workstation that already has ROS installed: One Line SDK Install (binary)
- Download the Udacity Simulator.
- Build the Docker container
docker build . -t capstone
- Run the Docker container
docker run -p 4567:4567 -v $PWD:/capstone -v /tmp/log:/root/.ros/ --rm -it capstone
- Clone the project repository
git clone https://github.com/udacity/CarND-Capstone.git
- Install python dependencies
cd CarND-Capstone
pip install -r requirements.txt
- Make and run styx
cd ros
catkin_make
source devel/setup.sh
roslaunch launch/styx.launch
- Run the simulator
- Download the training bag that was recorded on the Udacity self-driving car (a bag demonstrating the correct predictions in autonomous mode can be found here)
- Unzip the file
unzip traffic_light_bag_files.zip
- Play the bag file
rosbag play -l traffic_light_bag_files/loop_with_traffic_light.bag
- Launch your project in site mode
cd CarND-Capstone/ros
roslaunch launch/site.launch
- Confirm that traffic light detection works on real-life images (one way to check this is sketched below)
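One way to spot-check the detector while the bag is playing is to feed the camera frames through the classifier and log the predicted state. The sketch below reuses `load_graph` and `classify_light` from the detection section above, assumed here to be saved in a module named `light_classification` (an illustrative name); the camera topic is `/image_color` in our setup, but check `rostopic list` if the bag publishes under a different name.

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

# load_graph / classify_light are the sketch functions from the detection
# section above, assumed to be saved in a module called light_classification.
from light_classification import load_graph, classify_light

bridge = CvBridge()
graph = load_graph('frozen_inference_graph.pb')  # assumed path to the exported model

def image_cb(msg):
    # Convert the ROS image to an RGB numpy array and log the predicted state.
    image = bridge.imgmsg_to_cv2(msg, 'rgb8')
    rospy.loginfo('Predicted light state: %s', classify_light(graph, image))

if __name__ == '__main__':
    rospy.init_node('tl_detection_check')
    rospy.Subscriber('/image_color', Image, image_cb)
    rospy.spin()
```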