Introduction
In our previous entry we showed how to get Autoware.AI running on the Qualcomm® Robotics (RB3) Dragonboard-845c Development Platform. In this post we will look at Autoware.Auto and how to bridge Autoware.AI and Autoware.Auto in the same way as we did for the Hikey970.
The post is organized as follows:
- Requirements
- Getting the Docker images
- Getting the demo data
- Running the demos
- Conclusion
Requirements
The steps outlined in this blog post build on our previous posts, and as such you need to:
- Have your RB3 board running with Debian Buster as outlined here and with Docker installed.
- Be familiar with the work that we conducted previously within the “Autoware everywhere” series on Autoware.Auto and how to bridge both on the Hikey970.
In addition, if you plan on developing real-time applications in the future your board should be running a RT-enabled kernel as we outlined here.
For visualization purposes we will also use a separate laptop, which needs to have ROS 2 available.
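Before moving on it is worth checking that ROS 2 is usable on the laptop; a minimal sketch, assuming a Dashing installation under the default path (adjust to whichever ROS 2 distribution you have installed):
$ source /opt/ros/dashing/setup.bash   # adjust to your installed ROS 2 distribution
$ ros2 topic list                      # should run without errors, typically listing at least /parameter_events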
Getting the Docker images
We need to get the following Docker images:
$ docker pull 96boards/autoware:auto_20200121
$ docker pull 96boards/ros:ros1_bridge
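Once the pulls complete, a quick way to verify that both images are available locally:
$ docker images | grep 96boards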
Getting the demo data
In addition to the Docker images we need to get the demo data for the Autoware.Auto 3D Perception Stack demo. To do so:
- Move into the shared_dir folder that we created as part of the previous post and download the demo data and parameters file:
$ cd ~/shared_dir
$ wget http://people.linaro.org/~servando.german.serrano/autoware/autoware.auto_get_demo_data
$ chmod +x autoware.auto_get_demo_data
$ ./autoware.auto_get_demo_data
$ wget http://people.linaro.org/~servando.german.serrano/autoware/vlp16_test.param_ai.yaml
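If the script completed successfully the demo files end up in the shared folder; a quick check (the exact file list depends on what the script downloads):
$ ls -lh ~/shared_dir
$ ls -lh ~/shared_dir/*.pcap   # e.g. route_small_loop_rw-127.0.0.1.pcap, used by the demos below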
Running the demos
We are now set to start with the different demos. Let’s find out how far we can push the RB3 board.
.Auto 3D Perception demo
To run the Autoware.Auto 3D Perception Stack demo we will roughly follow the steps here but adapted to our setup.
On the laptop we get the config files and open 2 terminals to run 2 instances of rviz2:
$ wget https://gitlab.com/autowarefoundation/autoware.auto/AutowareAuto/raw/master/src/tools/autoware_auto_examples/rviz2/autoware.rviz
$ wget https://gitlab.com/autowarefoundation/autoware.auto/AutowareAuto/raw/master/src/tools/autoware_auto_examples/rviz2/autoware_voxel.rviz
- Laptop: Terminal 1:
$ rviz2 -d autoware.rviz
- Laptop: Terminal 2:
$ rviz2 -d autoware_voxel.rviz
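Since the laptop only visualizes data published by the RB3, both machines need to be able to discover each other over DDS. As a hedged reminder, assuming the default DDS settings on both sides: they should be on the same network and use the same ROS_DOMAIN_ID (0 by default), for example:
$ export ROS_DOMAIN_ID=0   # only needed if you changed the default; must match on laptop and RB3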
Now we need to ssh into the RB3 in 4 terminals and do the following:
- In Terminal 1:
$ docker run -it --rm --privileged --net=host -u linaro -v ~/shared_dir:/home/linaro/shared_dir:rw 96boards/autoware:auto_20200121 /bin/bash
$ cd ~
$ source AutowareAuto/install/setup.bash
- In terminals 2 to 4 we need to access the running container as we did here, but using the linaro user:
$ docker exec -it -u linaro CONTAINER_NAME /bin/bash
$ cd ~
$ source AutowareAuto/install/setup.bash
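CONTAINER_NAME is the name Docker assigned to the container started in terminal 1; if you did not name it explicitly it can be looked up with docker ps, for example:
$ docker ps --format '{{.Names}}\t{{.Image}}'   # pick the container running 96boards/autoware:auto_20200121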
And then:
- RB3: Terminal 1:
$ udpreplay ~/shared_dir/route_small_loop_rw-127.0.0.1.pcap
- RB3: Terminal 2:
$ ros2 run velodyne_node velodyne_cloud_node_exe __params:=/home/"${USER}"/AutowareAuto/src/drivers/velodyne_node/param/vlp16_test.param.yaml
- RB3: Terminal 3:
$ ros2 run ray_ground_classifier_nodes ray_ground_classifier_cloud_node_exe __params:=/home/"${USER}"/AutowareAuto/src/perception/filters/ray_ground_classifier_nodes/param/vlp16_lexus.param.yaml
- RB3: Terminal 4:
$ ros2 run voxel_grid_nodes voxel_grid_cloud_node_exe __params:=/home/"${USER}"/AutowareAuto/src/perception/filters/voxel_grid_nodes/param/vlp16_lexus_centroid.param.yaml
If everything went fine we will be able to visualize the demo point cloud and the downsampled one in the running rviz2 GUIs, as can be seen in the image below.
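If nothing shows up, a quick sanity check from any of the RB3 terminals (inside the container) is to confirm that the point cloud topics are being published; the topic name below is an assumption based on the default parameter files, so check the output of ros2 topic list first:
$ ros2 topic list
$ ros2 topic hz /points_raw   # assumed raw point cloud topic; substitute the name reported by 'ros2 topic list'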
.AI and .Auto bridge
We will now replicate the steps that were outlined here for the Hikey970. Hence, for a full description of the steps please follow that blog post.
As we did for the Hikey970, we will use 5 terminals in the RB3, terminals 1 and 2 for the .AI container, terminal 3 for the ros1_bridge container and 4 and 5 for the .Auto container.
.AI terminals
- Terminal 1
$ cd docker/generic
$ ./run.sh -c off -i autoware/arm64v8 -t 1.13.0
$ mkdir ~/.autoware
$ cp -r ~/shared_dir/data ~/.autoware/data
$ roscore &
$ rosparam load ~/shared_dir/headless_setup.yaml &
$ roslaunch lidar_localizer ndt_mapping.launch
- Terminal 2
$ docker exec -it -u autoware CONTAINER_NAME /bin/bash
$ cd ~
$ source Autoware/install/setup.bash
ros1_bridge terminal
- Terminal 3
$ docker run -it --rm --privileged --net=host -u linaro 96boards/ros:ros1_bridge /bin/bash -c "source /opt/ros/melodic/setup.bash && source /opt/ros/dashing/local_setup.bash && ros2 run ros1_bridge dynamic_bridge --bridge-all-topics"
.Auto terminals
- Terminal 4
$ docker run -it --rm --privileged --net=host -u linaro -v ~/shared_dir:/home/linaro/shared_dir:rw 96boards/autoware:auto_20200121 /bin/bash
$ source AutowareAuto/install/setup.bash
$ ros2 run velodyne_node velodyne_cloud_node_exe __params:=/home/"${USER}"/shared_dir/vlp16_test.param_ai.yaml
- Terminal 5
$ docker exec -it -u linaro CONTAINER_NAME /bin/bash
$ source AutowareAuto/install/setup.bash
$ udpreplay ~/shared_dir/route_small_loop_rw-127.0.0.1.pcap
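With the pcap replay running we can check that the point cloud is actually crossing the bridge by looking at the ROS 1 side from terminal 2; /points_raw is an assumption based on the vlp16_test.param_ai.yaml configuration, so verify it against rostopic list:
$ rostopic list | grep points
$ rostopic hz /points_raw   # assumed topic consumed by ndt_mapping; adjust if your list differs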
A typical look of the terminals running the different nodes is shown in the image below.
Stopping the mapping process
The mapping task is computationally heavy, and processing slows down as the number of points in the map grows. We can stop the pcap replay with Ctrl+C in terminal 5. To save the map, in terminal 2 we do:
$ rostopic pub /config/ndt_mapping_output autoware_config_msgs/ConfigNDTMappingOutput "header:
seq: 0
stamp:
secs: 0
nsecs: 0
frame_id: 'map'
filename: 'auto_map.pcd'
filter_res: 0.2"
Once the mapping process is complete, the pcd map will be generated in the .ros folder. We need to move it to shared_dir if we want to keep it after we stop the container.
$ mv ~/.ros/auto_map.pcd ~/shared_dir/
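The first step of the next section is to copy this map to the laptop; a minimal sketch of that copy, where the RB3 IP address and user are placeholders to replace with your own:
$ scp linaro@<rb3-ip-address>:~/shared_dir/auto_map.pcd ~/shared_dir/   # run on the laptop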
Visualizing the pointcloud map
To visualize the pointcloud map we need to:
- scp the pcd from the RB3 to the laptop and store it in shared_dir (an example is shown above).
- Start an Autoware.AI container as we did here.
$ cd ~/docker/generic
$ ./run.sh -c off -t 1.13.0
$ cd shared_dir
$ roscore &
$ rosrun map_file points_map_loader noupdate `pwd`/auto_map.pcd &
$ rviz
Within rviz we can select the /points_map topic and, as we did previously, increase the Size to 0.1. The result is displayed below.
Conclusion
With this post we now have 2 different boards (Hikey970 and Qualcomm® Robotics (RB3) Dragonboard-845c Development Platform) that we can use for development of Autoware.
This article is Part 5 in a 15-Part Series.
- Part 1 - 96boards: Autoware everywhere | Autoware.AI and Hikey970
- Part 2 - 96boards: Autoware everywhere | Autoware.Auto and Hikey970
- Part 3 - 96boards: Autoware everywhere | Bridging .AI and .Auto in the Hikey970
- Part 4 - 96boards: Autoware everywhere | Autoware.AI and Dragonboard-845c
- Part 5 - 96boards: Autoware everywhere | Autoware.Auto, bridge with .AI and Dragonboard-845c
- Part 6 - 96boards: Autoware everywhere | Binding Autoware.AI nodes to CPUs
- Part 7 - 96boards: Autoware everywhere | Defaulting to Cyclone DDS
- Part 8 - 96boards: Autoware everywhere | First look at AutoCore's PCU
- Part 9 - 96boards: Autoware everywhere | meta-arm-autonomy in AutoCore's PCU
- Part 10 - 96boards: Autoware everywhere | Running Cyclone DDS on Kubernetes
- Part 11 - 96boards: Autoware everywhere | Xenomai on PCU
- Part 12 - 96boards: Autoware everywhere | K8s-based Autoware deployment on PCU
- Part 13 - 96boards: Autoware everywhere | Autoware.Auto 3D Perception Stack using k8s on PCU
- Part 14 - 96boards: Autoware everywhere | Multi-board Autoware.Auto 3D Perception Stack using k8s
- Part 15 - 96boards: Autoware everywhere | Updating Autoware.Auto 3D Perception Stack modules