RB5 comes with an Inertial Measurement Unit (IMU). An IMU estimates linear acceleration and angular velocity. These measurements enable applications such as motion estimation and visual-inertial SLAM, which are essential to many robotics tasks. The IMU can be accessed through a ROS2 node.
Assuming you have ROS2 Dashing installed, you can run the IMU node.
```shell
# source the ROS Dashing environment
source /opt/ros/dashing/setup.bash

# run the node
ros2 run imu-ros2node imu-ros2node
```
Check the published IMU messages in a new terminal.
```shell
# In a new terminal, source the environment.
source /opt/ros/dashing/setup.bash

ros2 topic list       # list the topics
ros2 topic echo /imu  # print the messages
```
The LU build outlined in the bring-up process comprises a minimal Ubuntu 18.04 installation. For this reason, various device kernel modules need to be built from source and loaded. In this tutorial, we document the process of building and loading the kernel modules for a USB joystick (joydev) and USB-over-serial (ch341). The source code for these modules is open source and available as part of the Linux kernel. We suggest using a USB-C cable to connect your RB5 to your computer so that you can copy large blocks of code over. Make sure you check that the formatting is preserved.
joydev
This tutorial uses kernel version 4.19.125. If your system runs a different version, you can find the version that matches your kernel with uname -r.
Download the kernel source, extract the code associated with this module, and copy it into a temporary directory; for example, we use the directory joydev.
```shell
wget https://cdn.kernel.org/pub/linux/kernel/v4.x/linux-4.19.125.tar.gz
tar xvzf linux-4.19.125.tar.gz

mkdir joydev
cp -r linux-4.19.125/drivers/input/* joydev/ && cd joydev/
```
The following code needs to be appended to the end of the Makefile that was copied into the temporary directory joydev.
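The exact Makefile lines are not reproduced in this copy of the tutorial. As a hedged sketch, the standard kbuild pattern for building a single out-of-tree module looks like the following; the kernel directory path and target names here are the conventional ones and may differ from the original tutorial.

```makefile
# Hypothetical sketch of the standard out-of-tree kbuild stanza;
# the original tutorial's exact Makefile lines may differ.
obj-m += joydev.o

KDIR := /lib/modules/$(shell uname -r)/build

all:
	$(MAKE) -C $(KDIR) M=$(PWD) modules

clean:
	$(MAKE) -C $(KDIR) M=$(PWD) clean
```

With a stanza like this in place, running make in the joydev directory produces joydev.ko, which can then be loaded with sudo insmod joydev.ko and verified with lsmod | grep joydev.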
A key sensor for many robotics applications is the camera. It enables applications such as object detection, semantic segmentation, and visual SLAM. There are two cameras on the Qualcomm Robotics RB5.
This tutorial covers a few ways to access these cameras. Before we start, note that these cameras cannot be read by OpenCV directly, but a tool called GStreamer can bridge the gap.
OpenCV access through GStreamer and TCP
The easiest way to access the camera is through a TCP port created by GStreamer. You can then use OpenCV to read data from that port.
On Qualcomm Robotics RB5, run the following command:
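The command itself is missing from this copy of the tutorial. As a hedged sketch of the kind of pipeline typically used on the RB5, the following streams the main camera over TCP; qtiqmmfsrc is Qualcomm's camera source element, and the caps, host, and port values shown here are placeholder assumptions that must be adapted to your setup (they may differ from the original tutorial's command).

```
# Hypothetical sketch; the original tutorial's exact pipeline may differ.
# camera=0 selects the main RGB camera; host must be the RB5's own IP address.
gst-launch-1.0 qtiqmmfsrc camera=0 ! \
    "video/x-h264,width=1280,height=720,framerate=30/1" ! \
    h264parse ! mpegtsmux ! \
    tcpserversink host=192.168.1.120 port=8900
```

Encoding to H.264 and muxing into an MPEG-TS stream keeps the TCP stream in a container format that OpenCV's FFmpeg backend can decode.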
Note that you will need to change the host IP to your Qualcomm Robotics RB5's IP address, which you can find by running the following command.
```shell
sudo apt install net-tools  # if you don't have ifconfig
ifconfig
```
The IP address of the Qualcomm Robotics RB5 appears after inet, as something like 192.168.0.xxx.
Then you can access the camera with the OpenCV library. A Python example is given below.
```python
import cv2

# rb5 IP & port (same as in the GStreamer command)
cap = cv2.VideoCapture("tcp://192.168.1.120:8900")

while True:
    ret, frame = cap.read()
    if not ret:  # stop if no frame was received
        break
    cv2.imwrite("captured_image_opencv.jpg", frame)  # you can process the image via frame
    break

cap.release()
```
Again, make sure you change the host IP to your Qualcomm Robotics RB5's IP.
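Each frame returned by cap.read() is a plain NumPy array (height x width x 3, in BGR channel order), so any array operation can serve as per-frame processing. As a minimal, hypothetical example not taken from the tutorial, a grayscale conversion using the BT.601 luma weights:

```python
import numpy as np

def bgr_to_gray(frame):
    """Convert a BGR uint8 frame to grayscale with BT.601 luma weights."""
    b = frame[..., 0].astype(np.float32)
    g = frame[..., 1].astype(np.float32)
    r = frame[..., 2].astype(np.float32)
    return (0.114 * b + 0.587 * g + 0.299 * r).astype(np.uint8)

# Stand-in for a captured frame: a solid mid-gray 720p image.
frame = np.full((720, 1280, 3), 128, dtype=np.uint8)
gray = bgr_to_gray(frame)
print(gray.shape)       # (720, 1280)
print(int(gray[0, 0]))  # 128 (all channels equal, so the luma is unchanged)
```

The same function applies unchanged to frames read from the TCP stream above.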
Accessing the camera using ROS or ROS2 packages
Another way is to use the ROS packages we provide for both ROS1 and ROS2.
We provide launch files for both the ROS and ROS2 packages so that it is easy to configure the camera node.
Access the Camera in ROS1
For the ROS1 package, first clone it into your workspace.
```shell
# If you don't already have a workspace, create one first.
mkdir -p rosws/src

# Go to the source folder
cd rosws/src

# Clone the repository
git clone https://github.com/AutonomousVehicleLaboratory/rb5_ros
```
Then, build your package.
```shell
# Return to the root folder of the workspace.
cd ..

# Include the ROS tools; this assumes you have ROS1 Melodic installed.
source /opt/ros/melodic/setup.bash

# Build only this package
catkin_make --only-pkg-with-deps rb5_vision
```
Then you can run the package. For example, you can run the RGB camera by
```shell
# source the ros workspace
source devel/setup.bash

# start the program with a set of parameters in the launch file
roslaunch rb5_vision rb_camera_main_ocv.launch
```
This will publish images to the topic /camera_0.
Start a new terminal and check with the following commands.
```shell
source /opt/ros/melodic/setup.bash

rostopic list          # list all the topics
rostopic hz /camera_0  # get the frequency of the topic
```
Finally, you can stop the process by pressing Ctrl + C in the terminal where it is running. Note that it may take a few seconds to stop.
Similarly, you can run the tracking camera by
```shell
roslaunch rb5_vision rb_camera_side_ocv.launch
```
And this will publish images to the topic /camera_1.
Access the camera in ROS2
For the ROS2 package, first clone it into your workspace.
```shell
# If you don't already have a workspace, create one first.
mkdir -p ros2ws/src

# Go to the source folder
cd ros2ws/src

# Clone the repository
git clone https://github.com/AutonomousVehicleLaboratory/rb5_ros2
```
Then, build your package.
```shell
# Return to the root folder.
cd ..

# Include the ROS tools; this assumes you have ROS2 Dashing installed.
source /opt/ros/dashing/setup.bash

# Build only this package
colcon build --packages-select rb5_ros2_vision
```
Then you can run the package.
```shell
# Source the ROS2 workspace
source install/setup.bash

# Run the RGB camera
ros2 launch rb5_ros2_vision rb_camera_main_ocv_launch.py

# Or run the tracking camera
ros2 launch rb5_ros2_vision rb_camera_side_ocv_launch.py
```
Again, you can verify that messages are being published by running the following commands in another new terminal.
```shell
source /opt/ros/dashing/setup.bash

ros2 topic list          # check if the topic is being published
ros2 topic hz /camera_0  # check the frequency of the RGB image messages
```
A set of Robotics Tutorials developed for the RB5 Robotics Development Platform from Qualcomm. Authors are from the Contextual Robotics Institute at UC San Diego.
Contributors
Henrik I. Christensen, David Paz, Henry Zhang, Anirudh Ramesh.