The RB5 comes with an Inertial Measurement Unit (IMU), which measures linear acceleration and angular velocity. These measurements enable applications such as motion estimation and visual-inertial SLAM, which are essential to many robotics tasks. The IMU can be accessed through a ROS2 node.

Assuming you have ROS2 Dashing installed, you can run the IMU node.

# source the ros dashing environment
source /opt/ros/dashing/setup.bash

# run the node
ros2 run imu-ros2node imu-ros2node

Check the published IMU messages in a new terminal.

# In a new terminal, source the environment.
source /opt/ros/dashing/setup.bash

ros2 topic list # list the topic
ros2 topic echo /imu # print the messages
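
If you want to consume the IMU data in your own node rather than just echoing it, a minimal rclpy subscriber could look like the sketch below. It assumes the driver publishes standard sensor_msgs/msg/Imu messages on /imu (the topic echoed above); adjust the topic or message type if your setup differs.

# Minimal rclpy subscriber sketch for the /imu topic (assumed sensor_msgs/msg/Imu).
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu


class ImuListener(Node):
    def __init__(self):
        super().__init__('imu_listener')
        # Queue depth of 10 is an arbitrary choice.
        self.create_subscription(Imu, '/imu', self.callback, 10)

    def callback(self, msg):
        a = msg.linear_acceleration
        w = msg.angular_velocity
        self.get_logger().info(
            'accel=(%.3f, %.3f, %.3f) gyro=(%.3f, %.3f, %.3f)'
            % (a.x, a.y, a.z, w.x, w.y, w.z))


def main():
    rclpy.init()
    node = ImuListener()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()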


The LU build outlined in the bring-up process comprises a minimal Ubuntu 18.04 installation. For this reason, various device kernel modules need to be built from source and loaded. In this tutorial, we document the process of building and loading the kernel modules for a USB joystick (joydev) and USB over serial (ch341). The source code for these modules is open source and available as part of the Linux kernel. We suggest using the USB-C cable to connect your RB5 to the computer so that you can copy large blocks of code over. Make sure you check that the formatting is preserved.

joydev

The kernel version used in this tutorial is 4.19.125. If your device runs a different kernel, find its version with uname -r and download the matching source.

Download and extract the kernel source, then copy the code for this module to a temporary directory; for example, we use the directory joydev.

wget https://cdn.kernel.org/pub/linux/kernel/v4.x/linux-4.19.125.tar.gz
tar xvzf linux-4.19.125.tar.gz

mkdir joydev
cp -r linux-4.19.125/drivers/input/* joydev/ && cd joydev/

The following needs to be appended to the end of the Makefile that was copied into joydev.

KVERS = $(shell uname -r)

# kernel modules
obj-m := joydev.o

EXTRA_CFLAGS=-g -O0 -Wno-vla -Wframe-larger-than=4496

build: kernel_modules

kernel_modules:
	make -C /usr/src/header M=$(CURDIR) modules

clean:
	make -C /usr/src/header M=$(CURDIR) clean

Build and Load kernel module

make
insmod joydev.ko

To avoid having to load the module manually every time, create a script outside the directory that installs it into the kernel module tree.

cd ..
vim joydev.sh

then copy the following script into the file.

#!/bin/bash

KERNEL_VERSION=$(uname -r)
MODINFO=$(modinfo ./joydev/joydev.ko | grep vermagic)
MODULE_VERSION=$(echo $MODINFO | cut -d " " -f 2)

if [ "$KERNEL_VERSION" != "$MODULE_VERSION" ]
then
    echo "Versions incompatible"
    echo ".ko file compiled with " $MODULE_VERSION
    echo "System kernel is " $KERNEL_VERSION
else
    mkdir -p /lib/modules/$(uname -r)/kernel/drivers/input/
    cp ./joydev/joydev.ko /lib/modules/$(uname -r)/kernel/drivers/input/
    depmod -a
    echo "JOYDEV loaded"
fi

Save the file and execute it with

bash joydev.sh

The joydev module will be copied into the kernel directory and dynamically loaded when a joystick device is found.
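
If you prefer a scripted sanity check after running joydev.sh, the small Python sketch below is a convenience only (not part of the setup): it verifies the .ko was copied into the module tree, checks /proc/modules to see whether joydev is currently loaded, and lists any joystick device nodes.

# Sanity-check sketch: verify joydev.ko was installed, whether it is
# currently loaded, and whether any joystick device nodes exist.
import glob
import os
import subprocess

kver = subprocess.check_output(['uname', '-r']).decode().strip()
ko_path = '/lib/modules/%s/kernel/drivers/input/joydev.ko' % kver

print('joydev.ko installed:', os.path.exists(ko_path))

with open('/proc/modules') as f:
    loaded = any(line.startswith('joydev ') for line in f)
print('joydev currently loaded:', loaded)

# Device nodes only appear once a joystick is actually plugged in.
print('joystick devices:', glob.glob('/dev/input/js*'))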

ch341

Extract the source code for this module and copy it to a temporary directory, in this case ch341.

# Skip these two steps if you already did it
wget https://cdn.kernel.org/pub/linux/kernel/v4.x/linux-4.19.125.tar.gz
tar xvzf linux-4.19.125.tar.gz

mkdir ch341
cp -r linux-4.19.125/drivers/usb/serial/* ch341 && cd ch341

The following needs to be appended to the end of the Makefile that was copied into ch341.

KVERS = $(shell uname -r)

# kernel modules
obj-m := ch341.o

EXTRA_CFLAGS=-g -O0 -Wno-vla

build: kernel_modules

kernel_modules:
	make -C /usr/src/header M=$(CURDIR) modules

clean:
	make -C /usr/src/header M=$(CURDIR) clean

Build and Load kernel module

make
insmod ch341.ko

As before, to avoid having to load the module manually every time, create a script outside the directory.

cd ..
vim ch341.sh

then copy the following script into the file.

#!/bin/bash

KERNEL_VERSION=$(uname -r)
MODINFO=$(modinfo ./ch341/ch341.ko | grep vermagic)
MODULE_VERSION=$(echo $MODINFO | cut -d " " -f 2)

if [ "$KERNEL_VERSION" != "$MODULE_VERSION" ]
then
    echo "Versions incompatible"
    echo ".ko file compiled with " $MODULE_VERSION
    echo "System kernel is " $KERNEL_VERSION
else
    mkdir -p /lib/modules/$(uname -r)/kernel/drivers/usb/serial/
    cp ./ch341/ch341.ko /lib/modules/$(uname -r)/kernel/drivers/usb/serial/
    depmod -a
    echo "CH341 loaded"
fi

Save the file and execute it with

bash ch341.sh
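
Once the module is installed, a CH341 USB-serial adapter should enumerate as a /dev/ttyUSB* device when plugged in. As a quick sanity check you could open it from Python; the sketch below assumes pyserial is installed (for example via pip3 install pyserial) and that the adapter shows up as /dev/ttyUSB0 at 115200 baud, so adjust these to your device.

# Quick test sketch: list USB-serial devices and read one line from the port.
# Assumes pyserial is installed and the adapter is /dev/ttyUSB0 at 115200 baud.
import glob

import serial  # pip3 install pyserial

print('USB serial devices:', glob.glob('/dev/ttyUSB*'))

ser = serial.Serial('/dev/ttyUSB0', 115200, timeout=1)
line = ser.readline()  # returns empty bytes if nothing arrives within the timeout
print('read:', line)
ser.close()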

The Makefiles and kernel modules can be found on GitHub.


A key sensor for many robotics applications is the camera. It enables applications such as object detection, semantic segmentation, and visual SLAM. There are two cameras on the Qualcomm Robotics RB5.

This tutorial covers a few ways to access these cameras. Before we start, note that these cameras cannot be read by OpenCV directly, but a tool called GStreamer can bridge the gap.

OpenCV access through GStreamer and TCP

The easiest way to access a camera is through a TCP port served by GStreamer; OpenCV can then read the stream from that port.

On Qualcomm Robotics RB5, run the following command:

gst-launch-1.0 -e qtiqmmfsrc name=qmmf ! video/x-h264,format=NV12,width=1280,height=720,framerate=30/1 ! h264parse config-interval=1 ! mpegtsmux name=muxer ! queue ! tcpserversink port=8900 host=192.168.1.120

Note that you will need to change the host IP to your Qualcomm Robotics RB5's IP address, which you can find by running the following command.

sudo apt install net-tools # if you don't have ifconfig
ifconfig

The IP address of the Qualcomm Robotics RB5 appears after inet, as something like 192.168.0.xxx.

Then you can access the camera with the OpenCV library. A Python example is given below.

import cv2

cap = cv2.VideoCapture("tcp://192.168.1.120:8900")  # RB5 IP & port (same as in the command above)
while True:
    ret, frame = cap.read()
    cv2.imwrite("captured_image_opencv.jpg", frame)
    # you can process the image using frame
    break
cap.release()

Again, make sure you change the host IP to your Qualcomm Robotics RB5's IP address.
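
The snippet above grabs a single frame. For continuous processing, a loop along the following lines should work (same assumed IP and port); it prints an approximate frame rate and stops after a fixed number of frames.

import time

import cv2

cap = cv2.VideoCapture("tcp://192.168.1.120:8900")  # your RB5 IP & port

count = 0
start = time.time()
while count < 300:  # stop after roughly 10 seconds at 30 fps
    ret, frame = cap.read()
    if not ret:
        break  # stream ended or connection dropped
    count += 1
    # process frame here, e.g. resize, detect, etc.
    if count % 30 == 0:
        fps = count / (time.time() - start)
        print("frames: %d, approx fps: %.1f" % (count, fps))
cap.release()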

Accessing the camera using ROS or ROS2 packages

Another way is to use the packages we provide for both ROS1 and ROS2.

ROS1: https://github.com/AutonomousVehicleLaboratory/rb5_ros
ROS2: https://github.com/AutonomousVehicleLaboratory/rb5_ros2

We provide launch files for both the ROS and ROS2 packages so that it is easy to configure the camera node.

Access the Camera in ROS1

For the ROS1 package, first clone it into your workspace.

# If you don't already have a workspace, create one first.
mkdir -p rosws/src

# Go to the source folder
cd rosws/src

# Clone the repository
git clone https://github.com/AutonomousVehicleLaboratory/rb5_ros

Then, build your package.

# Return to the root folder of the workspace.
cd ..

# Include the ROS tools; this assumes you have ROS1 Melodic installed
source /opt/ros/melodic/setup.bash

# Build only this package
catkin_make --only-pkg-with-deps rb5_vision

Then you can run the package. For example, you can run the RGB camera by

# source the ros workspace
source devel/setup.bash

# start the program with a set of parameters in the launch file
roslaunch rb5_vision rb_camera_main_ocv.launch

This will publish images to the topic /camera_0.
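
If you want to consume these images in your own node, a minimal rospy subscriber could look like the sketch below. It assumes the launch file publishes sensor_msgs/Image messages on /camera_0 and that cv_bridge is available; adjust the encoding if your images are not BGR.

#!/usr/bin/env python
# Minimal rospy subscriber sketch for the /camera_0 image topic.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()


def callback(msg):
    # Convert the ROS Image message to an OpenCV BGR array.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    rospy.loginfo('got frame %dx%d', frame.shape[1], frame.shape[0])


if __name__ == '__main__':
    rospy.init_node('camera_listener')
    rospy.Subscriber('/camera_0', Image, callback, queue_size=1)
    rospy.spin()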

Start a new terminal and check with the following commands.

source /opt/ros/melodic/setup.bash

rostopic list # list all the topics
rostopic hz /camera_0 # get the frequency of the topic

Finally, you can stop the process by pressing Ctrl + C in the terminal where it is running. Note that it may take a few seconds to stop.

Similarly, you can run the tracking camera by

roslaunch rb5_vision rb_camera_side_ocv.launch

And this will publish images to the topic /camera_1.

Access the camera in ROS2

For the ROS2 package, first clone it into your workspace.

# If you don't already have a workspace, create one first.
mkdir -p ros2ws/src

# Go to the source folder
cd ros2ws/src

# Clone the repository
git clone https://github.com/AutonomousVehicleLaboratory/rb5_ros2

Then, build your package.

# Return to the root folder.
cd ..

# Include the ROS tools; this assumes you have ROS2 Dashing installed
source /opt/ros/dashing/setup.bash

# Build only this package
colcon build --packages-select rb5_ros2_vision

Then you can run the package.

# Source the ROS2 workspace
source install/setup.bash

# Run the RGB camera
ros2 launch rb5_ros2_vision rb_camera_main_ocv_launch.py

# Or run the tracking camera
ros2 launch rb5_ros2_vision rb_camera_side_ocv_launch.py

Again, you can verify that the messages are being published by running the following commands in a new terminal.

source /opt/ros/dashing/setup.bash
ros2 topic list # check that the topic is being published
ros2 topic hz /camera_0 # check the frequency of the RGB image message
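
As with ROS1, you can also subscribe to the images from your own ROS2 node. The sketch below assumes sensor_msgs/msg/Image messages on /camera_0 and simply logs the resolution and encoding of each frame.

# Minimal rclpy subscriber sketch for the /camera_0 image topic.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class CameraListener(Node):
    def __init__(self):
        super().__init__('camera_listener')
        self.create_subscription(Image, '/camera_0', self.callback, 10)

    def callback(self, msg):
        self.get_logger().info(
            'image %dx%d encoding=%s' % (msg.width, msg.height, msg.encoding))


def main():
    rclpy.init()
    node = CameraListener()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()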




RB5 ROBOTICS TUTORIALS

A set of Robotics Tutorials developed for the RB5 Robotics Development Platform from Qualcomm. Authors are from the Contextual Robotics Institute at UC San Diego.

Contributors

Henrik I. Christensen, David Paz, Henry Zhang, Anirudh Ramesh.