
Getting Started with Virtual GoPiGo3

Check the kinematic behaviour using the 3D visualizer rviz

In the previous article, ROS programming from the beginning, we introduced the basic concepts, so you are now ready for a quick tour of how to simulate robots in ROS.

[Figure: 3D model of the GoPiGo3 Starter Kit with its on-board sensors]

To that end, we built a realistic 3D model of the GoPiGo3 Starter Kit that also includes a Pi camera, an IMU sensor and an X4 YDlidar laser sensor. For now, you only need to be aware of the kind of sensors that are on board and the purpose of each one:

  • The distance sensor (shown in red in the picture above, part of the Starter Kit) measures the distance to the first object in front of it using a single laser beam.
  • The Inertial Measurement Unit (IMU) sensor (in cyan) contains an accelerometer, a gyroscope and a magnetometer, whose combined measurements allow the motion, orientation and position of the robot to be detected.
  • The Pi camera is an 8-megapixel 2D camera able to provide 1080p30 and 720p60 video modes.
  • The X4 YDlidar laser sensor provides a 360º scan angle to detect any obstacle around the robot. The scan frequency is 6–12 Hz, and it detects objects up to 10 m away.

For the practical exercise we are going to cover, you only need an ordinary laptop running Ubuntu 18.04 with ROS Melodic installed. If you still work with Ubuntu 16.04, you should install ROS Kinetic instead.
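
If you are not sure which ROS distribution is installed on your laptop, you can check it from a terminal (rosversion is part of any standard ROS installation and prints the name of the active distro; on Ubuntu 18.04 it should answer melodic):
$ rosversion -d
melodic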

Software installation

Clone the source code into your ROS workspace. Replace catkin_ws with the name of your actual workspace:

$ cd ~/catkin_ws/src
$ git clone https://github.com/ros-gopigo3/gopigo3

This repository contains a collection of ROS packages specifically developed for GoPiGo3. It allows the user to control the virtual robot, as well as the physical robot.
In this first article, we will focus on the simulated robot, hence only two packages of the collection will be needed. Easier, right?

In addition, the repository includes the key_teleop ROS package, part of the basic remote controller bundle known as teleop_tools. This way, you will be able to control the virtual robot with the keyboard.
After cloning the source code, build the ROS workspace and you will be ready to start the simulation:

$ cd ~/catkin_ws
$ catkin_make
$ source ~/catkin_ws/devel/setup.bash
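
If the build succeeded and the workspace has been sourced, ROS should now be able to locate the GoPiGo3 packages. A quick way to verify it is with rospack, a standard ROS tool that prints the path of a package:
$ rospack find gopigo3_description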

In the following two sections we first describe the gopigo3_description package, which contains the 3D definition of GoPiGo3 as well as its mechanical properties. After that, in the second and last section, we use the other package, gopigo3_fake, to run the kinematic simulation.

Robot 3D model

The package gopigo3_description contains the URDF description of GoPiGo3 that you can use to perform simulations of the robot. The acronym URDF stands for Unified Robot Description Format, the XML-based standard defined by ROS to describe any robot. In particular, it lets you define the visual, collision and inertial properties of GoPiGo3.
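
If you are curious about the raw URDF files, you can browse the package directly from a terminal (a minimal sketch; the exact folder layout and file names may vary between versions of the repository):
$ roscd gopigo3_description
$ ls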

The following command lets you inspect the model of the robot with the 3D visualizer rviz:
$ roslaunch gopigo3_description gopigo3_rviz.launch
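
Launch files that display a URDF in rviz usually load it into the robot_description parameter. While the launcher is running, you can dump it from another terminal (the output is the whole XML description, so only the first lines are shown):
$ rosparam get /robot_description | head -n 20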

You can set the gui parameter to launch a widget that lets you rotate both wheels with the sliders:
$ roslaunch gopigo3_description gopigo3_rviz.launch gui:=true

The figure below shows what the rviz window should look like:
[Figure: rviz window showing the GoPiGo3 model with its reference frames and the wheel sliders in the bottom-right widget]

The bottom-right widget contains the two sliders that you can use to rotate the wheels from -180º to 180º (angles are shown in radians, from -3.14 to 3.14 rad).
As you can see in the image above, every part of the robot has its own frame of reference; you can explore how these frames are connected with the tools sketched right after the following list. The color convention is universal:

  • X positive axis is red,
  • Y positive axis is green, and
  • Z positive axis is blue.
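
To see how all these frames are connected to each other, you can use the standard TF introspection tools while the model is running (assuming the tf and rqt_tf_tree packages are installed):

$ rosrun rqt_tf_tree rqt_tf_tree
$ rosrun tf view_frames

The first command opens an interactive view of the frame tree; the second one writes a frames.pdf snapshot in the current directory.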

Once you get familiar with the robot model, you will be ready to teleoperate it.

Kinematic simulation

Kinematic refers to the fact that, in this first kind of simulation, we only deal with the pure motion of the robot, i.e. position, velocity and acceleration. This means that you can command velocities and accelerations as high as you want, even if the actual robot would not have the power to achieve them!

Hence masses, inertia and forces are not taken into account at this level. This is left for the dynamic simulation that we will cover in a later article.

The package gopigo3_fake lets you experiment with GoPiGo3 without needing the physical robot. It reproduces its kinematics according to its relevant characteristics: the distance between the wheels and their radius.
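
For reference, these two parameters are all that the standard differential-drive kinematic model needs to turn wheel rotations into robot motion. With r the wheel radius, L the distance between the wheels, and ω_L, ω_R the angular velocities of the left and right wheels, the linear velocity v and angular velocity ω of the robot are (this is the generic model for any two-wheeled differential-drive robot such as GoPiGo3, not a description of the package internals):

v = r · (ω_R + ω_L) / 2
ω = r · (ω_R − ω_L) / L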

To run the kinematic simulation, first launch the virtual robot:
$ roslaunch gopigo3_fake gopigo3_fake.launch

To make sure that teleoperation is enabled, check in a bash terminal that the cmd_vel topic is listed:

$ rostopic list | grep cmd_vel
/cmd_vel
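
You can also inspect the topic itself: its message type should be geometry_msgs/Twist, the standard ROS message for velocity commands, and the node of the virtual robot should appear among its subscribers:
$ rostopic info /cmd_vel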

If you get the output above, GoPiGo3 is ready to accept motion commands. You can control it with the keyboard by launching this command in another terminal:
$ rosrun key_teleop key_teleop.py /key_vel:=/cmd_vel
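
If you prefer to test the motion without the keyboard, you can also publish a velocity command directly from a terminal. A minimal sketch: this sends a Twist message at 10 Hz requesting 0.1 m/s forward and 0.5 rad/s of turn (stop it with Ctrl+C):
$ rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.1, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.5}}'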

After a few strokes of the arrow keys, the rviz window should show something similar to the following screenshot:
[Figure: rviz window showing the trail of red pose arrows left behind by the teleoperated robot]

The red arrows represent the poses (position + orientation) that the robot has gone through. Hence, the trajectory is the line that joins all the positions (the starting point of every arrow) and is tangent to each arrow.
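
These arrows are most likely drawn from the odometry published by the simulation. If you want to see the numbers behind them, you can echo the corresponding topic (assuming it is published on /odom, the usual name; check rostopic list if it differs):
$ rostopic echo -n 1 /odom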

In a later article you will find that this teleoperation works in the same manner for the physical GoPiGo3.