
Running ROS on multiple machines


Introduction​

ROS allows you to run nodes on a single robot or across dozens of robots, as long as your devices are on the same network.

Here you will learn how to configure ROS to work on multiple computers. You will use this configuration to set up a system consisting of a robot and a computer, or of two computers (Gazebo version), which performs the task of searching for an object.

In this tutorial you will need a computer and a robot with the same equipment as in the previous tutorials.

If you are working with the Gazebo simulator, it is also possible to set the system up across multiple computers. To do so, you need two computers with ROS installed.

Connecting ROS powered robots in LAN​

To run ROS on multiple machines, first connect them to the same LAN.

Remember that only one device can run the ROS master. For example, if you are connecting a robot and a laptop, run roscore on only one of those two devices.

While working on multiple machines, you need only one roscore running. Let's assume we are connecting a robot and a laptop with the following IP addresses:

  • robot: 192.168.1.1 (running roscore - ROS Master)
  • laptop: 192.168.1.2

The steps are as follows:

1st device​

On the robot, open the .bashrc file and find the lines:

export ROS_MASTER_URI=http://master:11311
export ROS_IPV6=on

and replace them with:

export ROS_MASTER_URI=http://192.168.1.1:11311
export ROS_IP=192.168.1.1
#export ROS_MASTER_URI=http://master:11311
#export ROS_IPV6=on

2nd device​

On the laptop, open the .bashrc file and find the lines:

export ROS_MASTER_URI=http://master:11311
export ROS_IPV6=on

and replace them with:

export ROS_MASTER_URI=http://192.168.1.1:11311
export ROS_IP=192.168.1.2
#export ROS_MASTER_URI=http://master:11311
#export ROS_IPV6=on
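With both files edited, you can apply and sanity-check the configuration in your current shell. The snippet below uses the example addresses from this tutorial (laptop side); substitute your own, and on the robot set ROS_IP to the robot's address instead:

```shell
# Apply the new configuration in the current shell (laptop example;
# on the robot, ROS_IP would be 192.168.1.1 instead).
export ROS_MASTER_URI=http://192.168.1.1:11311
export ROS_IP=192.168.1.2

# Sanity check: both variables should print the addresses you expect.
echo "ROS_MASTER_URI=$ROS_MASTER_URI"
echo "ROS_IP=$ROS_IP"
```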
tip

After any change to the .bashrc file, always source it with . ~/.bashrc or reopen the terminal.

tip

Remember that roscore must be running on the device indicated as the ROS master!

Connecting ROS powered robot over the Internet​

As mentioned in the previous section, you can run ROS nodes on dozens of devices, as long as they are in the same LAN. That means you can run a single ROS system containing your laptop and a robot connected to the same Wi-Fi router, but you can't if the robot is connected over LTE.

You can solve this issue by using a VPN service. Standard VPNs, however, are designed for purposes other than mobile robotics and introduce some drawbacks to your system, such as:

  • Maintenance of a VPN server
  • High latency (all traffic needs to go through the VPN server)
  • Long reconfiguration time (e.g. when your robot switches to a different Wi-Fi hotspot)
  • Lack of "ROS awareness"

Fortunately, there is a VPN service designed with ROS in mind - the Husarnet VPN Client. Husarnet is an open-source, peer-to-peer VPN, which means you can connect your robots directly over the Internet, with no central VPN server for data exchange. All traffic is encrypted using ephemeral keys, and it works with ROS out of the box. Just install the Husarnet Client on your laptop and robot, connect them to the same Husarnet network using the online dashboard, and enjoy a low-latency connection between your devices :)

Using Husarnet is easy: create a free account at https://app.husarnet.com, install the Husarnet Client on both devices, and add them to the same Husarnet network using the online dashboard.

At this point you should have your robot and laptop connected to the same Husarnet network, with the following Husarnet hostnames (which can be used instead of Husarnet IPv6 addresses):

  • robot: my-robot (running roscore - ROS Master)
  • laptop: my-laptop

The steps to configure ROS are as follows:

.bashrc configuration​

On both the robot and the laptop, open the .bashrc file and set the following configuration:

export ROS_MASTER_URI=http://master:11311
export ROS_IPV6=on

In the online dashboard, make sure to select the ROS master checkbox in the settings of the my-robot device. This will allow you to reach the my-robot device using both the my-robot and master hostnames (the latter is the one used in our configuration).

ROS master option in Husarnet online dashboard

As you can see, this is even easier than the previous (LAN-only) configuration, because you don't need to know the IP addresses of the devices, only the easy-to-remember hostname master.
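As a quick sanity check (assuming the Husarnet Client is installed and both devices have joined the network), you can verify that the master hostname resolves and is reachable before starting any ROS nodes:

```shell
# Husarnet assigns IPv6 addresses, so test reachability with ping6 (or ping -6).
ping6 -c 3 master

# With roscore running on my-robot, the laptop should now list its topics.
rostopic list
```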

More about Husarnet VPN​

Husarnet is open source and free for up to 5 devices. It works not only with ROS, but also with ROS 2.

OK, at this stage you should have your laptop and robot connected over LAN or over the Internet. Let's launch some demo nodes to see how the connection works in practice.

Demo 1: see a real-time video stream from RGB-D camera​

tip

A cheat sheet for running the ROS nodes / launch files used below:

  • astra.launch:
roslaunch astra_launch astra.launch
  • rqt_graph:
rosrun rqt_graph rqt_graph
  • image_view:
rosrun image_view image_view image:=/camera/rgb/image_raw

Running on a Physical Robot & a Laptop​

robot part​

  1. Set up .bashrc with the configuration for the ROS master

  2. Launch roscore

  3. Launch astra.launch

laptop part​

  1. Set up .bashrc with the configuration for a normal device

  2. Check if the connection works with:

rostopic list

  3. Run rqt_graph to verify that you can see all nodes from both devices.

  4. Run image_view to see a real-time video from the robot
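The whole session above can be sketched as the following commands, run in separate terminals (topic names follow this tutorial's configuration):

```shell
# --- on the robot (ROS master) ---
roscore
roslaunch astra_launch astra.launch

# --- on the laptop ---
rostopic list                                              # should show the /camera/... topics
rosrun rqt_graph rqt_graph                                 # nodes from both machines visible
rosrun image_view image_view image:=/camera/rgb/image_raw  # live video stream
```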

Running on two laptops (one with Gazebo simulation)​

1st laptop (gazebo)​

  1. Set up .bashrc with the configuration for the ROS master

  2. Launch the Gazebo simulation:

roslaunch tutorial_pkg tutorial_4.launch use_gazebo:=true teach:=true recognize:=false

  3. Launch the astra.launch and image_view nodes

2nd laptop part​

  1. Set up .bashrc with the configuration for a normal device

  2. Check if the connection works with:

rostopic list

  3. Run rqt_graph to verify that you can see all nodes from both devices.

  4. Run image_view to see a real-time video from the robot

Demo 2: Outsourcing Robot Computing Power to Laptop​

In this section we will configure the robot and the computer so that part of the robot's computation is offloaded to the laptop. The node we are going to move off the robot is responsible for image processing: the robot publishes the camera stream, and object recognition runs on the laptop (using the find_object_2d node).

Such a use case is common, especially if we need the computer's graphics hardware, which is usually much more powerful than the robot's.

About demo​

We will follow the pattern from the previous tutorial, but this time using two machines (a ROSbot and a laptop). Almost anything can serve as the object to recognize, but remember that the more edges and contrasting colors it has, the more easily it will be recognized. A piece of paper with something drawn on it is enough for this tutorial.

  1. Start roscore: open a terminal on your laptop and type roscore.

  2. On the ROSbot, run astra.launch, the image transport, and the bridge to CORE2.

    For the connection to CORE2 we will use the rosbot_ekf package.

    To launch the rosserial communication and the Kalman filter for the mbed firmware, run:

    roslaunch rosbot_ekf all.launch

    For the PRO version, add the parameter:

    roslaunch rosbot_ekf all.launch rosbot_pro:=true
  3. On the laptop we will also launch the image transport and the find_object_2d node.

The image transport node provides compressed image delivery from one device to another: on the sending machine you run it to compress the stream, and on the receiving machine you run it to decompress the images. Earlier we used image_view without any compression, which is enough just to see whether things work, but if you want to implement real-time image processing over the network, compressing the image is necessary.
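The compression pipeline described above can be tried by hand with two republish calls, one on each machine; the topic names match the ones used in this tutorial:

```shell
# On the sending machine (ROSbot): compress the raw camera stream.
rosrun image_transport republish raw in:=/camera/rgb/image_raw compressed out:=/rgb_republish

# On the receiving machine (laptop): decompress it back into a raw topic.
rosrun image_transport republish compressed in:=/rgb_republish raw out:=/rgb_raw
```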

Running a demo on ROSbot (physical) & laptop​

Create a launch file under tutorial_pkg/launch and name it tutorial_5_rosbot.launch:

<launch>

  <arg name="rosbot_pro" default="false" />

  <arg name="teach" default="true"/>
  <arg name="recognize" default="false"/>

  <include file="$(find rosbot_ekf)/launch/all.launch">
    <arg name="rosbot_pro" value="$(arg rosbot_pro)" />
  </include>

  <include file="$(find astra_launch)/launch/astra.launch"/>

  <node pkg="tf" type="static_transform_publisher" name="camera_publisher" args="-0.03 0 0.18 0 0 0 base_link camera_link 100" />

  <node pkg="image_transport" type="republish" name="rgb_compress" args="raw in:=/camera/rgb/image_raw compressed out:=/rgb_republish"/>

</launch>

And a launch file for the PC; name it tutorial_5_pc.launch:

<launch>

  <arg name="teach" default="true"/>
  <arg name="recognize" default="false"/>

  <node name="teleop_twist_keyboard" pkg="teleop_twist_keyboard" type="teleop_twist_keyboard.py" output="screen"/>

  <node pkg="image_transport" type="republish" name="rgb_decompress" args="compressed in:=/rgb_republish raw out:=/rgb_raw">
    <param name="compressed/mode" value="color"/>
  </node>

  <node pkg="find_object_2d" type="find_object_2d" name="find_object_2d">
    <remap from="image" to="/rgb_raw"/>
    <param name="gui" value="$(arg teach)"/>
    <param if="$(arg recognize)" name="objects_path" value="$(find tutorial_pkg)/image_rec/"/>
  </node>

</launch>
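With both files in place, the demo is started by launching one file on each machine (this assumes the tutorial_pkg containing these files is built and sourced on both devices, with .bashrc configured as described earlier):

```shell
# On the ROSbot (configured as the ROS master):
roslaunch tutorial_pkg tutorial_5_rosbot.launch

# On the laptop:
roslaunch tutorial_pkg tutorial_5_pc.launch
```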

Use rqt_graph to see how the system is working now.

That's all: you don't have to worry about anything more than the proper configuration of .bashrc; the rest stays the same. If you have more than two devices, the pattern is the same - ROS takes care of all data exchange. Really simple, isn't it?

Running a demo on ROSbot (Gazebo) & laptop​

The Gazebo node will be running on one computer (let's call it the "gazebo computer"). The second machine - call it the laptop - will be responsible for image processing.

On the "gazebo computer", use the following launch file and save it as tutorial_5_gazebo_1.launch:

<launch>

  <arg name="teach" default="false"/>
  <arg name="recognize" default="true"/>

  <arg if="$(arg teach)" name="chosen_world" value="rosbot_world_teaching"/>
  <arg if="$(arg recognize)" name="chosen_world" value="rosbot_world_recognition"/>

  <include file="$(find rosbot_gazebo)/launch/$(arg chosen_world).launch"/>

  <include file="$(find rosbot_description)/launch/rosbot_gazebo.launch"/>

  <node pkg="image_transport" type="republish" name="rgb_compress" args="raw in:=/camera/rgb/image_raw compressed out:=/rgb_republish"/>

</launch>

On the laptop, use this launch file and save it as tutorial_5_gazebo_2.launch:


<launch>

  <arg name="teach" default="true"/>
  <arg name="recognize" default="false"/>

  <node name="teleop_twist_keyboard" pkg="teleop_twist_keyboard" type="teleop_twist_keyboard.py" output="screen"/>

  <node pkg="image_transport" type="republish" name="rgb_decompress" args="compressed in:=/rgb_republish raw out:=/rgb_raw">
    <param name="compressed/mode" value="color"/>
  </node>

  <node pkg="find_object_2d" type="find_object_2d" name="find_object_2d">
    <remap from="image" to="/rgb_raw"/>
    <param name="gui" value="$(arg teach)"/>
    <param if="$(arg recognize)" name="objects_path" value="$(find tutorial_pkg)/image_rec/"/>
  </node>

</launch>
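The pair above can be started as follows (assuming tutorial_pkg is built and sourced on both machines, with the "gazebo computer" configured as the ROS master):

```shell
# On the "gazebo computer" (ROS master):
roslaunch tutorial_pkg tutorial_5_gazebo_1.launch

# On the laptop:
roslaunch tutorial_pkg tutorial_5_gazebo_2.launch
```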

Execute both launch files and open rqt_graph to see how the system is working now.

Summary​

After completing this tutorial you should be able to configure your CORE2 devices to work together and exchange data with each other. You should also know how to program robots to perform tasks in cooperation.


by Łukasz Mitka, Adam Krawczyk and Dominik Nowak, Husarion

Do you need any support with completing this tutorial, or are you having any difficulties with software or hardware? Feel free to describe your thoughts on our community forum: https://community.husarion.com/ or contact our support at support@husarion.com.