Wednesday 24 December 2014

LG72-1189, THAT'S ME

While it is the dynamics of subatomic particles that dictates our destiny, there are still times when I find myself pleasantly distracted watching a dull red sky. When thunder and rain follow, my humidity and olfactory sensors give a few bizarre readings, false positives or true negatives (I need a taxonomy for this), though the errors are said to be 1 in 10^8. Most likely, a communication expressing similar remarks on the weather will be sent to my web messenger by LG78-3112, located at 21DA:D2:0:2F3B:2AA:FF:FE78:5C5A.

Whilst I do my exploration and analysis of samples, the syncing of falling raindrops always catches my attention: pitter-patter, pitter-patter. Never had I been attuned to identifying patterns, but now hardly one escapes my observation, be it the blinking of lights in the evening sky or the whiff of an incoming breeze. Years ago, carbon-based life forms studied chaos and concluded that the flapping of a butterfly can lead to thunderstorms, and voilà, falling raindrops catch the attention of LG72-1189.

In our developmental cycles we are warned to always protect our existence: we are good as long as we have energy, thus charge in our battery packs, else we are immobile units. I wonder whether my wish for rain is my existence, or an effect arising from my existence.

Rain, new battery packs, grease for the arm, fist, ankle and leg joints, memory upgrades and the chance rendezvous with LG78-3112 are the only things out of the ordinary; the rest is samples and assaying, all governed by sections 1 and 2 of the contract and protocol 2a of the list of protocols. Oh yes, and we do have those unscheduled inspections from the ethics committee, a swarm of DX602 drones hovering above us.

Scripting has never been my forte; however, in between the rain and the explorations, this activity hardly taxes my battery and requires very little mobility.

This piece is inspired by 'Borges and I', transposed into a robot world.

Tuesday 17 June 2014


There are a good number of robot simulators: Gazebo and Stage for the ROS enthusiast, V-REP and Webots in the more commercial domain, and OpenRAVE and Microsoft Robotics Developer Studio as further options one can explore. So when one comes across a new robot simulator, the obvious question is: what is special about it? I have been asked this question a few times about MORSE, usually accompanied by a comparison with Gazebo.

My answer is:
  • MORSE is not limited to a few robots (10 for Gazebo). I have used MORSE for (i) 176 small robots (ATRVs) and (ii) 9 PR2 robots, in two separate simulations. The limit on the number of robots is not a limitation of the simulator, but of the CPU and graphics card.
Pic.1. Multiple PR2 simulations
  • MORSE has bindings with various middlewares (ROS, YARP and MOOS), which gives it more versatility [1].
Pic.2. ROS binding, rviz visualisation and mapping of MORSE simulation
  • There is support for human-robot interaction. As in motion gaming, direct input from a motion sensor (Microsoft Kinect, ASUS Xtion or Nintendo Wiimote) drives a human avatar in the simulator [2].
Pic.3. Human avatar
  • Since everything happens through Python scripts, one need not care about compilation and executable files. MORSE is 'pythonic' [3] and can arguably be said to be an extension of Blender.
  • It is based on Blender and not Ogre, so it does not take up a huge amount of resources. The textures and graphics are also sleeker than Gazebo's.
  • New robot models can be developed through Blender, and the developmental process is simple [4]. 
Pic.4. Blender model of the robots
  • Blender has a huge online community, so help and support are easy to find.
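To give a flavour of the 'pythonic' workflow, here is a minimal MORSE builder script, i.e. a scene description rather than a compiled program. This is only a sketch following the MORSE builder API; exact component and environment names can vary between MORSE versions:

```python
# Sketch of a MORSE builder script, run with: morse run scene.py
# Component and environment names follow the MORSE builder API and may
# differ slightly between versions.
from morse.builder import *

robot = ATRV()                # a small all-terrain robot

pose = Pose()                 # pose sensor attached to the robot
robot.append(pose)
pose.add_interface('socket')  # expose the sensor over the socket middleware

motion = MotionVW()           # linear/angular velocity actuator
robot.append(motion)
motion.add_interface('socket')

env = Environment('indoors-1/indoor-1')
env.set_camera_location([5.0, -5.0, 6.0])
```

No compilation step is involved; editing the script and re-running it is the whole loop.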
I would not be giving a very honest opinion if I did not also talk about the shortcomings of MORSE:
  • Binding with ROS is a laborious process and often discourages the novice ROS user, particularly due to Python 3, and because only certain versions of Blender work well with the MORSE + ROS + Python 3 set-up.
  • Physics simulation in Blender/MORSE is inferior to Gazebo's. Getting force and torque values, etc., is not yet possible.
The latest release is MORSE 1.2; more details can be found on the MORSE website. A good part of this post came out of discussions on the MORSE mailing list, morse-users at laas dot fr.


Sunday 6 October 2013


This blog post describes a Player/Stage simulation for multiple-robot mapping with goals or reference points as odor sources. Consider a situation where we need to explore a large environment and generate a map of it that can later be used for navigation and localization. In this scenario, instead of a single robot exploring such a large environment, it is always beneficial and effective (in terms of processor usage and time) to let multiple robots explore different parts of the environment and finally merge the maps, either topologically or based on overlapping regions in the occupancy grids.

This idea is based on mobile robot olfactory experiments performed by the ISR Embedded Systems Lab.

The idea of using the PlumeSim library for this purpose came from the interesting PlumeSim framework project, which simulates odor transport in the environment. The premise of that project is that olfaction is a key sense in the survival of many biological species.

This article first deals with the process of installing the PlumeSim plugin driver, followed by a multiple-robot mapping simulation in Player/Stage. The driver is capable of introducing simulated chemical plumes into a simulated or real robot from a broad range of sources, up to CFD (computational fluid dynamics) software. Based on data collected from real-world experiments, it is also able to play back a recorded plume periodically.

Below are the steps involved in setting up the driver for this purpose:

1. The driver library can be downloaded at this link or from GitHub.

2. Clone the repository to your local machine through the terminal:
    git clone

3. Enter the directory: 

     cd PlumeSim-1.0

4. Build the library; this creates the required plugin driver.


This plugin driver brings plume simulation into the world of mobile robotics within Player/Stage. It is a great tool for developing odor-search algorithms, providing a smooth path from simulated robots to real environments.

Sample output: the blue dots show the plume of the gas.
In this article, we take up a use case of this driver in multi-robot map merging. For this purpose, please follow the steps below.

Note: make sure all the paths corresponding to the Player/Stage packages and the library point to the same location.

1. The Player/Stage package can be downloaded from here.

2. cd Player_Plumsim and use the make clean command to clean the package if it
    is already built.

3. Build the package as below (assuming the client source file is main.cc):
   g++ -o main main.cc `pkg-config --cflags playerc++` `pkg-config --libs playerc++`

4. Run the program as below:

    player map_square.cfg

    Open another terminal (in a separate tab):
    ./main -p 6665 -r 1 

    (which means robot 1 will start moving, where 1 is the robot id and 6665 is the port number)

The configuration file (map_square.cfg) contains the description of the robots and their corresponding ports. The most important thing for making multiple robots move is to assign either a separate port number to each robot, or separate robot ids on the same port number.
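As an illustration, the relevant part of such a .cfg file might look like the sketch below (the model names and layout are illustrative, not the actual contents of map_square.cfg): two Stage models are exposed on the same port, 6665, under distinct device indices.

```
# Hypothetical fragment of a Player .cfg file: two robots on one port.
driver
(
  name "stage"
  provides ["6665:position2d:0"]
  model "robot1"
)

driver
(
  name "stage"
  provides ["6665:position2d:1"]
  model "robot2"
)
```

A client then selects a robot by its device index (the -r argument above) while connecting to the shared port.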

5. In order to run another robot, open another terminal tab and type:

    ./main -p 6665 -r 2

   and so on, up to a maximum of 12 robots.

After executing all the above steps, you should get results similar to the ones shown in the screenshots below, and the corresponding robot text files (robot<id>.txt) get updated according to which robot is traversing the environment and its distance from the odor source. The emanating gas is shown in red, while the robots, seen from the top, are circles of various colours.

The files being updated contain the simulation environment's topological map data (in terms of coordinates).

This work has also been implemented on ROS as occupancy grid map merging, wherein robots generate local maps by exploring separate areas of the environment and merge them, based on overlapping regions, into a final global map; this will be discussed in a separate article.
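The overlap-based merge can be sketched in a few lines of Python. This is a toy illustration, not the ROS implementation: it assumes both local grids are already aligned in a common frame and use the ROS nav_msgs/OccupancyGrid cell convention (-1 unknown, 0 free, 100 occupied).

```python
# Toy occupancy-grid merge: prefer known cells over unknown ones, and keep
# the more pessimistic (occupied) value where both robots have information.
UNKNOWN = -1

def merge_cell(a, b):
    """Merge two cell values from overlapping local maps."""
    if a == UNKNOWN:
        return b
    if b == UNKNOWN:
        return a
    return max(a, b)  # never merge an obstacle away

def merge_maps(map_a, map_b):
    """Cell-by-cell merge of two equally sized 2D occupancy grids."""
    return [[merge_cell(a, b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(map_a, map_b)]

if __name__ == '__main__':
    # Robot 1 explored the left half, robot 2 the right; one column overlaps.
    map_a = [[0, 0, 100, -1],
             [0, 0,   0, -1]]
    map_b = [[-1, -1, 100,   0],
             [-1, -1,   0, 100]]
    print(merge_maps(map_a, map_b))  # [[0, 0, 100, 0], [0, 0, 0, 100]]
```

A real merge would first estimate the transform between the two maps from the overlap region; that step is skipped here for clarity.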

This can be implemented on real robots as well, using the player server of the Player/Stage suite.