Tuesday 15 November 2011

ECOLOGY BASED ROBOTICS - A SNEAK PEEK

I come home and find the automatic lawn mower mowing my lawn, as it is supposed to do when the grass grows taller than 1.5 inches. At the door the face recognition system detects that it is me and opens the door. My personal robot comes along and says, "Good evening, tea and cookies will it be?", to which I smile and acknowledge. I enter my drawing room and the air conditioner senses my presence and starts cooling to 22 degrees Celsius - as per my preference. Before long my personal robot brings me my cup of tea and cookies, serves me and politely adds, "Just to remind you, you have dinner with Mr. Smith; given the traffic and the distance, a good time to start would be 7:33 PM".
This may not be too long into the future. A world dominated by automation and robotics may be just a couple of decades away. Since the 80s, researchers have realised the importance of applied AI; away from the jargon-laden ivory towers, AI has made its way into robots and intelligent machines which promise to bring the Clarkian world of science fiction to life.
Fig.1 Rosey the Robot Maid, from The Jetsons - an example of a personal robot

ISSUES WITH BEHAVIOUR BASED APPROACHES

Brooks and Arkin enunciated the behaviour based approach: give the robot its own sensory system so that it can detect and respond to the environment, coupled with a hierarchy of control laws working in tandem. If a higher level of control fails, the robot can 'subsume' to a lower level of the hierarchy, thus preventing complete system failure. Added to this was motivation drawn from anthropomorphism and from animals and insects, which has always attracted the enthusiasm of roboticists. Behaviour based approaches led to a blending of behaviours, so that the cumulative reactive response of the robot emerges rather than being explicitly programmed. This hunger for playing 'God' and creating intelligence which reacts to external stimuli with a concerted mechanical response has been an ongoing effort for the last three decades.

An obvious problem with the Brooksian philosophy, which Brooks acknowledges to some extent in his paper (1991), is that the environment as perceived by the robot is only what it appears to be to its sensors. Thus a low-lying mobile robot (viz. a Roomba) whose sensors have an angular span of 30 degrees will see a chair as four metallic rods sticking out of the floor, and will have more of a 2D perception than a true 3D one. A sensor also has a finite range, so any world view is built up incrementally, at times very slowly, and this time lag may impair the robot's reactions. Maps may help to some extent, but the real world is dynamic and thus not really 'mappable'.

Behaviour based approaches denounce analytical modelling (viz. the blocks world etc.), yet sensor based approaches clearly have issues of their own. As Brooks puts it, with a hint of sarcasm;
When we examine very simple level intelligence we find that explicit representations and models of the world simply get in the way. It turns out to be better to use the world as its own model.
I AND THE WORLD - THE ECOLOGICAL APPROACH

The idea of modelling the world and the agent as a single unit is probably most appealing to a software designer; it was discussed by Saffiotti (1998) in his doctoral research.

This philosophy - that the world embodies the agent in itself, a ubiquitous point of view - is probably the starting point of Ecology based robotics. Extending the idea leads to more potent conceptions: a number of interacting robots and devices, all working in tandem and in explicit cooperation. The idea is motivated by the biological notion of Ecology, where each creature's doings have a bearing on every other creature in the Ecology.

This approach is said to be the third revolution in robotics, the first being the industrial robot and the second being mobile and personal robots.

One of the earliest proponents of Ecology based robotics was Duchon (1994), whose work was an extension of Gibson's pioneering 'Ecological Approach to Visual Perception'. Duchon points to some basic principles:
  1. Because of their inseparability, the agent and the environment together are treated as a system.
  2. The agent's behaviour emerges out of the dynamics of this system.
  3. Based on the direct relationship between perception and action, the task of the agent is to map available information to the control parameters at its disposal to achieve a desired state of the system.
  4. The environment provides enough information to make adaptive behaviour possible.
  5. Because the agent is in the environment, the environment need not be in the agent. That is, no central model is needed, but this does leave room for task-specific memory and learning.  
However, Duchon's work was limited to visual perception and, being research from the 90s, it lacked the new-age technologies which came to the fore over the next two decades. More in tune with the current day, Arkin addresses Ecology (viz. Ecological Psychology) throughout his works, while Saffiotti and his team have developed control software, PEIS (Physically Embedded Intelligent Systems), to implement Ecology based robotics.
 
Fig.2 Laying of the 'PEIS floor', the floor is networked using RFID chips


Fig.3 The pedagogy leading to a ubiquitous point of view, modified from the works of Saffiotti and Broxvall

Thus, to progress into the realm of 'i-robot', 'Rosey the Robot Maid' and R2D2 - a pragmatic society in which robots and automation work in tandem to support human civilisation - we probably need to route it via Ecological Approaches.

All philosophies have shortcomings. As a criticism of Ecology based robotics: it can never be fully realised for sufficiently large environments - all sensors have a physical limitation of range - so a central model of some sort will always be needed to bridge the gap between the pristine theory and practical applications.

Tuesday 8 November 2011

UPCOMING POSTS

Six upcoming articles I have planned for the blog over the next three months:

'3 generations - shakey, flakey and erratic' - A brief discussion on the Saphira Architecture which has been responsible for these three very special mobile robots.

'Uncanny indeed' - A discussion on the uncanny valley hypothesis by Masahiro Mori

'Ecology based robotics' - A peek into Ecology based robotics

'GSSP' - Tutorial and discussion on Graphical State Space Programming

'Behave!' - A study in contrast: the various definitions of 'behaviour' in mobile robotics


ROS@HacDC - It may be fun to review HacDC Robotics Class 2011 

MAPS FROM STAGE SIMULATIONS IN ROS : PART DEUX

SLAM USING TELEOPERATION IN STAGEROS

In my previous article, I discussed realising SLAM using Stage's wander controller. Here I discuss an alternative method in which the robot is driven around by a human operator, again realising SLAM.

This can be done in ROS using stage/stageros and teleoperation.

#.1 - Start an instance of the master in a terminal window

roscore
 
#.2 - Start the stage/stageros simulation in a new terminal window 

rosrun stage stageros /path/to/world_file.world

This world file must not employ the wander controller; the idea is to drive the robot around, not let it wander.
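A quick way to check is to search the world file for the controller line (this assumes the controller is attached via a ctrl "wander" entry, as is the usual convention in Stage world files; the path is a placeholder):

grep -n "ctrl" /path/to/world_file.world

If this turns up a ctrl "wander" line inside the robot's position block, delete it or comment it out before starting stageros.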


#.3 - Start keyboard teleoperation in a new terminal window

rosrun teleop_base teleop_base_keyboard base_controller/command:=cmd_vel
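To confirm that the key presses are actually reaching the simulator, it helps to peek at the velocity topic in yet another terminal window (this assumes the default single-robot topic names used by stageros, i.e. cmd_vel, odom and base_scan):

rostopic echo cmd_vel

Each key press in the teleop window should produce a geometry_msgs/Twist message here; if nothing shows up, the remapping in step #.3 is the first thing to re-check.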


#.4 - Start gmapping in a new terminal window 

rosrun gmapping slam_gmapping scan:=base_scan
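The scan:=base_scan remapping points slam_gmapping at the laser topic published by stageros. Before moving on, it is worth checking that laser data is flowing and, optionally, watching the map being built live; both use stock ROS tools and are not specific to this setup:

rostopic hz base_scan
rosrun rviz rviz

In rviz, adding a Map display on the /map topic (with the fixed frame set to /map) shows the occupancy grid grow as the robot is driven around.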

#.5 - Save the map using map_saver (from the map_server package) in a new terminal window

rosrun map_server map_saver

The map will be saved as a .pgm file (along with a .yaml metadata file) in the directory from which the map_saver command is issued.
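map_saver also accepts a -f option to choose the output name, which is handy to avoid overwriting map.pgm/map.yaml on repeated runs; teleop_map below is just an illustrative name:

rosrun map_server map_saver -f teleop_map

This writes teleop_map.pgm and teleop_map.yaml to the current directory; the .yaml file holds the resolution and origin needed when the map is loaded back with map_server later.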

Drive the robot around using the u, i, k, l keys etc. and check the map as it starts to develop. Driving it around the whole environment a few times should give a sufficiently good map.


The rxgraph for this simulation is as shown below:
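For anyone reproducing the simulation, the same graph can be generated locally with the rxgraph tool that ships with Diamondback; it shows the running nodes (stageros, the teleop node, slam_gmapping) and the topics connecting them:

rxgraph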

The pros and cons, in contrast with the previous method, can be seen:

NOTE: Simulations done in ROS 1.4.10 (Diamondback)