Category: Robotics

BEAR (Battlefield Extraction-Assist Robot) is a prototype from Vecna Robotics [1] designed to find, pick up and rescue people in harm’s way. The BEAR can lift heavy loads (up to 135 kg) and carry them over long distances. It is intended to carry out rescue tasks in places humans cannot access without risking additional human lives.

BEAR is not an autonomous robot, but is wirelessly controlled by a human operator. However, Vecna will implement autonomous behaviors in order to make it easier to control. It is currently in a proof-of-concept development stage: the prototype has demonstrated picking up a fully-weighted human dummy.

Former prototypes were based on a two-wheeled Segway self-balancing system (as can be seen in the first video below). However, the current prototype (shown in the picture and the second video below) is based on a set of tracks. Using this system, the robot can travel in a low-profile ‘centaur’ posture and even while kneeling.

The project is scheduled to finish in 5 years. Then this robot will be ready for field use by the US Army Telemedicine and Advanced Technology Research Center [2].

Do robots dream of God?

This is a controversial question raised many times in Hollywood movies. In recent years, however, scientists and theologians have been looking at the possible implications of Strong Artificial Intelligence.

The possibility of a future in which humans and robots live together in a new technological society is bringing together science and religion. Anne Foerst, a theologian and research scientist, seeks to bridge the gap between religion and AI research. In her book God in the Machine [1], she argues that robots have much to teach us about ourselves and our relationship with God. Back in 1993 Foerst was working with Rodney Brooks’s team at the MIT AI Lab. They identified a set of questions that are intrinsically related to the two fields of robotics and religion: Can a robot be human? What does it mean to be human? Are we made in the image of God?

MIT’s Cog robot was designed to learn from physical and social interaction. It was programmed to show social and emotional responses. According to Foerst, personhood is just playing a role in a mutual process of telling stories. Even though robots like Cog can’t tell their own story, they can at least play a role in our lives so that we can include them in our narrative process [2].

In her book, Foerst establishes a relationship between robot building and God. In her opinion, when we try to build humanoid robots in our image, we realize the complexity of humans, and our admiration for God’s creation grows [3]. The goal of building (conscious) robots is to find out how we work, learning more about who we are and what makes us human, says Foerst [the adjective ‘conscious’ is mine]. From her perspective, ancient golem builders can be seen as the ancestors of contemporary AI researchers.

Another interesting claim made by Foerst is that building machines that can think doesn’t mean building human-like machines. It is not the individual thinker, but the person who forms a community with other people, that makes us human. Indeed, humans are social animals, and I would say that consciousness is very much related to social behavior. However, Foerst doesn’t mention the term consciousness in this interview [3].

If we are able to build robots with social intelligence, able to participate in and help build our society, then we will have to face lots of new problems. See The safe performance of robots for a discussion on the matter.

It was only a matter of time before Microsoft Robotics Studio had a competitor. Last month Gostai released URBI 1.0 RC2. Release candidate 1 came out in the same month (December 2006) that Microsoft released version 1.0 of MSRS. URBI is available under the GPL license. (MSRS is also freely available under a non-commercial license.)

URBI is very much in the same vein as MSRS. The key idea of these sorts of products is to offer a general programming framework for robotics, and this is quite new: until now we were used to software simulators or programming libraries for specific robots or robot families. The great advantage of a product like this is that you can focus on your robotic application without bothering too much about concurrency, asynchronous I/O, and distributed processing. Additionally, a robot controller written using these new tools should be easily portable from one platform to another (given analogous sensors and actuators).

URBI is based on a script language which offers features like parallelism, event-driven programming, and distributed object management. Multiple programming languages can be used, as the URBI functionality is enclosed in the liburbi library, which can be integrated with languages like C++, Java, Matlab, Python, etc.
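Independently of the liburbi bindings, the URBI server itself accepts textual script commands over a plain TCP connection, which gives a feel for how client code drives a robot. The sketch below is a minimal, hypothetical Python client: the port number, host address and device/method names (`camera`, `headPan`) are assumptions for illustration, not part of any official API, so check your robot’s documentation before relying on them.

```python
import socket

URBI_PORT = 54000  # assumed default URBI server port; check your robot's setup


def format_command(device, method, *args):
    """Build a textual URBI-style command such as 'camera.trackPosition(ball);'."""
    arglist = ", ".join(str(a) for a in args)
    return "%s.%s(%s);" % (device, method, arglist)


def send_command(host, command, port=URBI_PORT):
    """Send one raw command to an URBI server over TCP and return its reply."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(command.encode("ascii") + b"\n")
        return sock.recv(4096).decode("ascii", errors="replace")


# Example usage (requires a reachable URBI server, e.g. an Aibo running URBI):
# reply = send_command("192.168.0.10", format_command("headPan", "val", 15))
```

In a real project you would of course use liburbi for your language instead of raw sockets, but the text-over-TCP model is what makes the multi-language bindings possible.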

URBI allows you to import C++ objects and even use them remotely in a transparent way. Remote objects can be executed on Windows, Linux or Mac OS X. This is called the UObject architecture.

Dealing with parallelism and events is common and challenging in robotics. URBI integrates the management of these programming aspects into its language: constructs like ‘whenever’ simplify the task of parallel, event-driven programming. For instance:

whenever (ball.visible)
{
  camera.trackPosition(ball);
};
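For readers without an URBI server at hand, the effect of `whenever` — re-triggering an action for as long as its condition holds — can be imitated in ordinary Python with a background polling thread. This is only a rough sketch of the semantics (URBI’s engine is genuinely parallel and event-driven, not polling-based), with the ball sensor stubbed out by an event flag:

```python
import threading
import time


def whenever(condition, action, period=0.05, stop=None):
    """Run `action` each time `condition()` holds, roughly like URBI's `whenever`."""
    def loop():
        while stop is None or not stop.is_set():
            if condition():
                action()
            time.sleep(period)
    thread = threading.Thread(target=loop, daemon=True)
    thread.start()
    return thread


# Stubbed example: the action fires repeatedly while the "ball" stays visible.
ball_visible = threading.Event()
stop = threading.Event()
whenever(ball_visible.is_set, lambda: print("tracking ball"), stop=stop)
ball_visible.set()    # ball comes into view -> tracking starts firing
time.sleep(0.2)
stop.set()            # shut the loop down
```

The point of URBI is precisely that you get this behavior from a one-line language construct instead of managing threads and flags yourself.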

Other advanced features, such as abstractions for parallel programming, have been included in the semantics of URBI. For instance, conflicting simultaneous operations can be handled as specified by the programmer.

One key aspect of a platform like this is the availability of code and the contributions of a community. Urbiforge (http://www.urbiforge.com/) is the site where this software and related contributions can be downloaded; tutorials and forums are also available on the same site. Interesting libraries include scripts for Aibo, Lego NXT, etc.

Robots and Human Safety

It seems that some governments are taking the possibility of the everyday use of robots in society very seriously. Japan and South Korea are worried about human safety in a world where many critical tasks can be performed by machines. They share Isaac Asimov’s vision of a future world where human beings and robots coexist. However, they don’t seem to trust mechanical creatures controlled by only three simple laws.

As reported by The Times a few weeks ago, Japanese robotics experts warn that the famous Three Laws are not enough to keep us safe when next-generation robots become a reality. A 60-page draft document titled ‘Guidelines to Secure the Safe Performance of Next Generation Robots’ is being discussed by industry, researchers, and lawyers with the aim of elaborating a law that protects us effectively.

This draft document proposes the creation of a central database where all incidents in which humans are harmed by robots would be recorded and made accessible to robot manufacturers. Robots must therefore be equipped with mechanisms to log and communicate any injuries they cause to people while carrying out their tasks. Japan envisions a near future where robots play a key role in society, and it has detected the need for well-built regulation. The domestic robot market in Japan is foreseen to grow to more than 3.3 trillion yen in the next 15 years. Assistant robots able to help and chat with pensioners are already a reality. Nursing robots, security patrol robots, and home assistant robots are going to be common in the coming years.

South Korea is also working on an ethical code for robots. The designated committee is to establish a code of conduct for life with intelligent robots. South Korea also sees Asimov’s Three Laws as insufficient. Identifying robot units, preventing their illegal use, and assuring data privacy are other aspects that need to be taken into account.

It seems that the dawn of the age of robots is coming as predicted by Bill Gates [1]; however, it shouldn’t be seen as dangerous or problematic, as it won’t be as shocking and fast as predicted. In fact, it has already been happening for many years. Don’t you think so? Look at these figures:

Forecast for 2006-2009: sales of all types of domestic robots (vacuum cleaning, lawn-mowing, window cleaning and other types) in the period 2006-2009 could reach some 3.9 million units.
The market for entertainment and leisure robots, which includes toy robots, is forecast at about 1.6 million units, most of which, of course, are very low cost [4].

The Pioneer 3 DX is a robot base platform by MobileRobots Inc. (ActivMedia Robotics). This wheeled robot has been updated in its P3-DX8 version to carry loads more robustly and improve autonomy.

Although the P3-DX offers an embedded computer option, it can also be equipped with an onboard regular laptop thanks to its 23 kg payload capacity. Obviously, the embedded computer is a more robust and elegant solution; however, it is more expensive as well. The main drawback of using an onboard laptop is the loss of the load surface where other sensor or actuator options could be installed (see image below).

Like other MobileRobots platforms, the P3-DX8 is based on a core client-server model which provides a set of libraries and utilities for intelligent applications (the robots act as the servers). I will not go into deeper detail about this client-server architecture, as my focus is on using Microsoft Robotics Studio to control these robots. We will visit this question again when we talk about software development for this robotic platform.

Salient hardware features of this unit are: Ethernet-based communications (optional), laser (optional), up to 252 watt-hours of hot-swappable batteries (which, by the way, account for a great part of the total unit weight), a ring of 8 forward sonars, a ring of 8 rear sonars (optional), two independent motors, two 19 cm wheels and one caster wheel. Maximum speed is 1.6 m/s. Other interesting options are bumpers, grippers, vision, stereo rangefinders, compass, etc.

New Pioneer 3 robots have a 32-bit Renesas SH2-7144 RISC microprocessor, included in the P3-SH microcontroller with ARCOS, the Advanced Robot Control and Operations Software client-server interface. If you want to develop your own control software application, you need to talk to the ARCOS interface. I am not particularly interested in using a specific interface/platform like ARCOS, but would rather use a common development platform like MSRS. Of course, ARCOS will always be called by the MSRS services, so in the end robot control goes through the same underlying native API.

Another higher-level software component that comes with every MobileRobots platform is the Advanced Robotics Interface for Applications (ARIA). ARIA is a C++ based development environment that also provides TCP/IP communications with the robotic platform. Typical applications available through ARIA are mapping, teleoperation, monitoring, etc. As I said before, I won’t discuss ARCOS and ARIA further, as my focus is on MSRS.

The following video shows a P3-DX wandering around autonomously avoiding collisions:

I-Cybie Review: The I-Cybie robotic pet is manufactured by Silverlit Electronics and commercialized worldwide by several toy vendors. It comes in three colors: blue, gold and transparent. This pet robot is quite close to a Sony Aibo in terms of form factor, usage and features. The robotic dog is made of 1,400 parts and 90 feet of wire. It has an onboard computer and responds to its environment with canine-like moods; for example, he becomes sad if you ignore him… The robot is endowed with voice recognition, behavior development, sound sensors, obstacle detection, dynamic drive, orientation sensors and touch sensors. Communication with I-Cybie is also possible using an IR remote, clap commands, and voice commands.

I-Cybie is able to wander around without trouble. He can detect obstacles and avoid them with no problem. Even if you make him fall down, he can recover gracefully: his sensors detect his position, so he knows when and how to move to stand up again.

The robot has four main emotional states: happy, hyper, sad, and sleepy. Depending on his mood, the dog shows different behavior and decides whether or not to obey… The emotional state of the dog is determined by your interaction with the pet and by the environment. For example, a quiet and dark room can make I-Cybie feel sad.

I-Cybie can also do some tricks for your entertainment and amusement. For example, he can give a paw, dance, or scratch his ear. He can also be trained to recognise your voice, so you can tell him to do a trick if you want.

A walk-up charger is also available. If you have it, I-Cybie can autonomously look for the wall charger when he feels tired (battery low) and recharge by himself.

Programming the robot is also possible. However, a hardware hack, the so-called Super I-Cybie upgrade, is required in order to program I-Cybie in the C language. An SDK for development is also available. See the following links for details on programming and the SDK:

The January 2007 issue of Scientific American features an article by Bill Gates titled A Robot in Every Home [1]. Is the domestic robotics industry going to reach critical mass in the short term? Robotics applications in manufacturing are a reality; however, practical application of robotics in the residential market is another story. In his article, Bill Gates discusses the challenges of this domain and stresses the need for a standard framework (although he doesn’t mention it initially, he is obviously referring to the newly released Microsoft Robotics Studio).

Gates’s vision of robotics is based on an evolution of the PC: from personal computers in every home to personal robots in every home. It is like endowing current PCs with the features of typical science-fiction robots. But is this likely to happen in the short term? Is Microsoft powerful enough to drive such a change in the market? Do we actually have the required technology? I wouldn’t answer these questions yet, but I’d say that the time of NS-5-type robots hasn’t come.

Polymorphic robotics, along with self-adaptive, self-organizing, and generally self-* properties of robots, is usually related to the field of machine consciousness. Basically, some degree of self-consciousness is required for a robot to self-repair or self-heal. The SASO 2007 conference covers this discipline (First IEEE International Conference on Self-Adaptive and Self-Organizing Systems, Boston, Mass., USA, July 9-11, 2007) [1].

Related topics are self-organization, self-adaptiveness, self-management, self-monitoring, self-tuning, self-repair, and self-configuration. There is no doubt that integrating these kinds of techniques with higher cognitive models requires a machine-consciousness-like model.

Some examples of polymorphic robots are available from the Polymorphic Robotics Laboratory at the University of Southern California [2].

Conscious-Robots.com is an Internet portal dedicated to scientific research in Machine Consciousness. This field of artificial intelligence is closely related to cognitive robotics, and the following terms are often used as synonyms: Artificial Consciousness, Synthetic Consciousness and Robot Consciousness. Although much more detailed definitions can be found in these web pages, we could briefly define Machine Consciousness as research on producing consciousness in an artificial device (like a robot) using engineering techniques. Understanding human consciousness is a great challenge; hence the Machine Consciousness problem is even harder. There is no doubt that these problems have to be addressed from multiple disciplines. This site aims to follow this multidisciplinary approach, including information and resources from many fields, from philosophy to genetic programming.

Conscious-Robots.com offers the following content:

Resources, publications and information about Machine Consciousness research and related AI techniques.

Latest news and reviews about conscious software, conscious machines and robots used for research in machine consciousness.