But another thing that surprised us was the company's emphasis on software. Rethink doesn't want to be just a robot maker. It wants Baxter to be a platform that anyone can use to improve on existing applications as well as develop completely new ones. To achieve that, Rethink needs to open up its technology, and last week the company announced a major step in that direction: a version of Baxter designed for researchers.

Sure, Baxter's primary target is still industrial applications. But Rethink knows that enlisting the robotics community to push the envelope of what its robot can do would give it a major advantage over other robot makers. When Evan and I asked Brooks what he expected other roboticists would do with Baxter, he said he had no idea, and that was a good thing. Brooks recently said he hopes researchers and others will "use their creativity and programming skills to create never before seen applications."

So first let's see what Rethink is offering with its Baxter Research Robot. The hardware is exactly the same as the previous Baxter model. The robot has two 7-DOF arms, powered by series elastic actuators. It has integrated cameras, sonar, and torque sensors on every joint. The base price is also the same: US $22,000. So what's different? Software. The research model doesn't come with the manufacturing software installed, but rather runs the research software development kit (SDK).

The Baxter research version still runs a core software system that is proprietary, not open. But on top of that the company built the SDK layer, based on ROS (Robot Operating System), and this layer is open source. In addition, Rethink has opened some libraries for low-level tasks, such as joint control and positioning.
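To make the "low-level tasks" idea concrete, here is a minimal pure-Python sketch of the kind of joint-positioning primitive such a library exposes: given the arm's current joint angles and a target pose, generate a sequence of intermediate waypoints. The function and joint names below are illustrative assumptions for this article, not the actual Rethink SDK API.

```python
# Illustrative sketch (not the actual Rethink SDK API): a minimal
# joint-positioning primitive. Joint states are dicts mapping joint
# names to angles in radians; the move is broken into small
# linear-interpolation steps that a controller would execute in turn.

def interpolate_joints(current, target, steps):
    """Yield intermediate joint-angle dicts from `current` to `target`."""
    for i in range(1, steps + 1):
        t = i / steps
        yield {name: current[name] + t * (target[name] - current[name])
               for name in target}

# Example: move a two-joint "arm" from its current pose to a target pose.
current = {"s0": 0.0, "s1": -0.5}
target = {"s0": 0.8, "s1": 0.2}

trajectory = list(interpolate_joints(current, target, steps=4))
# The final waypoint coincides with the target pose.
```

In the real SDK the equivalent call would also enforce joint limits and velocity bounds; the point here is only the shape of the interface that researchers program against.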

It's an interesting approach. Rethink is creating something that is not completely open source but not totally closed either. Contrast that with traditional robots, such as those offered by iRobot (cofounded by Brooks), which are closed systems: iRobot's CEO Colin Angle, in fact, argues that open software is not helping robotics from a business point of view. Rethink is taking a different route: It's betting that the best way to succeed is getting others to "hack" Baxter.

And the hacking has already started. To find out more about their plans for Baxter, we reached out to researchers at Worcester Polytechnic Institute, MIT, and Tufts, who already have the robot. Here's what they told us:

We have a Robotics Engineering Advisory Board, including Rethink Robotics VP of manufacturing and operations Jim Daly. Jim and Rod Brooks gave us a sneak preview of Baxter in September 2012 ahead of public release, and expressed interest in having WPI use it for research.

Specifically, WPI plans to use Baxter to modernize an existing industrial robotics course that introduces students to robotics within manufacturing systems. In that course, Baxter will be incorporated as an example of a current manufacturing robotic technology. Additionally, Baxter will be used as an enabling technology for future proposals, including NSF's National Robotics Initiative and Catalyzing Advances in Undergraduate STEM Education program. Importantly, Baxter will also be used by undergraduate and graduate students in research on novel grippers, human-robot interaction, touch sensing, and making Baxter mobile.

And why Baxter? We think the robot is particularly well-suited to research through the Research SDK, which is open-source and based on ROS. Our undergraduate and graduate students learn ROS in their classes and projects, and they are able to quickly develop software for Baxter. Baxter will be housed in the Autonomous Robotic Collaboration (ARC) Lab, next to its new friend, "Archie" the Willow Garage PR2. By the way, some WPI students call the robot "Baxtette."

The robot will be overseen by Dmitry Berenson, assistant professor of Computer Science at WPI. Professor Berenson and his team plan to work toward putting a compliant robotic hand on the robot to study tasks that can be accomplished with limited sensing. What Baxter really does, though, is open up research avenues that enable humans and robots to cooperate in manufacturing tasks. So not only is Baxter useful for manufacturing, it is a powerful and inexpensive research tool that opens the door to some very exciting research possibilities.

John J. Leonard, professor of mechanical and ocean engineering at MIT, head of MIT's Marine Robotics Group, and member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL):

We "recruited" Baxter to assist us with, among other things, 3D object recognition data collection. The idea is to get the Baxter Research Robot to hold an Asus RGB-D sensor (Kinect-like camera) and scan it over a person or object to create a 3D model. We're using software (spatially extended Kinect Fusion) developed by Tom Whelan from the National University of Ireland, Maynooth.

We plan to use the robot to scan many, many objects and people, using an adaptively tuned motion strategy, allowing us to build up a library of objects and people. We will also work on having the robot manipulate objects, and try to "learn" to recognize different objects over time. The robot saves us from the tedium of repeatedly scanning objects, creating a "scanning station" that we can use for visitors and volunteers and allowing graduate students to focus on their research. The video below shows a quick test we put together to show a person and object (bicycle) scan; future scans will be more extensive and have higher resolution.

William C. Messner, professor of mechanical engineering and chair of the department of mechanical engineering at Tufts University:

The primary reasons for choosing Baxter were low cost, safety, and access to the machine's actuators and sensors through the software development kit (SDK), but size, weight, manipulation degrees of freedom, and sensors were considerations, too.

At Tufts School of Engineering, we look at the intersections of robotics, engineering psychology, and cognitive science. So we have big plans for Baxter for research in user interface design, human factors, machine perception, and education. Student Chris Smith (pictured below) has already used the SDK to develop a LabVIEW-based interface for controlling Baxter from a laptop and for viewing and recording images from the robot's five cameras.

Our Human Factors Engineering program, led by professor Dan Hannon, will explore interactions between humans and robots working in close proximity, examining how to allocate tasks between human and machine for the best overall performance in jobs such as sorting and opening medical laboratory specimen containers. In the near future professor Chris Rogers will be working to have Baxter accurately identify and pick individual items from jumbled piles of diverse, complex parts, like the thousands of LEGO pieces in the school's Center for Engineering Education and Outreach (CEEO), where Baxter currently resides. In the longer term, we plan to extend our work on interfaces to the visual activity learning currently under development by professor Matthias Scheutz, director of Tufts' Cognitive Science program.

Robots like Baxter capture the imagination of students of all ages, and Baxter is well-suited for education because of the intrinsic safety of its series-elastic actuators and its relative portability. Education for grade-school-age students is one of the big motivations for developing interfaces to augment those already on the robot, and education is the reason we chose to situate Baxter in the CEEO. Some of the projects on which we plan to have students work are opening doors, unscrewing jars, and manipulating objects suspended by gantry cranes. One somewhat far-fetched idea is mounting Baxter on an electric wheelchair and having it drive by manipulating the joystick.

Robots like Baxter can serve as entertainers, and programming entertainment tasks could be effective educational and research experiences. Some examples are manipulating paper and pencil to play tic-tac-toe, checkers, chess, or practically any board game. Performance art is another application: robotic sculptor, robotic painter, or, if we get a second machine, a robot-robot dance duo. For my own part, I would like to train Baxter to use a slide rule, the ultimate closure of the computation circle!