
Yayathi: I’m mostly focused on the battery-pack system at the moment. The legs development is also happening in parallel, so you’ll see those showing up before too long. We have a prototype already. That’s pretty much absorbing a good chunk of our time at the moment, at least as far as R2 is concerned. Once we move on from that, we’ll probably transition to focus more on EVA development. There are design challenges specific to EVA that need to be tackled. But it’s important to tackle your degrees of complexity in stages. We want to make sure we have a functioning robot that does the things we want it to do, and then take those lessons learned, in addition to the challenges of thermal control, etc., that are required for an EVA robot, and wrap them all into that unit.

NTB: How long have you been working with R2?

Yayathi: I started as a co-op back in early 2006, pretty much the beginning of the Robonaut project. I was fortunate enough to be here from the beginning, right when we were designing that first limb, and it came in stages. We designed and built the arm, and we had an arm just running by itself and a hand. That eventually graduated to doing a whole power system, the head, and the brainstem: the actual main computer that does most of our controls work. We’ve evolved quite a bit since the beginning, definitely with getting the buy-in from station, and being able to upgrade and fly our robot.

NTB: How often is Robonaut tested, and how does that work?

Yayathi: There are different stages. During development, we try to segment and test things in pieces before the robot goes together. That was one of my first major tasks as well during the co-op. I was building a simulator of all the power electronics in the robot body, a hardware simulator (not a software one), so we could plug in a joint or plug in a limb, and do a full checkout on it. That involves plugging it in, hooking it up to our GUI and our software control, and making sure it can do all the motions correctly. Then once we have the robot fully assembled, a lot of the sensing [data] is piped back in through the GUI so we can basically see online if things are going wrong with the robot.

NTB: Are there problems with coordination when you’re adding new functions?

Yayathi: When you have a robot, or any other system, that’s already flown and is up there, we have to integrate these new features with a robot that exists on station. We’re concerned with how we can make it easy for the crew to assemble these new components without being too invasive. There are areas where, sure, if we built a new robot, we might be able to make something more integrated. We have to make sure that we interface with what we have. So that is a challenge sometimes, too, to make sure we have the right workarounds to get these [components] to connect up. It is highly advantageous for us to be able to do that. Since we have a robot up there, we have to utilize the resources that we have. Hardware is not something that you can just replace on the fly. We do spend a lot of time making sure that these systems are going to integrate.

We have a full cert unit on the ground that is identical to the flight Robonaut 2 that’s on ISS. We’re actually in the process of building a full hardware simulator of the robot so that we can plug in and test all these new components that we build. Eventually, once we’re confident that everything is working correctly, we can then take all the hardware and actually integrate it with that cert flight unit. We can do all our checkups with that on the ground, as well as even the assembly procedures. Before anything ever flies, all of that checkout will be done ahead of time, so that when the crew assembles in orbit, it’ll behave just as we expect.

NTB: How much of commercial industry had a hand in the makeup of the R2? Can you describe the types of partnerships with other industries?

Yayathi: We partnered with General Motors Corporation during the design and build of R2. A handful of their engineers embedded themselves in our lab to work alongside the NASA engineers in order to learn from our experience, as well as impart their knowledge of designing for robustness and reliability. Having GM on the team definitely influenced the design of the robot and the types of things we focused on, including our graphical control interface. We also utilized some of their resources and partner labs to develop custom sensors that are now inside the robot. The Space Act Agreement allows us to form commercial partnerships that benefit both NASA and our commercial partners.

NTB: What’s next for R2? When you look ahead to ten years or so, what do you see as some applications for R2 and similar technologies?

Yayathi: There are just so many things that you have to do when you’re EVA in space, that don’t necessarily require a person in the loop. There are a lot of things that do require a person in the loop, and you want to basically minimize the time people spend doing tasks that can be automated by a robot. Let the people do the things that people are good at. In the future, we’re talking about targeting an asteroid and other planetary surfaces, and that’s a great place for a robot like this. Even if it’s not a Robonaut per se and we take 90 percent of the technology in Robonaut, it can still be a robot that goes out and explores dangerous environments, or unknown environments, before a crew member is sent out there.

Human life is precious. If we lose a robot, we’ve lost some money and some parts, but it’s much less valuable than a human being. If we’re exploring a surface and we want to go through some uncharted areas and be able to do a fairly high level of exploration, we can send a robot out there, and then determine that it’s safe and interesting enough for a person to go in after. The robot can also serve as an assistant out in the field as well.

NTB: What’s your favorite part of the job?

Yayathi: My favorite part of this job is honestly the hands-on part. I really like to work with my hands. I like to have tangible hardware. We’re lucky enough to be a part of the entire engineering process and go from design to physical [hardware] sitting in front of you, and it’s pretty rewarding when you spend all this time, thinking and designing things, working with a large team trying to make sure everything’s going to go together just right, and you actually get to have the hardware in your hand and put it together and see it work, and for us, luckily enough to see it fly.

