Please accept our profound apologies for not posting for the last few days: we've been working very very hard (we promise!) at RoboBusiness 2013, and next week we'll have some good stuff for you from the show. If you pay close attention, you may even see a few videos sneak in today, too. After next week comes the IEEE International Conference on Intelligent Robots and Systems (IROS), when our posting schedule will probably get thrown out the window (in a good way!), as we bring you the latest robotics research live from Tokyo. But we're not there yet: first, we have to make it through two more Video Fridays.

Announced at RoboBusiness this week was the launch of a Kickstarter for the RoboKind Zeno R25. We've written about the R50 before, and the R25 shares many of the same features, except it costs less. A lot less. We're talking $2,700 instead of $15,000, for a small walking humanoid robot with sophisticated voice recognition and a face capable of making human-like (ish) expressions:

The R25 needs just $50,000 on Kickstarter to meet its goal, and we're certainly wishing RoboKind success.

In other announcements, iRobot has started giving its cute little FirstLook robots some payloads, including a nifty little arm that can pick up 1.4 kilograms (3 pounds). Not bad for a robot that weighs a total of just 2.3 kilograms (5 pounds):

And yes, we did notice that when the FirstLook was thrown through the window, it wasn't carrying anything. Hmm.

SLIP-based human-inspired control applied to the compliant, underactuated bipedal robot ATRIAS. Using bio-inspired virtual constraints and canonical walking functions, the full-order robot can be reduced to a low-order model. Walking gaits are generated through the use of a novel multi-domain optimization that leverages this low-dimensional representation. These gaits are verified in simulation and implemented on ATRIAS. The end result is remarkably natural-looking walking.

And now, AMBER 2:

Demonstration of human-like multi-contact locomotion on the bipedal robot AMBER. In particular, as inspired by human locomotion, the robot demonstrates three phases of walking throughout the walking gait, characterized by changing contact points at the heel and toe. Furthermore, these changing contact points result in three different types of actuation throughout the walking gait: full actuation, underactuation, and overactuation. The end result is human-like locomotion on the robot.

UAVs are notoriously hard to control, but putting them on a leash makes them a heck of a lot easier to manage:

We should have more on this brilliant idea from Sergei Lupashin at IROS.

ASK NAO (Autism Solution for Kids) was created by Aldebaran Robotics to customize NAO, our humanoid robot, in order to support teachers with in-class tasks and help children with autism reach new levels of greatness.

This initiative was developed after noticing that many children with autism seem instinctively attracted to technology, allowing NAO to become the perfect bridge between technology and our human social world.

ASK NAO clears the path for a revolution in thinking, driven by those who are most familiar with autism and technology. Together with NAO, we can shape the special education world of tomorrow for the benefit of these children.

To accomplish this, Aldebaran Robotics is creating a multi-sided community made up of developers, therapists, researchers, teachers, parents, enthusiasts, and the Aldebaran team, all collaborating to help children surpass their limits!

Last week, DARPA hosted a public presentation day for all the Track A and Track B/C teams that'll be competing in the DRC this December. Not all of those presentations have been put online (yet), but here's one from Team DRC-Hubo: