Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):

My question now is whether all the robots are going to be called “Cassie,” or whether each will (eventually) be renamed when it arrives at its destination. My other question now is whether the first Cassie was “000” or “001,” and also why don’t they think they’ll be making more than a thousand Cassies, because that seems pessimistic.

Yes, that’s right, if you get a Kuri, you are guaranteed a pancake dress-up dance party. That song in the background, I was sad to find out, was not written or performed by anyone on the Mayfield executive team (which is what I had assumed). Here it is, in full:

And if you want an actual pancake robot, this is one of the most satisfying robot videos of all time, if you skip ahead to about 1:10 for the pancake pick n’ stack:

Okay, that’s enough robot-pancake YouTube rabbit hole for one Friday. OR IS IT???

Metallica’s European WorldWired tour, which opened to an ecstatic crowd of 15,000 in Copenhagen’s sold-out Royal Arena this Saturday, features a swarm of micro drones flying above the band. Shortly after the band breaks into their hit single “Moth Into Flame”, dozens of micro drones start emerging from the stage, forming a large rotating circle above the stage. As the music builds, more and more drones emerge and join the formation, creating increasingly complex patterns, culminating in a choreography of three interlocking rings that rotate in position.

This show’s debut marks the world’s first autonomous drone swarm performance in a major touring act. Unlike previous drone shows, this performance features indoor drones, flying above performers and right next to throngs of concert viewers in a live event setting. Flying immediately next to audiences creates a more intimate effect than outdoor drone shows. The same closeness also allows the creation of moving, three-dimensional sculptures like the ones seen in the video — an effect further enhanced by Metallica’s 360-degree stage setup, with concert viewers on all sides.

Many elements go into a touring drone show; the drones themselves are just one part. Verity’s drones are autonomous but supervised by a human operator, who does not control the drones’ motions individually. Instead, the operator issues only high-level commands such as “takeoff” or “land,” monitors the motions of multiple drones at a time, and reacts to anomalies. In other words, Verity’s automation system takes over the role of the multiple human pilots that standard, remote-controlled drones would require. The drones are mobile flying robots that pilot themselves under human supervision, and their motions and lighting design are choreographed by Verity’s creative staff.
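The supervisory pattern described above can be sketched in a few lines. This is purely illustrative; the class and method names are my own invention, not Verity’s actual software. The key idea is that the operator addresses the fleet, never an individual drone’s flight controls:

```python
class Drone:
    """Each drone pilots itself; the operator never steers it directly."""

    def __init__(self, drone_id):
        self.drone_id = drone_id
        self.state = "idle"

    def handle(self, command):
        # The drone maps a high-level command to its own autonomous behavior.
        transitions = {"takeoff": "flying", "land": "idle"}
        if command in transitions:
            self.state = transitions[command]


class FleetSupervisor:
    """One operator broadcasts fleet-level commands and watches for anomalies."""

    def __init__(self, drones):
        self.drones = drones

    def command(self, command):
        # "takeoff" or "land" goes to every drone at once.
        for drone in self.drones:
            drone.handle(command)

    def airborne(self):
        # Lets the operator spot anomalies, e.g. a drone still
        # flying after a fleet-wide "land" command.
        return [d.drone_id for d in self.drones if d.state == "flying"]
```

A single `command("land")` call replaces what would otherwise be one manual landing per pilot per drone.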

To navigate autonomously, drones need a reliable way to determine their position in space. While drones can use GPS outdoors, it is not a viable option indoors: GPS signals degrade near large structures such as tall buildings and are usually unavailable, or severely degraded, inside. Since degraded GPS can make autonomous flight unreliable or unsafe, the Verity drones use proprietary indoor localization technology instead.
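Verity’s localization method is proprietary, but one common family of indoor-positioning techniques works by measuring ranges to fixed anchors (ultra-wideband radio beacons, for example) and solving for position. A minimal 2D sketch of that general idea, not Verity’s system:

```python
def trilaterate_2d(anchors, ranges):
    """Estimate (x, y) from three known anchor positions and measured distances.

    Subtracting the first range equation from the other two cancels the
    quadratic terms, leaving a 2x2 linear system solved here with
    Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    # Row i: 2(x_i - x1)*x + 2(y_i - y1)*y = r1^2 - r_i^2 + x_i^2 - x1^2 + y_i^2 - y1^2
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

Real systems fuse many noisy range measurements over time (typically with a Kalman filter) rather than solving a single exact system, but the geometry is the same.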

Roboticist Cynthia Sung has been doing origami since she was young. She says she was always excited by the possibility of taking something two-dimensional and folding it into a three-dimensional object that can move. Through origami, an ordinary, flat sheet of paper can become a talking frog or a crane that can flap its wings.

But Sung envisions origami that is even more complex and functional than 3D shapes and moving animals. Through origami-inspired engineering, she hopes not only to create rapidly fabricable robots capable of completing tasks, but also to build intuitive design software that lets people without engineering training create their own personalized origami robots.

Because they can be designed and built so quickly, origami robots are useful in situations that require rapidly deployable robots, such as search-and-rescue operations.

The robots are also ideal for situations that require compact storage and transport: a robot can be folded into a compact shape or unfolded into its flat form and transported with minimal packaging. This makes origami robots well suited to space applications, where researchers might want to deploy a robot without it being too heavy or taking up too much room at launch.

Origami robots will also be useful for everyday tasks. For instance, Sung says, if a car mechanic drops a tool somewhere that is difficult to reach and doesn’t have the resources to get into a small space, this design software will enable the mechanic to quickly fabricate the tool needed to retrieve the object.

Jerboas are super cute, despite that perma-freaked-out look. Unfortunately, there’s a direct correlation between cuteness and edibility, which makes jerboas all kinds of skittish. As it turns out, they’re very deliberately skittish, constantly changing up their movements to avoid predators, which could have implications for making robots more versatile and lifelike:

The findings may have applications in the field of robotics. Much of the work on robot locomotion involves smooth and predictable motion in low-variability environments. Incorporating unpredictable and variable motion should improve the performance of robots that mimic living organisms and that travel on the same terrain.
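What “incorporating unpredictable motion” might look like in a controller can be sketched simply. This toy example is mine, not from the paper: each step heads roughly toward a goal but with a large random heading perturbation, so the path makes progress while staying hard for a pursuer to anticipate, loosely like a jerboa’s erratic hops:

```python
import math
import random


def erratic_step(position, goal, rng, noise_deg=90.0, step=1.0):
    """Take one unit step toward `goal` with a random heading perturbation.

    The perturbation is drawn uniformly from [-noise_deg, +noise_deg],
    so individual steps are unpredictable but net motion still trends
    toward the goal.
    """
    gx, gy = goal[0] - position[0], goal[1] - position[1]
    heading = math.atan2(gy, gx) + math.radians(rng.uniform(-noise_deg, noise_deg))
    return (position[0] + step * math.cos(heading),
            position[1] + step * math.sin(heading))
```

With ±90° of noise the expected progress per step is still positive (the along-goal component is cos of the perturbation, which is never negative), so the robot converges on its target while its instantaneous direction remains effectively random.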

"This work points at a new way of building a robot that may be useful in navigating desert environments or desert planets," said Ram Vasudevan, U-M assistant professor of mechanical engineering and a co-author of the Nature Communications paper. "We want to try and build robots that are capable of thriving in all sorts of environments." He recalled a time NASA’s wheeled Curiosity rover got stuck in the Martian desert. "It dug a deeper and deeper hole," he said. "What if they had taken a look at how these rodents travel through the desert? Evolution has found a suitable solution several times over. Until now, we haven’t appreciated how useful a system it is in this context."

There are a lot of walking robot kits out there, but most of them are complicated and challenging for beginners. We decided to create a walking robot that was good for beginners: something that people could experiment with and learn from without worrying about endangering the robot. So we created the Critter Crawling robot kit. The Critter is entirely 3D printed and based on the Arduino, so anyone, anywhere can find the resources to create him.

Critter is inspired by mudskippers, which is an amazing thing to be inspired by, and you can pledge for one on Kickstarter for $57 if you’re quick, or $65 if you’re not, and the price is a bit lower if you’re willing to 3D print your own parts.

For flexible transport solutions, the IPA experts have developed »Cloud Navigation«. As an example, the video shows two mobile, self-navigating systems in a production environment. Because the automated guided vehicles (AGVs), or whole AGV fleets in an industrial context, as well as stationary sensors all supply their locally perceived environment data to a central server, the entire fleet benefits from more accurate localization and more efficient cooperative path planning. Computationally intensive algorithms are outsourced to Virtual Fort Knox, a federative cloud platform for manufacturing companies. The AGVs thus act as »lean clients«: they require less onboard hardware while retaining a high degree of navigation intelligence.
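The »lean client« idea can be sketched as follows. The names here are illustrative, not Fraunhofer IPA’s actual interfaces: each AGV (or stationary sensor) uploads the obstacle cells it perceives locally, and a central server merges them into one shared map that the whole fleet queries for planning:

```python
class MapServer:
    """Central server that fuses observations from all AGVs and sensors."""

    def __init__(self):
        self.occupied = set()  # shared grid cells known to be blocked

    def report(self, agv_id, observed_cells):
        # A cell seen as blocked by any client is blocked for the fleet.
        self.occupied |= set(observed_cells)

    def is_free(self, cell):
        return cell not in self.occupied


class LeanAGV:
    """Onboard logic stays thin: sense locally, ask the server about the rest."""

    def __init__(self, agv_id, server):
        self.agv_id = agv_id
        self.server = server

    def sense_and_report(self, observed_cells):
        self.server.report(self.agv_id, observed_cells)
```

Each vehicle then plans against the fused map, so an obstacle observed by one AGV is immediately avoided by all of them, without any vehicle carrying the hardware to maintain the full map itself.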

NASA and ESA plan to send another rover to Mars in 2020. SpaceX wants to send one million people to Mars in the next 100 years. However, before anyone sends a rover to another planet, we designed Turtle, a robot to remind you how beautiful the Earth is.

On Kickstarter, you’re looking at a pledge of about $1,820 for one of these little guys without an arm, or $1,935 with the arm included.

This video, made by the AI Lab at SoftBank Robotics Europe, shows how state-of-the-art deep learning can be applied to identify traversable terrain from Pepper’s RGB cameras. Three approaches are evaluated: pixel-wise segmentation, pixel-wise depth estimation, and full-image classification. The first two are based on off-the-shelf deep nets trained on the SUN RGB-D and NYUv2 databases, and they produce interesting results despite no fine-tuning to Pepper’s hardware. The third uses a deep net adapted from AlexNet and trained on a database built inside SBRE’s offices with Pepper’s cameras. Each approach is first evaluated by manually moving Pepper; then autonomous navigation based on the full-image classification is demonstrated.
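To make the pipeline concrete: once a segmentation net has produced per-pixel labels, a simple post-processing rule can turn that label map into a go/no-go decision for the region ahead of the robot. This is an illustrative sketch of that last step, not SBRE’s implementation; the label names and threshold are assumptions:

```python
def traversable(label_grid, floor_label="floor", threshold=0.9):
    """Return True if the bottom half of the image is mostly floor.

    `label_grid` is a list of pixel rows (top to bottom), each row a list
    of class labels as a segmentation network would produce. The bottom
    half of the frame roughly corresponds to the ground nearest the robot.
    """
    bottom = label_grid[len(label_grid) // 2:]
    pixels = [label for row in bottom for label in row]
    floor = sum(1 for label in pixels if label == floor_label)
    return floor / len(pixels) >= threshold
```

The full-image classification approach in the video skips this step entirely by training the network to emit the traversability decision directly, trading interpretability of the intermediate label map for a simpler pipeline.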

The U.S. Army Research Lab has been collaborating with universities on a research program called Micro Autonomous Systems and Technology (MAST) since 2008. Here are live demonstrations of some of the results, featuring robots (and people!) that we know and love: