We were honored to be invited to speak at this year's event, and delighted to meet so many inspiring people making the world a better place through Open Hardware. The OHS will be posting video of the talks soon, but you can also read a summary in this recap from Making Society.

We also had a wonderful conversation with Alicia Soria of Postdigital Node on how robotics can blur boundaries between analog and digital domains:

We also want to give a final special thanks to Maker Faire Rome for sponsoring the summit, the organizers of this year's event (Addie Wagenknecht, Simone Cicero, and Nahid Alam), and the Ada Initiative, an organization supporting women in open tech and culture, for granting Madeline one of this year's Ada Lovelace Fellowships.

Industrial robots (IRs) are expensive, closed, proprietary machines. Traditionally, these specialized CNC machines were limited to industrial & manufacturing applications. They are extremely useful in scenarios where it is safer, cheaper, faster, or more efficient to use a robot instead of a human: not only can IRs lift heavy equipment, handle dangerous materials, and move with extreme precision at rapid speeds, they can also work 24/7 with minimal maintenance.

The impact of industrial automation has been felt in the US workforce for over half a century. In 1961, UNIMATE, the first industrial robot, joined the General Motors assembly line to automate dangerous processes in die-casting. Although mechanized manufacturing had been around for over 200 years, the integration of UNIMATE marked the first time a robot – a programmable arm – shared the same workspace as a person. Rather than augmenting or streamlining human labor, it replaced us. UNIMATE signaled the larger cultural implications for the future of robotics: robotics would no longer be a subject of science fiction or laboratory research; it would be an everyday reality of the workplace.

George Devol's UNIMATE: early generation arm in 1961 (left) and the robotic workforce on GM's assembly line in 1972 (right).

The past decade has seen dramatic job displacement in the global manufacturing workforce. Industrialized nations are incorporating industrial robots at much higher rates than human workers (see the bottom-left figure). Moreover, experts anticipate that these job displacement trends will spread to workforces outside of manufacturing. A 2014 Pew Research Center report – AI, Robotics, and the Future of Jobs – surveyed nearly 2,000 experts on the near-future impact of robotic advances and AI on the global workforce. The three overarching concerns from respondents were:

Impacts from automation have thus far impacted mostly blue-collar employment; the coming wave of innovation threatens to upend white-collar work as well.

Certain highly-skilled workers will succeed wildly in this new environment—but far more may be displaced into lower paying service industry jobs at best, or permanent unemployment at worst.

Our educational system is not adequately preparing us for work of the future, and our political and economic institutions are poorly equipped to handle these hard choices.

CHANGING THE STATUS QUO

While the momentum pushing us towards an automated workforce seems daunting, now is the time to reframe the existing relationships between people and industrial robots. The current model within industrial manufacturing places IRs as adversaries – "they are coming for our jobs" as described by Wired, Scientific American, MIT Technology Review, and countless other technology magazines. However, adapting IRs for uses outside of manufacturing domains can change industrial robots from adversaries into collaborators. Instead of developing ways for IRs to replace human labor, a collaborative model can create ways for IRs to augment, amplify, and extend human capabilities and creativity.

Existing creative applications of industrial robots tend to fall into three categories: fabrication, hybrid art, and interaction & telepresence. Below are brief descriptions of each, and a few notable example projects.

Fabrication

Industrial robots are used to revive high-skill crafts that have been supplanted by industrial manufacturing and mass production processes: for example, carpentry, masonry, or plastering.

HYBRID ART

Industrial robots are used in combination with traditional artistic practices: for example drawing, sculpting, painting, or performance.

INTERACTION & TELEPRESENCE

Industrial robots are used to intervene in social settings or as a physical proxy for remote users.

I'm very fortunate to have access to a multi-material 3D printer for the coming months. Thus far, I've only worked with powder and plastic-based 3D printers, so I'm eager to find the advantages and limitations of resin-based printers. For the next few months I'll be working with an Objet260 Connex, a polyjet printer that allows you to mix two materials to create up to 14 different material properties.

As my first test with the machine and workflow, I decided to try to recreate a small material sample from Neri Oxman and Iris van Herpen's Anthozoa Cape & Skirt. It was printed by a similar machine, although using different materials than what I have available.

I created a simplified parametric model of a polyp colony, using Tango Black (rubber) for the base and tops, and Vero Clear (rigid plastic) for the shells.

If you have Rhino5 and Grasshopper, you can download this .zip file to play around with the script. Adding/moving points will change the shape of your colony.

A few things I learned from the printing process are below:

Although you can mix the two materials to create 14 material properties, there's no default way to actually blend materials. So if you want a gradient from soft to hard or light to dark, you need to plan your digital geometry to get the desired material effect.

Objets use a ton of support material. It's very difficult to remove from complex/porous/hollow geometry, and you risk damaging any delicate detail in the process.

Provide decent surface area when changing materials. As you can see in the images below, the smaller polyps lost their rubber tops during cleaning, while the larger ones survived. All of the polyps have a strong connection to the rubber base.
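Since the printer won't blend materials for you, one common workaround for the gradient problem above is to dither the two materials along the geometry. This is a minimal sketch of that idea, not part of my actual workflow; the function and its thresholding scheme are illustrative:

```python
def dither_column(n_voxels, soft="TangoBlack", hard="VeroClear"):
    """Approximate a soft-to-hard gradient by interleaving two materials
    along a column of n voxels: an error accumulator decides when the
    growing 'hard' ratio has earned a hard voxel."""
    column = []
    acc = 0.0
    for i in range(n_voxels):
        ratio = i / max(1, n_voxels - 1)  # 0.0 = all soft, 1.0 = all hard
        acc += ratio
        if acc >= 1.0:
            column.append(hard)
            acc -= 1.0
        else:
            column.append(soft)
    return column
```

The same accumulator trick works in 2D or 3D per-voxel; the point is that the gradient lives in your digital geometry, not in the printer settings.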

Morning Breath is a physical app that connects two people in different time zones. The project was created for the Motorola and 3D Systems sponsored MAKEwithMOTO tour, and was inspired by the customizable hardware of the newly released Moto X phone. Morning Breath combines hardware and software to craft a hybrid digital-physical experience. The cell phone acts as a tangible interface for transmitting the sensation of sleeping next to your loved one, no matter where they are in the world. The app listens to the sound of one partner's gentle breathing while sleeping, then transmits the rhythm to the other partner's phone. A micro-fan embedded into a custom 3D printed case then pulses to recreate the subtle experience in the hand.
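The listening half of the pipeline boils down to finding breath events in a smoothed audio envelope and timing the fan pulses to them. Here is a toy sketch of that detection step; the threshold and envelope input are illustrative assumptions, not the app's actual code:

```python
def breath_peaks(envelope, threshold):
    """Return indices where a smoothed audio envelope rises above the
    threshold -- one crossing per breath, used to time the fan pulses."""
    peaks = []
    above = False
    for i, value in enumerate(envelope):
        if value > threshold and not above:
            peaks.append(i)  # rising edge: the start of an inhale/exhale
            above = True
        elif value <= threshold:
            above = False
    return peaks
```

The intervals between successive peaks give the breathing rhythm to transmit; the receiving phone replays them as fan pulses.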

The project was featured on 3D Systems' blog – see the full write-up here!

A HACKDAY organized by Plot London for Berg's incredible new tangible interface, the Little Printer. Plot held the workshop at CMU's C0DELAB, where a group of architects, engineers, designers, and artists got together to prototype new publications for the Little Printer.

Our relationship to the built environment is becoming increasingly reliant on pervasive digital technology and decreasingly reliant on the physical embodiment of place. The digital devices we append to our bodies, embed in our spaces, and lay as ubiquitous infrastructure are becoming the interface that mediates our interactions with the world around us. THIS COULD GET WEIRD examines the value of reclaiming this interface as an architectural territory. The lecture argues for integrating a transdisciplinary and anti-disciplinary ethos into the conservative domain of Architecture, and discusses the messiness that would consequently ensue. Lastly, the lecture demonstrates current design research that focuses on the interface as a primary facilitator for interstitial interactions between digital and analog datums.

SeaLegs, a responsive environment for “Mollusk-Aided Design”, harnesses the power of a simulated virtual squid to generate baroque and expressive spatial forms. Specifically, the project uses “chronomorphology” — a 3D analog to chronophotography — to develop complex composite forms from the movements of a synthetic creature. Within the simulated environment the creature can be manipulated for formal, spatial, and gestural variation. Internal parameters (the number of legs and joints per leg) combine with external parameters (such as drag and repulsion forces) to create a high level of control over the creature’s responsiveness and movement through the virtual space. As the creature’s movements are traced through space and time, its familiar squid-like motion aggregates into unexpected, intricate forms. The resulting forms are immediately ready for fabrication, and can be exported to high-resolution 3D printers.
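The chronomorphology idea above can be sketched simply: sample the positions of a simulated, undulating limb over time and keep every sample, so the motion itself aggregates into a composite point cloud. The code below is a toy 2D stand-in for the actual simulated squid; the joint counts, sway amounts, and function names are invented for illustration:

```python
import math

def leg_tip(t, n_joints=4, seg_len=1.0, sway=0.6):
    """Tip position of a jointed leg at time t: each joint adds a small,
    time-varying rotation, a simplified stand-in for squid-like undulation."""
    x = y = 0.0
    angle = 0.0
    for j in range(n_joints):
        angle += sway * math.sin(t + j * 0.7)  # per-joint phase offset
        x += seg_len * math.cos(angle)
        y += seg_len * math.sin(angle)
    return (x, y)

def chronomorph(samples=100, duration=6.0):
    """Trace the creature's motion through time and keep every sample --
    the composite 'chronomorphology' of the movement."""
    return [leg_tip(i * duration / samples) for i in range(samples)]
```

In the real project the traced geometry is 3D and carries enough thickness to be exported directly for high-resolution printing; the aggregation step is the same.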

Machine Drawing Drawing Machines, a project for Pablo Garcia, is a narrative folio in which the patents of 12 historically significant drawing machines are recreated with an industrial Computer Numerical Control (CNC) machine. Although CNC drawing is well-trodden territory, this project carries poetic undertones in the ironic use of a highly specialized and powerful piece of manufacturing machinery to achieve a level of detail and accuracy that the last five centuries of drawing machines could only aspire towards.

A speed project with the incredible Addie Wagenknecht of NORTD Labs during her one-week residency at the STUDIO for Creative Inquiry. For the micro-residency, Addie wanted to do a second installment of her 'Optimization of Parenthood' series ... a baby-rocking industrial robot! To make this a reality, we put together a small program to make our ABB IRB 140 gently rock a bassinet back-and-forth, then a bit more rigorously when the baby stirs:
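The control logic amounts to a gentle periodic rock whose amplitude increases when the baby stirs. The real program ran on the ABB controller, so this Python sketch is only a hedged illustration of the logic; the function names, thresholds, and boost factor are assumptions:

```python
import math

def rock_amplitude(base_deg, stir_level, threshold=0.3, boost=1.6):
    """Rocking amplitude in degrees: gentle by default, stronger when the
    stir level (0..1, e.g. from a microphone) crosses a threshold."""
    return base_deg * (boost if stir_level > threshold else 1.0)

def rock_angle(t, period_s, amplitude_deg):
    """Joint angle at time t for a smooth, sinusoidal back-and-forth rock."""
    return amplitude_deg * math.sin(2 * math.pi * t / period_s)
```

A loop sampling the microphone, recomputing the amplitude, and streaming `rock_angle` targets to the robot is all the "program" really is.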

MADLAB was invited to lead a 3 day intensive workshop in physical computing for Florida International University’s School of Architecture. Over 22 hours, 18 architecture students and 3 professors learned how to translate physical input into meaningful embedded interaction with the built environment. The workshop introduced the Arduino prototyping platform, as well as techniques in soldering, electronics, programming, hacking, and debugging. The participants left with the knowledge to choreograph kinetic systems using sensors and actuators for affective interaction.

MADLAB will be traveling to Miami, FL this July as an invited critic for project 41: a tale of cities … that share one street. Presentations will review the nomadic design studio’s investigations into novel techniques in urban documentation and speculative mobile infrastructure. Project 41 is a coordinated studio between the Department of Urban Speculation at the University of Illinois Chicago, Andrew SantaLucia, and Malik Benjamin for Florida International University’s 4th Year Accelerated Masters Architecture Studio.

This speed project with Marynel Vázquez was inspired by the work of a talented community of artists and designers that are using video mapping as a medium to reinterpret and transform banal, expected environments. The work of Pablo Valbuena was a strong influence over our explorations, and we sought to introduce dynamic interactivity to augmented sculpture as our novel addition to this community.

With the relative novelty of the KINECT, our response to its inherent screen-based interaction was to pull it back out into the physical realm. Developed in C++ with openFrameworks and OpenNI, we are using the depth mapping capabilities of the KINECT to evaluate the participant’s hand, and position it as the light source of the physical model. In effect, their hand becomes the sun, lighting or dimming our abstracted cityscape, and blurring the border of virtual and actual.
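The hand-as-sun effect reduces to treating the tracked hand as a point light and shading the model from it. A minimal sketch of the shading math, assuming simple Lambertian (cosine) shading with inverse-square falloff (the actual project was written in C++/openFrameworks; this function is illustrative):

```python
import math

def lambert_brightness(surface_pt, normal, light_pt, intensity=1.0):
    """Brightness at a surface point lit from the tracked hand position:
    cosine of the angle to the light, attenuated by squared distance."""
    lx = light_pt[0] - surface_pt[0]
    ly = light_pt[1] - surface_pt[1]
    lz = light_pt[2] - surface_pt[2]
    dist = math.sqrt(lx * lx + ly * ly + lz * lz)
    if dist == 0:
        return intensity
    # cosine of angle between the unit surface normal and light direction
    cos_theta = (normal[0] * lx + normal[1] * ly + normal[2] * lz) / dist
    return max(0.0, intensity * cos_theta / (dist * dist))
```

Evaluating this per projected pixel, with the Kinect's nearest-hand point as `light_pt`, gives the dimming-and-brightening behavior as the hand moves over the model.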