{"ajax":"true","data":{"id":"1","name":"Outer Skills","created_at":"2014-02-14 18:46:43","updated_at":"2014-03-05 21:55:22","description":"Tools help us bring ideas to life. From hammers to lasers, they give us new abilities. Today's tools are becoming increasingly advanced: sensors, algorithms, and cameras are redefining our reliance on them. They are starting to guide us and teach us, giving us a set of Outer Skills to create new and exciting things.","tagline":"","permalink":"outer-skills","color":"#E5BB45","order":"3","video":{"id":"6","vimeo_id":"86742210","created_at":"2014-02-16 03:09:09","updated_at":"2014-02-26 19:15:04","videoable_id":"1","videoable_type":"Theme","image":"530e3d38084ee_cover-vimeo-outer-skills.jpg"},"principals":[{"id":"2","name":"Embedded Knowledge","description":"

{{type:concept|tag:five-spoons-concepts|imagetag:5311124d3eb46|imgsize:w500|desc:0}}We make tools to help us solve specific problems. They carry embedded knowledge that helps us get things done. Think about a hammer: the claw that removes nails is a piece of embedded knowledge. Someone solved that problem years ago and built it into the tool. New tools are more dynamic; they respond to how we work and, in some cases, learn our habits.{{type:concept|tag:five-spoons-concepts|imagetag:5311124d3eb46|imgsize:w740|desc:0|id:concept-five-spoons-mobile}}\r\n\r\n

What could new tools with embedded knowledge look like? Will they read our minds? Will they change behaviors based on context? We wondered how embedded knowledge could help in the kitchen with everyday cooking.\r\nPlay Five Helpful Spoons 0:38","created_at":"2014-02-15 03:39:32","updated_at":"2014-03-05 21:56:49","theme_id":"1","links":[{"id":"12","title":"Liftware","url":"https:\/\/www.liftlabsdesign.com\/","image":"530d6c819b731_448128719-1280.jpg","created_at":"2014-02-26 04:24:33","updated_at":"2014-03-02 11:39:38","principal_id":"0","tag":"liftware-is-a-spoon-that-counteracts-hand-tremors-streamlining-the-experience-for-individuals-with-limited-motor-ability","linkable_id":"2","linkable_type":"Principal","order":"1","description":"works to counteract hand tremors for individuals with limited motor ability."},{"id":"16","title":"Self-Assembling Robots","url":"http:\/\/video.mit.edu\/watch\/small-cubes-that-self-assemble-25913\/","image":"531318cd86603_photo-1.jpg","created_at":"2014-03-02 11:41:01","updated_at":"2014-03-04 04:23:33","principal_id":"0","tag":"self-assembling-robots","linkable_id":"2","linkable_type":"Principal","order":"3","description":"from MIT reconfigure themselves based on their task."}]},{"id":"6","name":"Assisted Mastery","description":"

It won't be long before our tools guide us through the making process. You may start a process, but they will help you refine your work and perfect things as you go. It's not automation; in fact, the tool will give you more confidence to focus on your ideas while it guides you through execution. It\u2019s an idea we\u2019re calling Assisted Mastery.\r\n\r\n\r\n

{{type:concept|tag:iteration-1-concepts|imagetag:53117b3e11afa|imgsize:w1240|desc:0}}As our tools become more powerful, how will we \u201csharpen our skills\u201d? How can we retain originality as they begin to help us with our process? Could our tools teach us? Could they nudge us when we get stuck? We explored this idea by creating an Iteration Table\u2013an experience that will inspire us through the creative process.\r\nPlay Iteration Table 0:45","created_at":"2014-02-24 16:16:16","updated_at":"2014-03-05 21:58:24","theme_id":"1","links":[{"id":"13","title":"FreeD","url":"http:\/\/web.media.mit.edu\/~amitz\/Research\/Entries\/2011\/11\/15_FREE-D.html","image":"530d7490231c0_shapeimage-3.png","created_at":"2014-02-26 04:58:56","updated_at":"2014-03-04 04:39:13","principal_id":"0","tag":"freed","linkable_id":"6","linkable_type":"Principal","order":"0","description":"is a handheld milling device that guides its user based on a 3D model. It was developed in the MIT Media Lab."},{"id":"17","title":"LittleBits","url":"http:\/\/littlebits.cc\/projects","image":"531319c29ba63_deluxe-kit-bits-942ec0549ca0c167d509ad9f6833c2bf-1.png","created_at":"2014-03-02 11:45:06","updated_at":"2014-03-02 11:45:06","principal_id":"0","tag":"littlebits","linkable_id":"6","linkable_type":"Principal","order":"4","description":"encourages people to prototype with electronics without an understanding of soldering or circuitry."},{"id":"40","title":"Smarter Objects","url":"http:\/\/fluid.media.mit.edu\/projects\/smarter-objects","image":"5318c2d4a5c23_imgres.jpg","created_at":"2014-03-06 11:15:31","updated_at":"2014-03-06 18:47:48","principal_id":"0","tag":"smarter-objects","linkable_id":"6","linkable_type":"Principal","order":"22","description":"is a project by Valentin Heun, a doctoral researcher at the MIT Media Lab, exploring the use of AR technology to program physical objects and their interactions."},{"id":"41","title":"LuminAR","url":"http:\/\/fluid.media.mit.edu\/projects\/luminar","image":"5318c369bd923_smart.jpg","created_at":"2014-03-06 11:17:21","updated_at":"2014-03-06 18:50:17","principal_id":"0","tag":"luminar","linkable_id":"6","linkable_type":"Principal","order":"23","description":"is a responsive robotic arm from the Fluid Interfaces Group at the MIT Media Lab, outfitted with a projector and camera that can turn any surface into an interface."}]}]},"hero":"