
Microsoft's TechFest 2013 reveals a little of what the company's R&D is up to.

Microsoft this week is holding its annual science fair, TechFest 2013, at which it shows off some of the innovations the company is working on. Some of these are headed for release as products immediately; others are early-stage technology that may or may not find its way into commercial offerings. Click through and enjoy a glimpse of the fruits of Microsoft Research (if offbeat research is more your speed, also review the most recent Ig Nobel awards for research).

This is a digital whiteboard that enables the creation of polished presentations using a stylus on a giant touchscreen. If presenters want to draw a graph, they draw an L shape on the screen and the software turns it into a tidy X and Y axis. Labeling the X axis, for example, can be done by writing the first few letters of the unit that X represents. Similarly, drawing a circle invokes a pie chart template, and drawing a flipped and inverted L produces a world map. The idea is for presenters to draw the charts and graphs they need as they talk, while the software automatically cleans them up and populates them with data drawn from a database. See it in action here.
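The shape recognition described above could work along these lines: classify a drawn stroke by its geometry, then swap in the matching template. This is a hypothetical sketch, not Microsoft's implementation; the point format and tolerance are invented for illustration.

```python
# Hypothetical sketch: deciding whether a drawn stroke is an "L" (axes)
# gesture. Points are (x, y) pairs with y increasing downward, as on a screen.

def looks_like_axes(points, tolerance=0.25):
    """Return True if the stroke goes mostly downward, then mostly rightward,
    i.e. roughly traces an L shape that could become X and Y axes."""
    if len(points) < 3:
        return False
    # Treat the lowest point of the stroke as the corner of the L.
    corner = max(range(len(points)), key=lambda i: points[i][1])
    down, right = points[:corner + 1], points[corner:]
    if len(down) < 2 or len(right) < 2:
        return False
    dx1 = abs(down[-1][0] - down[0][0]); dy1 = abs(down[-1][1] - down[0][1])
    dx2 = abs(right[-1][0] - right[0][0]); dy2 = abs(right[-1][1] - right[0][1])
    # First leg mostly vertical, second leg mostly horizontal.
    return dx1 < tolerance * max(dy1, 1) and dy2 < tolerance * max(dx2, 1)

# A stroke drawn down the left edge, then along the bottom:
stroke = [(0, 0), (0, 5), (0, 10), (5, 10), (10, 10)]
print(looks_like_axes(stroke))  # → True
```

A real system would recognize circles, maps, and handwriting the same way, matching each stroke against a library of templates before cleaning it up.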

This software enables 3D scanning using a Kinect for Windows motion sensor. Users move the sensor around a group of objects, say a chess board, and the software creates a 3D model of them. Possible uses? Better 3D modeling, 3D measuring, industrial design, augmented reality and gaming. The video is a demo of an early prototype scanning Sir Isaac Newton's death mask. Kinect Fusion is slated to be added to the next Kinect for Windows software development kit.

This allows hand gestures to replace mouse clicks or touchscreens for drag-and-drop and pinch-to-zoom functions. The Kinect motion detector can recognize whether a hand is open or closed, and use that, along with the hand's motion, to affect what is shown on the screen. Moving two fists apart zooms in; moving them together zooms out, just as a two-finger pinch does on a touchscreen. The feature will be included in the next Kinect for Windows SDK. This video is from The Verge.
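The fist-driven zoom described above boils down to comparing hand separation between frames. Here is a minimal sketch of that mapping; the hand-tracking itself (which the Kinect SDK provides) is assumed, and the coordinates are made up.

```python
# Hypothetical sketch of fist-driven zoom: scale the view by the ratio of the
# current fist separation to the previous one. Hand positions would come from
# the Kinect's skeleton/hand tracking; here they are plain (x, y) tuples.
import math

def zoom_factor(prev_hands, curr_hands):
    """Fists moving apart give a factor > 1 (zoom in);
    fists moving together give a factor < 1 (zoom out)."""
    def dist(hands):
        (x1, y1), (x2, y2) = hands
        return math.hypot(x2 - x1, y2 - y1)
    prev = dist(prev_hands)
    return dist(curr_hands) / prev if prev > 0 else 1.0

# Fists move from 20 units apart to 40 units apart: the view scales 2x.
print(zoom_factor([(40, 50), (60, 50)], [(30, 50), (70, 50)]))  # → 2.0
```

In practice the gesture would only be active while both hands are tracked as closed fists, exactly the open/closed distinction the article mentions.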

This Kinect tool enables the creation of gaming avatars based on a scan of the gamer's body. The gamer can modify the image made by the initial scan by standing in front of the Kinect sensor and gesturing with their arms: making the avatar's body parts larger, adding tails, wings and other appendages, and coloring in the final creation. Check out the video here.

Visualizing data in ways that are easy and collaborative is the aim of SandDance software. For example, it can take U.S. election results by county and represent them on a map of the country, giving a sense of where each party had its strongest support. Using a voice interface, users can say a ZIP code to isolate the data for that area. Data depicted on a digital whiteboard can be shared with a tablet, and modifications made on the tablet can be shared back with the whiteboard. The goal is to make it easier to see structure in data and to manipulate it to explore details. See the demo.
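The ZIP-code isolation described above is, at its core, a filter over tabular records keyed on the spoken value. A minimal sketch, with invented records and field names (SandDance's actual data model is not public here):

```python
# Hypothetical sketch of the voice-driven ZIP filter: given tabular election
# results, isolate the rows matching a recognized ZIP code.
# Records and field names are made up for illustration.
results = [
    {"zip": "98052", "county": "King, WA", "party": "Party A", "votes": 51234},
    {"zip": "98052", "county": "King, WA", "party": "Party B", "votes": 48110},
    {"zip": "10001", "county": "New York, NY", "party": "Party A", "votes": 30500},
]

def isolate(records, zip_code):
    """Return only the records for the (voice-recognized) ZIP code."""
    return [r for r in records if r["zip"] == zip_code]

for row in isolate(results, "98052"):
    print(row["party"], row["votes"])
```

A visualization layer would then redraw the map or chart from the filtered subset, and a synced tablet would receive the same filtered view.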

This demonstration shows adaptive machine learning in a streaming environment. The example used in the video demo is an automated quality-control check of silicon chips being etched on an assembly line. By feeding the machine data about actual flaws found by conventional means on the assembly line, the software learns more about what to look for, so automated inspection machines can catch more flaws sooner. Using this method, the percentage of flawed chip sheets making it through the manufacturing process dropped from 15% to 1%.
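The feedback loop described above can be sketched with a simple online learner: each time conventional inspection reveals a sheet's true label, the model nudges its weights toward it. This is an illustrative perceptron-style sketch, not Microsoft's algorithm, and the feature vectors are invented.

```python
# Hypothetical sketch of streaming quality control: an online perceptron
# updates its weights whenever conventional inspection supplies the true
# label, so the automated check improves as the line runs.

def predict(weights, bias, features):
    """Flag a chip sheet as flawed (1) or good (0)."""
    score = bias + sum(w * f for w, f in zip(weights, features))
    return 1 if score > 0 else 0

def update(weights, bias, features, label, lr=0.1):
    """Nudge the model toward the label found by conventional inspection."""
    error = label - predict(weights, bias, features)
    weights = [w + lr * error * f for w, f in zip(weights, features)]
    return weights, bias + lr * error

weights, bias = [0.0, 0.0], 0.0
# Each stream item: (sensor readings for a sheet, inspection result).
stream = [([0.9, 0.8], 1), ([0.1, 0.2], 0), ([0.8, 0.9], 1), ([0.2, 0.1], 0)]
for _ in range(5):  # a few passes over the accumulated feedback
    for features, true_label in stream:
        weights, bias = update(weights, bias, features, true_label)

print(predict(weights, bias, [0.85, 0.9]))  # → 1 (flagged as flawed)
```

The key property is the one the article describes: the model is corrected on the line, by the line, rather than trained once and frozen.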