"Martin Kaltenbrunner, co-founder of Reactable Systems, is a Ph.D. candidate at the Pompeu Fabra University in Barcelona, Spain. His research concentrates on tangible user interfaces and human computer interaction in general, topics he has been also teaching at the Kunstuniversität Linz, Universitat Pompeu Fabra and UCP Porto. Recently he has been mainly working on the human computer interaction concepts of the Reactable - an electronic musical instrument with a tangible user interface. He is author of the open source tangible interaction framework reacTIVision and the related TUIO protocol, which has been widely adopted for open source multi-touch applications."

"EDiT has been used for shared sketching and note taking during the 21min event. Gilead Sher, the former Israeli negotiator was one of the guests, showed great interest in the technology and application."Note: Some of the video is in Italian.

Stefano was asked to produce an interactive sketching experience using multi-touch technology to assist the speakers and the chairman of the event (Patrizio Paoletti) in drawing knowledge structures. Pierpaolo Vittoria, a mind mapper, used Stefano's application to record the ideas shared during the 21 minutes event.

More about Stefano: Stefano Baraldi's Ph.D. research topic was "TableTop Interaction for the Management of Cognitive Structures", in which he investigated "the emergent world of TableTop devices and interaction techniques applied to groupware, knowledge management, enhanced meetings and constructive learning."

I am especially looking forward to reading his chapter in an upcoming book: S. Baraldi, "Making the Classroom a Playground for Knowledge", chapter in "The Classroom of the Future", edited by Mäkitalo-Siegl, Kaplan, Zottmann & Fischer, pages 30-60 in the section "Knowledge building in physical and virtual learning spaces", Sense Publishers.

SOMEWHAT RELATED
The video below is a mixed-reality project Stefano worked on during his graduate studies. This project was a collaboration between the University of Florence, the University of Bologna, and other researchers, using tangible user interfaces:
TANGerINE Inspirational Cube

Fantasy HCI! My wish is to have my own lab so I can create and test out various interactive applications that run on screens of all sizes, and play with new interactive gadgets and displays. I'd also like to provide mobile lab services so I can go out and see how emerging technologies play out in real-life situations and settings, during the design & development process as well as after-market.

I'd like to focus on the social-collaborative & cognitive aspects of emerging technologies. Because of my background in school psychology, I'd work towards ensuring that new applications, technologies, and systems follow the guidelines of Universal Design for Learning as well as Universal Usability. I have some ideas about the transdisciplinary characteristics I'd like to see for members of the lab's Dream Team, but I'm saving that for another post. Now I just need to win the lottery so I can hire my team and run with the ball. Team Charlotte, N.C., anyone?

FYI: The HCI link is to a blog that corresponds to the Theory and Research in Human Computer Interaction class at Rensselaer Polytechnic Institute. For more information about HCI, visit the Human-Computer Interaction Resources website.

The video is of the Stantum Slate PC, via Netbooked's YouTube channel. The system in the video is running on a modded Dell Mini 10, and doesn't require calibration. Notice how the system easily handles a variety of interactions: fingers, thumbs, pinch, rotation, multiple-finger swipes, brush strokes, fingernail action, stylus, and more.

Note: The 2010 Interactive Tabletops and Surfaces conference will be held in Germany, and Johannes and others will be involved with running it. You can follow the news on Twitter: http://twitter.com/its_Germany2010. The link to the conference website will be up soon, at http://www.its2010.org/ .

"A multi-touch and multi-user version of the classical Risk game. As a platform Nasa World Wind (WWJ) and the Java implementation of Risk "Domination" by yura.net were used. A authentication method (that was also integrated in the game) can be found in the last (next) video. Thanks to Klaus Drerup & Wadim Hamm." TEAM: Klaus Drerup, Wadim Hamm, Florian Daiber, & Johannes Schoening. Music by cycom: mathematics.

User Authentication on Large Multi-touch Wall with Mobile Device

"The exploitation of ﬁnger and hand tracking technology based on infrared light, such as FTIR, Diffused Illumina- tion (DI) or Diffused Surface Illumination (DSI) has enabled the construction of large-scale, low-cost, interactive multi-touch surfaces. In this context, access and security problems arise if larger teams operate these surfaces with different access rights. The team members might have several levels of authority or speciﬁc roles, which determine what functions and objects they are allowed to access via the multi-touch surface. In this video we present ﬁrst concepts and strategies to authenticate with a large-scale multi-touch wall using a mobile device."

"This video shows the GlobalData application in use on an Archimedes SessionDesk http://www.archimedes-products.com/seThe application was used to illustrate our GeoLens concept. GeoLenses are GUI widgets that can be used like scalable as well as zoomable magnifying lenses to allow synchronous multi-user interaction in GIS systems."

I've been exploring the contributions of artists to the world of interactive digital media. Here are videos of some of the interesting works I've come across recently. Some of the videos are of older works, but were new to me.

INTERACTIVE KINETIC SCULPTURE
Kinetic Pond

(I'm still searching for more information regarding the Kinetic Pond.)

"Insert 20p and select one of a range of prayers. An interactive sculpture which gives you back the money after providing an interesting message. Warning not to be used by the holy or holey. The prayers were about relationships with various chocolate bar brands." It Pays to Pray Description

Description of Cultural Analytics, from the Software Studies Initiative Website

"The explosive growth of cultural content on the web including social media since 2004 and the digitization efforts by museums, libraries, and companies since the 1990s make possible fundamentally new paradigm for the study of both contemporary and historical cultures. We can use computer-based techniques for quantitative analysis and interactive visualization already commonly employed in sciences to begin analyzing patterns in massive cultural data sets. To make an analogy with "visual analytics," "business analytics," and "web analytics," we call this new paradigm cultural analytics."

"We believe that a systematic use of large-scale computational analysis and interactive visualization of cultural data sets and data streams will become a major trend in cultural criticism and culture industries in the coming decades. What will happen when humanists start using interactive visualizations as a standard tool in their work, the way many scientists do already? If slides made possible art history, and if a movie projector and video recorder enabled film studies, what new cultural disciplines may emerge out of the use of interactive visualization and data analysis of large cultural data sets?"
"The idea of Cultural Analytics was first presented by Lev Manovich in 2005. Software Studies Initiative founded at Calit2 in 2007 made possible to turn this vision into a research program. By drawing on the cutting-edge cyberinfrastructure and visualization research at Calit2 as well as world reputation of UCSD in digital arts and theory, we are able to develop a unique research agenda which complements other projects in digital humanities and "cyberscholarship":

The slideshare presentation is by Anne Helmond, a New Media PhD candidate with the Digital Methods Initiative at the Mediastudies department at the University of Amsterdam where she studied New Media from 2004-2008. She is focusing her work "on the emerging field of Software Studies, which addresses the role that software plays in our society."

The presentation caught my eye because I've been using my blogs as on-line file cabinets, and discovered that my careful tagging, designed to help me search my own posts, has been something highly favored by search engines. Anne has given this topic some deep thought, as you can see from the presentation:

Blogging and the blogosphere through the eyes of software and search engines

"Software Studies is a new research field for intellectual inquiry that is now just beginning to emerge. The very first book that has this term in its title was published by The MIT Press in June 2008 (Matthew Fuller, ed., Software Studies: A Lexicon). In August 2008 The MIT Press approved Software Studies book series, with Matthew Fuller, Noah Wardrip-Fruin and Lev Manovich as editors."

"The Software Studies Initiative intends to play the key role in establishing this new field. The competed projects will become the models of how to effectively study “software society.” Through workshops, publications, and lectures conducted at UCSD and disseminated via the web and in hard copy publications, we will disseminate the broad vision of software studies. That is, we think of software as a layer that permeates all areas of contemporary societies. Therefore, if we want to understand contemporary techniques of control, communication, representation, simulation, analysis, decision-making, memory, vision, writing, and interaction, our analysis can't be complete until we consider this software layer. By being the very first center of its kind, The UCSD Software Studies Initiative has the unique opportunity to shape how this software layer will be understood and studied by other universities, programs, and centers in years to come."

"Social scientists, philosophers, cultural critics, and media and new media theorists now seem to cover all aspects of the IT revolution, creating a number of new disciplines such as cyber culture, Internet studies, new media theory, and digital culture. Yet the underlying engine that drives most of these subjects – software – has received little or no direct attention. Software is still invisible to most academics, artists, and cultural professionals interested in IT and its cultural and social effects. But if we continue to limit critical discussions to the notions of “cyber,” “digital,” “new media,” or “Internet,” we are in danger of always dealing only with effects rather than causes; the output that appears on a computer screen rather than the programs and social cultures that produce these outputs. This is why we are convinced that “software studies” is necessary and we welcome you to join us in our projects and activities....“software studies” translates into two complementary research paradigms. On the one hand, we want to study software and cyberinfrastructure using approaches from humanities, cultural criticism, and social sciences. On the other hand, we want to bring software-based research methods and cutting-edge cyberinfrastructrure tools and resources or the study of the new domain where they have not being applied so far – large sets of cultural data."

Nov 25, 2009

What is NexCave? It is an array of LCD panels that provides a projector-free visualization display that enables the visualization of massive datasets in great detail, at high speeds. It was created at Calit2's Virtulab, under the direction of Research Scientist Tom DeFanti. The bonus of this system is that it is much less costly than traditional VR Cave projection systems.

RELATED
JVC Introduces the NexCAVE System
"JVC’s Professional Products division is proud to announce today that the California Institute for Telecommunications and Information Technology has developed a new immersive visualization system they call the NexCAVE. This device uses nine GD-463D10U 3D HD monitors to give the user the feeling that they are in the environment. All of these displays feature a 46” diagonal screen, full HD (1920 x 1080) resolution, and 2000:1 contrast ratio. The NexCAVE is created by the same developers who created the CAVE system, which uses 3D projectors to turn a room into a 3D environment. The use of monitors instead of projectors allows for a more compact system that can also be portable for traveling purposes. Unfortunately there isn't any word at this time on when the NexCAVE will be released." -HDTV Review 11/24/09 (Via ITVT)

University of California's Calit2 Develops Immersive 3D Visualization System Using JVC Monitors -Tracy Swedlow, InteractiveTV Today 11/24/09
"Calit2 research scientist Tom DeFanti and his partner, Dan Sandin, began designing visualization systems over 35 years ago when they co-founded the Electronic Visualization Laboratory at the University of Illinois at Chicago. According to DeFanti, back in 1991 the pair conceived of the original CAVE system using projectors to reconstruct a 3D surround environment. According to JVC, early projector-based virtual reality (VR) systems were generally limited by two major problems: resolution was "fair at best," due to limitations in computer processing power and projector technology; and the systems required a very large dedicated space--since users could block images projected by front projectors, rear projectors, which required sufficient throw distance, were necessary."

Ever since I explored the Hard Rock Cafe Memorabilia website on my HP TouchSmart PC, I've been on the look-out for other great touch-friendly applications created with Microsoft's Deep Zoom and Silverlight. Today, I came across an example that holds some promise, although it needs some tweaking before it is truly touch-ready.

352 Media Group is a web development firm that has been experimenting with Microsoft's Deep Zoom in Silverlight. The results can be seen on the 352 Media Group Deep Zoom Page. On this page, you can interact with the deep zoom wall. You might need to install a Silverlight plug-in on your browser. Scroll down and read the "How Did We Do It?" section for specifics.

Note: I tried this in three browsers on my HP TouchSmart PC: Google Chrome, Internet Explorer, and Firefox. At the top of the viewing box, it says, "Click inside to zoom in". Clicking the picture or touching my touch screen did not activate the zoom. However, I was able to zoom in on the wall by scrolling with my mouse.

If you touch the picture with your finger, you can move it around, and you can do this with your mouse as well. At the upper left-hand corner of the frame, there are tiny icons that will allow you to zoom in or out. If the icons were just a little bit larger, with just a little bit more space between them, it would be easy to activate the zoom feature with my finger.

Nov 24, 2009

This demo was created by the Emerging Experiences team at Razorfish. Here's the video description from Vimeo:

"Customers are being faced with increasingly complex buying decisions, especially when it comes to technology and services. As a result, increased pressure is being placed on store associates to provide knowledgeable service to customers. Our Emerging Experiences team used this opportunity to develop a solution to demonstrate how an immersive interactive experience can assist customers and store associates with complex buying decisions in a retail setting."

Nov 23, 2009

I missed this one! The video and photos below are of the Sprint Center Interactive Wall, powered by GestureTek's 3D depth-sensing system. The media art was created by Takashi Kawashima, a designer/media artist who lives in San Francisco. He has an MFA in Design | Media Arts from UCLA.

Since I am usually crunched for time, I thought I'd try posting "morning tech news" on this blog in a brief format, and return to the topic later - hopefully later in the day or at the most, within the week.

If you are familiar with this blog, what I consider "news" is sometimes just new to me. It might be something that crossed my path a while ago that I never posted. It might be something that I missed. It doesn't even have to be "news", if it is something that is unique, catches my fancy, or is an important innovation that I think should be followed and shared.

"By building thin, flexible silicon electronics on silk substrates, researchers have made electronics that almost completely dissolve inside the body. So far the research group has demonstrated arrays of transistors made on thin films of silk. While electronics must usually be encased to protect them from the body, these electronics don't need protection, and the silk means the electronics conform to biological tissue. The silk melts away over time and the thin silicon circuits left behind don't cause irritation because they are just nanometers thick."

RELATED
WIRED's Gadget Lab: The Illustrated Man: How LED Tattoos Could Make Your Skin a Screen - Charlie Sorrel, 11/20/09
"The silk substrate onto which the chips are mounted eventually dissolves away inside the body, leaving just the electronics behind. The silicon chips are around the length of a small grain of rice — about 1 millimeter, and just 250 nanometers thick. The sheet of silk will keep them in place, molding to the shape of the skin when saline solution is added.

These displays could be hooked up to any kind of electronic device, also inside the body. Medical uses are being explored, from blood-sugar sensors that show their readouts on the skin itself to neurodevices that tie into the body’s nervous system — hooking chips to particular nerves to control a prosthetic hand, for example."

"Brian Litt, associate professor of neurology and bioengineering at the University of Pennsylvania, is working with researchers from Beckman Institute at the University of Illinois and Tufts University to develop medical applications for the new transistors. Their silk-silicon LEDs can act as photonic tattoos that can show blood-sugar readings, as well as arrays of conformable electrodes that might interface with the nervous system."

SOMEWHAT RELATED
I've been thinking about flexible touch-screen applications, and it never occurred to me that the concept might be something that would transfer to human skin! Here are a few of my posts related to this topic:

Nov 21, 2009

Sharath Patali, a member of the NUI-Group, has been working with Python Multitouch, otherwise known as PyMT, to create multi-touch applications. He shared a link to a recent post in Make, featuring PyMT. Sharath is the author of the UI Addict blog, and is currently doing his internship at NUITEQ (Natural User Interface Technologies).

I've been told that the beauty of PyMT is that it makes it "easy" to create multi-touch prototype applications using very few lines of code, which is great for trying out different ideas in a short period of time. It helps if you already know Python!

"PyMT is a python module for developing multi-touch enabled media rich applications. Currently the aim is to allow for quick and easy interaction design and rapid prototype development. PyMT is written in Python, based on pyglet toolkit."

Note:
Christopher, author of The Space Station blog, is a member of the NUI-Group, and is building his own multi-touch table running his PyMT-based applications. Christopher is a student in Koblenz, Germany, studying computational visualistics, known as information visualization in the US.

The SMART Table from Smart Technologies now features the Image Reveal application, created by Vectorform, that supports multi-touch, multi-user collaborative learning activities for children. The Image Reveal is the first third-party application published for the SMART Table, and is available for free from the SMART website.

"Vectorform was eager to collaborate with SMART to create an early learning application for the SMART Table, which it feels is a groundbreaking technology product. Image Reveal enables young users to collaborate and answer a series of multiple choice questions in a chosen subject area. Each correct answer uncovers part of a hidden image until it is fully visible. Alternatively, students can guess what the hidden image is at any time to win the game. Using the SMART Table Toolkit, teachers can customize content, including subject area, hidden image, questions and answers, and use images to tailor questions and answers for pre-literate learners." -SMART Tech Press Release

SMART Table Introductory Video:

It is good news to see that SMART Technologies is providing new applications for the SMART Table. There is much room for growth in this field. However, the applications still have the look and feel of electronic workbooks, with a few interactive media bells and whistles tossed in to ensure that the system appeals to young learners. I wonder if the application supports teaching the skills children need to successfully work together, such as turn-taking, negotiating with other children in a group situation, or settling differences of opinion.

Classrooms in elementary schools now contain a growing number of students who have autism spectrum disorders, as well as other disabilities that interfere with social interaction. For this reason, it would be important to learn if SMART Table applications follow the guidelines for Universal Design for Learning (UDL).

The video below shows people in NYC's Times Square using their Verizon Droid phones to interact with the Verizon Wireless digital signage billboard:

The Droid offers a voice-activated search feature. Users can ask a question, and the search engine, powered by Google, will provide the search results from the web or from items stored on the phone. One feature I like is that it provides turn-by-turn directions from Google Maps, as well as other helpful geographic information. This would be a great tool for city dwellers and visitors alike.

The video below is a demonstration of how the Google Maps Navigation feature works on Android-based phones:

Nov 19, 2009

Tobi, an on-line shopping website, has a virtual dressing room with hundreds of dresses waiting to be tried on. Take a snapshot, share it on Facebook, and the process is elevated to a form of social fashionista networking.

The video below explains it all:

I'm not sure if the Tobi website will be offering a virtual dressing room for men.

"This video demonstrates the N-trig DuoSense true multi-touch solution utilizing up to four fingers. The video features various multi-touch enabled applications, including how to pan and rotate using up to four fingers on Google Earth, a demonstration of how to play various onscreen musical instruments using the Snowflake Suite Music application, and a new hands-on way to play Sudoku. The Corel Paint it!™ application shows how existing images can be transformed using multi-touch, and a 3D desktop organizer application from BumpTop demonstrates new and innovative ways in which to organize your desktop using up to four fingers" -avitaintrig's YouTube description

Nov 18, 2009

Times are changing faster than we can change the buzzwords that convey this change. Social Media. Spreadability. Immersive Journalism.

Henry Jenkins, the Provost's Professor of Communication, Journalism, and Cinematic Arts at the University of Southern California, recently moderated a panel on the topic of social and technological innovations in social media. If you are in the mood for reflection, the videos below of the panel presentations are worth a look. Topics covered include on-line social networks, 3D virtual worlds, immersive journalism, social computing research, "stickiness moving to spreadable", and more.

If you are in a rush, the following article provides an overview of the panel discussions, along with key quotes from the various participants:

Session 1- Video Social Media: Platform or Provocation for Innovation?

Session 2 - Implications of Social Media for Business, Learning and Institutional Development

Description from the USC Annenberg YouTube Channel:
Nov. 5, 2009: "Implications of Social Media for Business, Learning and Institutional Development"

"As part of the week-long visit by and dialogue with Annenberg Innovator in Residence Dr. Irving Wladawsky-Berger, Dean Ernest J. Wilson III hosts a half-day conference titled "Social Media: Platform or Provocation for Innovation?" In this panel, "Implications of Social Media for Business, Learning and Institutional Development" experts from USC Annenberg and IBM will explore recent innovations and future trends in the social media space as well as industry responses to these developments. The rate of innovation in social media has been staggering in recent years. The result is a substantially different media landscape than one confronted by media organizations even five years ago. The conversation will focus on both the demands of the new media marketplace and the barriers that organizations are likely to face in attempting to meet these demands. In addition to Wladawsky-Berger, panelists include USC Annenberg faculty members Henry Jenkins, Jonathan Taplin, Dmitri Williams, Marc Cooper, executive in residence David Westphal and research fellow Nonny de la Peña. They will be joined by IBMs Steve Canepa, general manager for media and entertainment and Julia Grace, software engineer and Melissa Cefkin, ethnographer and research scientist."

Nov 17, 2009

HP Announces HP TouchSmart Software Development
"HP today announced the HP TouchSmart software development programs that allow software developers to create consumer and commercial applications for HP TouchSmart PCs and touch-enabled digital signage displays. The programs are designed to significantly increase the utility of TouchSmart products for consumers and businesses."

“HP TouchSmart development programs allow developers to uncover new market opportunities while users can discover a whole new world of possibilities on their touch products,” said James Taylor, director, Experience Marketing, Personal Systems Group, HP. “HP’s unique multitouch user interface, combined with native applications, provide the most advanced software platform for touch-enabled PCs and digital signage.”

HP Interactive Solutions ISV Partner Program Overview
"The HP Interactive Solutions ISV (Independent Software Vendor) Partner Program allows ISVs to register with HP to access technical resources and support for building tailor-made business solutions for business TouchSmart PCs and touch-enabled and non-touch digital signage displays. In partnering with ISVs, HP provides its business customers more choices and provides a more complete solution for their needs."

Note:

I have an HP TouchSmart - one of the reasons I bought it is that the touch-screen component was made by NextWindow. NextWindow was the company responsible for the large touch-screen display I used for a couple of projects when I was taking HCI and Ubicomp classes during the first part of 2007. It had the best resolution and touch response of all of the displays I could get my hands on at the time.

NextWindow screens can be found in new computers and all-in-ones. These include the Dell Studio One 19, the Dell SX2210T, the HP TouchSmart 300, 600, & 9100, the Sony L Series, the Medion X9613, NEC's ValueStar W All-in-One, and the Lenovo A70z All-in-One PC. More information regarding NextWindow can be found on the company's press release page.

For readers who are interested in digging deeper into a topic, I often post video presentations, slides, links to publicly available scholarly articles, references, related news articles, blog posts, and websites.