Surface Developer

Monday, 19 September 2011

The release of Surface 2 is getting closer! This summer Microsoft released the Surface 2.0 SDK, and with it you can easily create applications that take advantage of the next-generation Surface computing device or Windows 7 touch-enabled devices. So don't let the lack of hardware prevent you from starting those Surface 2 projects you are planning! Here is a list of resources you can use to kick-start your work and prepare for Surface 2.

Microsoft Surface 2.0 SDK and Runtime
With the Microsoft Surface 2.0 SDK, you can easily create applications that take advantage of the next-generation Surface computing device or Windows 7 touch-enabled devices.

Microsoft Surface 2 Design and Interaction Guide
The Microsoft Surface 2.0 Design and Interaction Guide helps designers and developers create Surface applications for Microsoft Surface and Windows 7 touch PCs. Developing compelling Surface experiences requires a different approach to interface design. This document presents design principles and guidelines to address key aspects of application interface design including: interaction, visual, sound, text, and more. These principles and practices are a starting point to get the most out of the Surface software and hardware platform's unique capabilities.

Microsoft Surface 2 Development Whitepaper
This paper provides an overview of the Microsoft Surface application development process. It provides detailed information about the Surface platform and unique capabilities of the hardware. Topics include the Surface 2.0 SDK, vision-based touch input, and system architecture. This development whitepaper covers the basic end-to-end process for creating great Surface applications.

Social Stream for Microsoft Surface 2.0
Social Stream for Microsoft Surface is a sample application created in collaboration between Microsoft and Stimulant, Inc. It is an interactive way for businesses to engage their customers face-to-face using the most recent and relevant Twitter, Flickr, and RSS newsfeeds.

Sunday, 9 January 2011

The new version of Sonicspree is completely rewritten from the ground up. It has a new design, new architecture and new frameworks, but the same engaging and social gameplay. The goal of the game is to find the album cover matching the song currently playing before one of your competitors does. The faster you do it, the more points you get. If you guess wrong, you lose some of your points.

One of the biggest challenges with the first version of Sonicspree was how to handle music and album covers when installing the application on different units. That version used local mp3 files with embedded graphics. This time we wanted a more flexible solution, so we turned to the Swedish music streaming service Spotify. So this version of Sonicspree is powered by music from Spotify.

Sonicspree was built using Visual Studio 2010 and Blend 4 in a tight collaboration between developers, user interface designers and interaction designers. To make a clear separation in the developer/designer workflow we have used the Model-View-ViewModel design pattern and the MVVM Light Toolkit.
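To give a feel for what that separation looks like in practice, here is a minimal sketch of a view model in the MVVM Light style. The class and member names are hypothetical, invented for illustration; only `ViewModelBase`, `RelayCommand<T>` and `RaisePropertyChanged` come from the toolkit itself.

```csharp
// Hypothetical sketch of an MVVM Light view model for a game round.
// The view binds to Score and GuessCommand in XAML; no game logic
// lives in the code-behind, which keeps designers and developers apart.
using GalaSoft.MvvmLight;
using GalaSoft.MvvmLight.Command;

public class GameRoundViewModel : ViewModelBase
{
    private int _score;
    public int Score
    {
        get { return _score; }
        set { _score = value; RaisePropertyChanged("Score"); }
    }

    // Fired from the view when a player drops a cover in the nest.
    public RelayCommand<string> GuessCommand { get; private set; }

    public GameRoundViewModel()
    {
        GuessCommand = new RelayCommand<string>(albumId =>
        {
            // Evaluate the guess and update Score here.
        });
    }
}
```

With this shape, a designer can restyle the nest in Blend against sample data while the developer changes the guessing logic, and neither blocks the other.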

One game in Sonicspree now consists of several game rounds, and each game round consists of five songs. The dice still play an important role in Sonicspree and can be rolled before each game round to decide which genres to use.

When a song starts playing, the players drag the hidden album covers to the center of the table to reveal them. When someone finds the correct cover, they drag it home to their own nest to make a guess. If it is correct, a new song starts playing; if it is wrong, the nest shakes and spits out the cover.

Saturday, 8 January 2011

This week at CES in Las Vegas, Microsoft showed the next version of Surface to the public for the first time. It was at the Ballmer keynote that the world got a first glimpse of what's to come. For the first version of Microsoft Surface the hardware was built by Microsoft itself, but this time they have teamed up with Samsung to create the "Samsung SUR40 for Microsoft Surface", which is the official name. The new unit is pretty much everything you wanted and asked for in an upgrade, plus some extra! It is a 40-inch 1920x1080 display covered with Gorilla Glass (the same kind of glass used in many smartphones). The first version of Surface had a glass with a matte finish, which is very different from the feel we will now experience.

The new version is only four inches thick and uses a completely new technology called PixelSense. With PixelSense, every pixel acts like a "camera" and can supply information about what is happening on the glass. All this information is processed at 60 frames per second. Since it is only four inches thick, the SUR40 can also be wall mounted, and the SDK will notice what angle the unit is at and can respond accordingly. It will only detect a change in angle in landscape mode, not in portrait mode. An application can change its appearance based on how much the monitor is tilted. As a developer you will not get an event when the tilt changes, but that is understandable since it still is a pretty large piece of hardware!

There has been a TAP program running for the new Surface since May 2010. Connecta has once again teamed up with Ergonomidesign to develop an application on the new hardware and with the new SDK. One of the biggest challenges we have had is the limited access to hardware. The first time we got to see it was in November, on site at Microsoft in Redmond, and we are still waiting to get continuous access to hardware. So we have basically developed everything using only the new SDK and sent bits to the Surface team in Redmond for testing. The reports we have gotten back have been positive, and that really is a good rating for the SDK, i.e. it is possible to develop great apps without having access to hardware. But to get it perfect you still need hardware, because when several people use the app at the same time you will notice things you never see in a simulated environment.

Speaking of simulation, the old simulator from v1 is gone. Please welcome the Microsoft Surface Input Simulator!

With the Microsoft Surface Input Simulator, the application runs full screen on your monitor and input is simulated to let you as a developer work with fingers, blobs and tags, similar to how it was done in v1. The input simulator can also simulate monitor tilt in 360 degrees. One limitation so far is that your screen resolution has to be 1920x1080 to get the exact same view as on real hardware, but that high resolution is not a requirement to start your app.

The application that we have created for Samsung SUR40 for Microsoft Surface is a new version of the popular music game Sonicspree. We will write more about that in coming posts, but here are the new logo and a screenshot.

Please follow #Sonicspree on Twitter for continuous news about where you will be able to experience Sonicspree and Samsung SUR40 for Microsoft Surface during the coming months.

Friday, 6 August 2010

Monday, 12 April 2010

Today is a big day for a Microsoft developer like myself. Not only is Visual Studio 2010 released, but we also see a long-awaited sign of life from the Surface Team! Today, for Surface Partners only, Microsoft releases the Microsoft Surface Toolkit for Windows Touch Beta. This was first announced at PDC and is said to contain Microsoft Surface controls, templates, and samples that make it easy to create applications optimized for multi-touch interaction that run on Windows Touch PCs.

The Microsoft Surface Toolkit for Windows Touch Beta is a set of controls, APIs, templates, sample applications and documentation currently available to Surface developers. With the .NET Framework 4.0, Windows Presentation Foundation 4.0, and this toolkit, Windows Touch developers can quickly and consistently create advanced multi-touch experiences for Windows Touch PCs. One really interesting part is that this toolkit is supposed to provide a jump-start for Surface application developers preparing for the next version of Microsoft Surface.
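As a taste of what the toolkit's controls look like in use, here is a minimal XAML sketch of a ScatterView hosting a freely draggable, rotatable item. The `AlbumArt.png` resource is a made-up placeholder, and the `s:` namespace mapping is the one used in the Surface SDK; treat this as a sketch rather than toolkit-exact markup.

```xml
<!-- Sketch: a ScatterView with one draggable, rotatable item.
     The image resource name is hypothetical. -->
<Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:s="http://schemas.microsoft.com/surface/2008">
    <s:ScatterView>
        <s:ScatterViewItem>
            <Image Source="AlbumArt.png" Width="200" />
        </s:ScatterViewItem>
    </s:ScatterView>
</Window>
```

The nice part is that the ScatterViewItem gives you move, rotate and scale manipulations for free; you only supply the content inside it.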

What the next version of Surface is, or when it arrives, is still hidden from the masses, but boy, do I look forward to it!

I will be back with more after I have downloaded and played with the toolkit!

Tuesday, 2 March 2010

At the only Surface session at PDC09, Robert Levy said that they are looking at ways to make Surface smaller, cheaper and vertical. This is of course very interesting, especially to see how they have solved one of the key features of Surface - interaction with physical objects.

This is now starting to bear fruit, and rumors are starting to surface that something will appear at TechFest 2010. TechFest is an internal annual event that brings researchers from Microsoft Research's locations around the world to Redmond to share their latest work.

Thursday, 11 February 2010

Recently there has been some talk about OCGM and its impact on NUI. OCGM is a design philosophy proposed by Ron George and is supposed to be to NUI what WIMP is to GUI. OCGM is pronounced Occam, as in Occam's Razor, and it is an abbreviation for:

Objects – “Objects are the core of the experience. They can have a direct correlation with something physical, or they can just be objects in the interface.”

Containers – “Containers will be the “grouping” of the objects. This can manifest itself in whatever the system sees fit to better organize or instruct the user on interactions. They do not have to be, nor should they be, windows. They can be any sort of method of presentation or relationship gathering as seen fit.”

Gestures – “Gestures are actions performed by the user that initiate a function after its completion and recognition by the system. This is an indirect action on the system because it needs to be completed before the system will react to it.”

Manipulations – “Manipulations are the direct influences on an object or a container by the user. These are immediate and responsive. They are generally intuitive and mimic the physical world in some manner. The results are expected and should be non-destructive. These are easily performed and accidental activations should be expected and frequent.”

I’ve cited Ron from his first post about OCGM. I recommend you read the post and I also recommend reading this paper about OCGM by Ron George and Joshua Blake.

To understand OCGM further, I would like to do a little retrospective on one of my previous Microsoft Surface applications and see how the application fits into OCGM (or should it be the other way around?).

My first Surface project was SonicSpree. To summarize the application: SonicSpree is a game of guessing songs, where the player's goal is to match the song currently playing with its corresponding album art. The actual game element is to find the correct album art and then drag it into the player's nest / home. A simple idea. Finding the album art, though, is like playing Memory. From the start, all album art cards are face down, but a card can be flipped by dragging it into the center. When it is face up, the player can make a guess by dragging the album art card into his or her home to receive a point.

If we start by identifying what kinds of objects are used in SonicSpree, the most obvious one is of course the album art card the users actually interact with to play the game. The other kind of objects used in SonicSpree are the physical dice. A new game round can be started by throwing the dice onto the Surface.

As for the containers, SonicSpree uses two of them: the player's nest and "the edge", as we have called it during development. The nest holds the correct album art the player has collected, and the mysterious edge is the container holding all album art cards that are not currently being interacted with by the players. As you can see, the containers don't resemble each other, but they both help organize the same kind of objects.

Continuing with the manipulations used in SonicSpree, this is where it gets a bit interesting. First, moving the album art cards. This is probably the most basic manipulation in table-based multi-touch NUI, especially on Microsoft Surface, as ScatterView is a very easy and basic control to use. The next part I am not sure about: whether the events count as several manipulations, or whether the entire sequence of events counts as a gesture. What I am referring to is the throwing and removal of the physical dice. First, you as a player throw the dice to randomly select music, and secondly you remove the dice to start the game. These are actually the two most natural manipulations you can make with a pair of dice. But on the other hand, the whole sequence of events (throwing and removing the dice) can be seen as a gesture, as it starts a new game round on completion. Or can it actually be both?

Talking about gestures, I think I can define two more gestures in the game. First, moving an album art card from the edge to the middle of the screen (illustrated as the circle in the picture above) to flip the card and actually see the album art. Secondly, moving a flipped card into a player's nest to make a guess.

I will end my retrospective here, and I think SonicSpree adopted the OCGM philosophy quite well; perhaps it was thanks to the UX and design people of Ergonomidesign? I think OCGM can give us NUI developers the language and abstraction to create NUI applications, not only multi-touch ones. Maybe in time we will see a more specific design philosophy, like WIMP, for multi-touch NUI, but I like OCGM.