Note: As of April 18, 2008, this code no longer works due to API changes in VE3D. We will keep this article up on Coding4Fun.

Introduction

Virtual Earth is the 3D interface to Microsoft's Live Maps service. Normally this control is loaded via the web browser and allows interaction with a keyboard, mouse, and Xbox 360 controller. In this article, we will take the Virtual Earth control out
of the web browser, use it in a WinForms application, and control it with a Nintendo Wii Remote (Wiimote). Note that use of the Virtual Earth 3D control in this way is undocumented and unsupported at the moment. Because of this, some of the descriptions
in this article are educated guesses and may not be 100% accurate.

Setup

Before we get started, you will need to install the Virtual Earth 3D control. If you haven't done this already, browse to
https://maps.live.com/ and click on the Install 3D link to install the control and supporting software.

Additionally, if you haven't already, please review my
Managed Library for Nintendo's Wiimote article on this site. We will be using the library in this article, but I will not repeat the basic information that is located in the original article. Note that this application uses a newer version of the Wiimote
library which is not yet uploaded. It will be available in a few days.

Implementation

The Virtual Earth 3D Control

The Virtual Earth 3D (VE3D) control is intended to be used through a well-documented JavaScript interface from a web page; however, the Wiimote cannot be accessed from a web browser. Therefore, we will be using the VE3D control through its native,
but wholly undocumented, interface. Note that on 10/15/07 a new version of VE3D was released that changed the API drastically from the previous version. This article reflects the newer version.

Start by creating a new Windows Forms application named WiiEarth in C# or VB. As with all controls and third-party libraries, a reference needs to be set to the Virtual Earth 3D libraries. Unfortunately, this cannot be done from the Visual Studio IDE because
of the way the control is installed to the Global Assembly Cache. So, to set a reference to the necessary assemblies, ensure the project is not open in Visual Studio and open the .csproj/.vbproj file in Notepad. Add the following XML to the
<ItemGroup> which contains the base references, such as System and System.Data:
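The exact assembly list depends on the VE3D version installed on your machine; the entry below is only a sketch of the shape of a GAC reference (the assembly name and version shown are assumptions — copy the exact values from the assemblies in your own Global Assembly Cache):

```xml
<!-- illustrative only: confirm the assembly name and version against the
     copies installed in your Global Assembly Cache -->
<Reference Include="Microsoft.MapPoint.Rendering3D, Version=2.0.0.0, Culture=neutral, processorArchitecture=MSIL">
  <SpecificVersion>False</SpecificVersion>
  <Private>False</Private>
</Reference>
```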

With the references in place, the project can now be reopened in Visual Studio and the references will appear in the References folder in the Solution Explorer as usual.

Creating an instance of the control can be done in code just like any other control. Used in the constructor or load event of the form, the following code will create a VE3D control and add it to the form as fully docked:

C#

// the Virtual Earth 3D control
private GlobeControl globeControl;

private void MainForm_Load(object sender, EventArgs e)
{
    // create a new instance of the VE control
    this.globeControl = new GlobeControl();

    // dock it to fill the form and add it
    this.globeControl.Dock = DockStyle.Fill;
    this.Controls.Add(this.globeControl);
}

This sets up the VE3D control in its default state. If you were to run an application with only this code, you would see nothing but the earth. The navigation controls and other extras would be missing. If you wish to add the default navigation controls
to the screen, the PlugInLoader object is used. The PlugInLoader is created by using the
CreateLoader static method, passing in an instance of the GlobeControl's
Host object. Then, the NavigationPlugIn can be loaded and activated as shown:
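Because the interface is undocumented, the exact member names may differ between VE3D versions; a sketch of that sequence (the LoadPlugIn and Activate member names are assumptions) might look like this in C#:

```csharp
// create a plug-in loader bound to this control's host (undocumented API;
// member names below are best guesses and may differ in your version)
PlugInLoader loader = PlugInLoader.CreateLoader(this.globeControl.Host);

// load the built-in navigation plug-in and switch it on
PlugIn navigation = loader.LoadPlugIn(typeof(NavigationPlugIn));
navigation.Activate();
```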

The last thing to be added for basic functionality is the data. As it stands, the only data that will appear on the globe is the image of the continents. Zooming in only produces a blurry representation of that base image.

Data layers are created from specially formatted data sources provided by local.live.com known as content manifests. These are XML files which tell the VE3D control how to load the data required for any view. The following helper method can be used to
easily load data layers:

    ' add it to the globe
    Me.globeControl.Host.DataSources.Add(layer)
End Sub

By passing the URL of the content manifest, a name for the layer, and what the manifest represents, a new
DataSource is created, which is in turn used to create a DataSourceLayerData object which is then given to the VE3D control to consume.
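Putting those pieces together, the helper might look like the following C# sketch. The DataSource and DataSourceLayerData constructor signatures here are assumptions based on the description above; check them against the actual assemblies:

```csharp
// hypothetical helper: loads a content manifest and adds it as a data layer
private void AddDataLayer(string url, string name, DataSourceUsage usage)
{
    // create a data source from the content manifest URL
    DataSource source = new DataSource(name, new Uri(url), usage);

    // wrap the data source in a layer object the control can consume
    DataSourceLayerData layer = new DataSourceLayerData(name, name, source, usage);

    // add it to the globe
    this.globeControl.Host.DataSources.Add(layer);
}
```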

With this helper method in place, we can add any of the following layers (note that there may be other content manifests provided by local.live.com, but these are the only five that I am aware of):

If you were to run the application at this point, you would see a fully functioning Virtual Earth 3D control with proper data and navigation.

Control Scheme and Bindings

Controlling VE3D with the Wiimote will be accomplished using a control scheme that is very similar to most first person shooter (FPS) games on the Wii. The nunchuk, held in the left hand, will move the camera forward/back/left/right using the joystick.
The C and Z buttons on the front of the nunchuk will be used to raise and lower the altitude of the camera. The Wiimote, held in the right hand, will be used to change the tilt and turn of the camera. The buttons on the Wiimote will also be hooked up to
several VE3D functions. Home will center the map to an overhead view at the current camera position. The 1 button will toggle through the road layers. The 2 button will toggle the overlaid UI off and on.

VE3D bindings allow you to change or create new control schemes for VE3D. Open your
%APPDATA%\Microsoft\Virtual Earth 3D directory. On Windows XP, %APPDATA% should resolve to
\Documents and Settings\<user>\Application Data. On Windows Vista, it should resolve to
\Users\<user>\AppData\Roaming. In this directory you will find a Bindings.xml
file. This file defines the default keyboard, mouse, gamepad, and other input device bindings. Open the file to see the schema used to define events and parameters.

By default, VE3D will load any file named Bindings*.xml from this directory. For the Wiimote control scheme, create a new file named
BindingsWiimote.xml in this directory. Set the contents of the file to the following:

<!-- Nunchuk joystick with modifier since we can move and turn at the same time -->
<Bind Event="Wiimote.B+Wiimote.NunchukX" Action="Strafe" Factor="1"/>
<Bind Event="Wiimote.B+Wiimote.NunchukY" Action="Move" Factor="1"/>

<!-- Nunchuk joystick with modifier since we can move and change altitude -->
<Bind Event="Wiimote.B+Wiimote.NunchukC" Action="Ascend" Factor="0.3"/>
<Bind Event="Wiimote.B+Wiimote.NunchukZ" Action="Ascend" Factor="-0.3"/>

The <BindingSet> tags wrap groups of control bindings. It requires a
Name and optionally a Cursor. If the binding set is to be used automatically, as it would be in most cases, set the
AutoUse parameter to True. Inside of that are <Bind> tags. Each tag requires the
Event and Action parameters and optionally the Factor parameter. The
Event parameter will be used to match the binding to its handler which will be written later. The syntax is <Handler Name>.<Event Name>. The
Action parameter is used to map the specific binding to a particular method. The
Factor parameter is optional and can be used to scale the data value up or down to increase or decrease sensitivity of the input method. Once the handler is written, these will make more sense.
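For reference, a minimal binding set using the elements just described would be shaped like this (the set name is illustrative; the bindings are taken from the snippets above):

```xml
<BindingSet Name="WiimoteControls" AutoUse="True">
  <Bind Event="Wiimote.B+Wiimote.NunchukX" Action="Strafe" Factor="1"/>
  <Bind Event="Wiimote.B+Wiimote.NunchukY" Action="Move" Factor="1"/>
</BindingSet>
```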

The bindings above create the control scheme described earlier: NunchukX/Y describe what happens when the analog joystick is moved, NunchukC/Z describe what happens when the C/Z buttons are pressed, and so on.

The bindings also allow for several variations. Bindings are defined for both the IR position (IRX,
IRY) and accelerometer values (AX, AY). If an IR sensor bar is not available, the accelerometer values of the Wiimote can be used instead. Additionally, keyboard bindings are created in the style of a first person shooter using WASD.
These can be used if a Nunchuk is not available.

Note that some bindings append two Events together with a + sign. This allows for button combinations. In this case, for the accelerometer and/or IR sensor, we only want to register the action if a button is pressed down. So, those events which
require the button to be held down contain Wiimote.B+ and the event it is combined with. Also note that the combination events override other events that don't have a combination listed. So, for example, NunchukC is listed as working alone and with
the Wiimote.B event.

For those events which require a custom action that will be written separately and not part of the VE3D control, the
Action parameter must contain the action name, followed by a comma, and then the full assembly name:
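For example, assuming the project's output assembly is named WiiEarth (matching the project created above), a custom-action binding would look like this; the event name shown is illustrative:

```xml
<!-- "WiiEarth" after the comma is the assembly containing the custom action -->
<Bind Event="Wiimote.One" Action="ToggleRoads, WiiEarth"/>
```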

You can change any of these button bindings simply by changing this XML file and deploying to the directory above. So, for example, if you wanted the trigger button to be A, you would just change
Wiimote.B to Wiimote.A in the above lines and re-deploy the bindings file.

The XML file also binds several keyboard keys in a first-person shooter style layout in the event the user does not have a nunchuk for the left hand functions.

Event Source

An EventSource is needed which will grab data from the Wiimote and pass it along to VE3D as defined by the bindings file above. Create a new class named
WiimoteEventSource which derives from Microsoft.MapPoint.Binding.EventSource as follows:
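A minimal skeleton for the class might look like this in C# (the field names follow those used in the snippets later in this article):

```csharp
public class WiimoteEventSource : Microsoft.MapPoint.Binding.EventSource
{
    // the Wiimote device from the managed Wiimote library
    private Wiimote _wm = new Wiimote();

    // reference back to the main form for UI updates
    private MainForm _form;

    // constructor and overrides are shown below
}
```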

Next, add an enumeration named WiimoteEvent (the name isn't important) which contains an entry for each event name used in the bindings XML file above (the portion after the Wiimote. prefix). It should look like this:
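Inferred from the events used throughout this article, the enumeration would contain entries along these lines; match it exactly to the events in your own bindings file:

```csharp
// one entry per event name in BindingsWiimote.xml; the integer values are
// used as event IDs when activating events via Execute
private enum WiimoteEvent : int
{
    IRX, IRY,            // IR pointer position
    AX, AY,              // accelerometer values
    NunchukX, NunchukY,  // nunchuk joystick
    NunchukC, NunchukZ,  // nunchuk buttons
    B, Home, One, Two    // Wiimote buttons
}
```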

' can the event be used as a modifier?
Public Overrides Function IsModifier(ByVal eventId As Integer) As Boolean
    ' yes to all for now
    Return True
End Function

' can the supplied event be used as a modifier?
Public Overrides Function CanModify(ByVal eventId As Integer, ByVal other As EventKey) As Boolean
    ' only if it's from us
    Return (other.Source Is Me)
End Function

' this must match the Source name in the bindings XML file
Public Overrides ReadOnly Property Name() As String
    Get
        Return "Wiimote"
    End Get
End Property

With that in place, the constructor can be implemented which will call the base constructor and connect to the Wiimote. It is assumed you read the Wiimote article above and know how the library works.

The constructor takes two arguments: an instance of the GlobeControl's ActionSystem, passed from the main form, and a reference to the form itself. The
ActionSystem just gets passed directly to the parent object's constructor untouched. The constructor code looks like the following:

C#

public WiimoteEventSource(ActionSystem actionSystem, MainForm form) : base(actionSystem)
{
    // store away an instance of the main form
    _form = form;

    ' if we don't have an extension, set the report type to IR and accel's only
    If (Not _wm.WiimoteState.Extension) Then
        _wm.SetReportType(Wiimote.InputReport.IRAccel, True)
    End If

    ' turn off all LEDs
    _wm.SetLEDs(&H00)
End Sub

The OnWiimoteExtensionChanged method simply sets the report mode for the Wiimote based on whether or not a Nunchuk is inserted as shown:

C#

private void OnWiimoteExtensionChanged(object sender, WiimoteExtensionChangedEventArgs args)
{
    // if nunchuk inserted, set the report type to return extension data
    if(args.ExtensionType == ExtensionType.Nunchuk && args.Inserted)
        _wm.SetReportType(Wiimote.InputReport.IRExtensionAccel, true);
    else
        // in all other cases, set it to the default IR and accel's
        _wm.SetReportType(Wiimote.InputReport.IRAccel, true);
}

VB

Private Sub OnWiimoteExtensionChanged(ByVal sender As Object, ByVal args As WiimoteExtensionChangedEventArgs)
    ' if nunchuk inserted, set the report type to return extension data
    If args.ExtensionType = ExtensionType.Nunchuk AndAlso args.Inserted Then
        _wm.SetReportType(Wiimote.InputReport.IRExtensionAccel, True)
    Else
        ' in all other cases, set it to the default IR and accel's
        _wm.SetReportType(Wiimote.InputReport.IRAccel, True)
    End If
End Sub

The OnWiimoteChanged event handler is where the Wiimote data is handled and sent off to the VE3D control to reflect the changes. First, let's handle the IR and accelerometer data. The IR midpoint of the X and Y axes will be used from the
WiimoteState object to activate the IRX and IRY events we defined above in the bindings XML file. The accelerometer X and Y values will be used to activate the
AX and AY events.

This snippet assumes that there is a boolean property named UseIR created in the project to determine whether IR or motion values are used. Additionally, it assumes there are property settings created which contain values for the X/Y "dead zones"
for the IR and accelerometers. These dead zones are used as a way to only activate the event when the values are pushed beyond the thresholds. This allows there to be a margin where the user's hand will not be read as movement, allowing the user to not have
to worry about keeping a steady hand.

The application linked above uses the following values for dead zones:

        // save the last IR settings...these get used if we go beyond the range of the IRs.
        // in that case, the last used positions will be used until the Wiimote comes back in range
        this._lastIRX = x;
        this._lastIRY = y;
    }
    else // one or both LEDs aren't seen
    {
        // activate events based on the last known positions
        if(this._lastIRX > Properties.Settings.Default.IRDeadX || this._lastIRX < -Properties.Settings.Default.IRDeadX)
            this.Execute(new AxisEventData(new EventKey(this, (int)WiimoteEvent.IRX), this._lastIRX));
        if(this._lastIRY > Properties.Settings.Default.IRDeadY || this._lastIRY < -Properties.Settings.Default.IRDeadY)
            this.Execute(new AxisEventData(new EventKey(this, (int)WiimoteEvent.IRY), this._lastIRY));
    }
}
else // we're using motion controls
{
    // activate the events based on the accelerometer values
    if(ws.AccelState.X > Properties.Settings.Default.WiimoteDeadX || ws.AccelState.X < -Properties.Settings.Default.WiimoteDeadX)
        this.Execute(new AxisEventData(new EventKey(this, (int)WiimoteEvent.AX), ws.AccelState.X));
    if(ws.AccelState.Y > Properties.Settings.Default.WiimoteDeadY || ws.AccelState.Y < -Properties.Settings.Default.WiimoteDeadY)
        this.Execute(new AxisEventData(new EventKey(this, (int)WiimoteEvent.AY), ws.AccelState.Y));
}

        ' save the last IR settings...these get used if we go beyond the range of the IRs.
        ' in that case, the last used positions will be used until the Wiimote comes back in range
        Me._lastIRX = x
        Me._lastIRY = y
    Else ' one or both LEDs aren't seen
        ' activate events based on the last known positions
        If Me._lastIRX > My.Settings.Default.IRDeadX OrElse Me._lastIRX < -My.Settings.Default.IRDeadX Then
            Me.Execute(New AxisEventData(New EventKey(Me, CInt(Fix(WiimoteEvent.IRX))), Me._lastIRX))
        End If
        If Me._lastIRY > My.Settings.Default.IRDeadY OrElse Me._lastIRY < -My.Settings.Default.IRDeadY Then
            Me.Execute(New AxisEventData(New EventKey(Me, CInt(Fix(WiimoteEvent.IRY))), Me._lastIRY))
        End If
    End If
Else ' we're using motion controls
    ' activate the events based on the accelerometer values
    If ws.AccelState.X > My.Settings.Default.WiimoteDeadX OrElse ws.AccelState.X < -My.Settings.Default.WiimoteDeadX Then
        Me.Execute(New AxisEventData(New EventKey(Me, CInt(Fix(WiimoteEvent.AX))), ws.AccelState.X))
    End If
    If ws.AccelState.Y > My.Settings.Default.WiimoteDeadY OrElse ws.AccelState.Y < -My.Settings.Default.WiimoteDeadY Then
        Me.Execute(New AxisEventData(New EventKey(Me, CInt(Fix(WiimoteEvent.AY))), ws.AccelState.Y))
    End If
End If

This code looks at the appropriate values, determines if they are beyond the specified thresholds for the dead zones, and, if they are, activates the event for that value using the
Execute method. Execute is a method in the base EventSource class. This method will activate the event specified from the enumeration (which, remember, is contained in the bindings XML file) with the value associated with that event.
An EventData object of some type must be created and passed to the Execute method. There are two
EventData types to know about: AxisEventData and ButtonEventData.
AxisEventData should be used when an event is activated that will modify the map position in some way. That is, if the map is being turned, elevation is changing, etc.
ButtonEventData should be used if the event is a simple toggle like pressing a button down and releasing it.

Next, the nunchuk values need to be read and the associated events activated. This is done as follows:
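The nunchuk handling follows the same dead-zone pattern as the IR/accelerometer code above. A C# sketch (the NunchukDeadX/NunchukDeadY setting names are assumptions matching the style of the earlier snippet):

```csharp
// activate the joystick axis events when pushed past the dead zone
if(ws.NunchukState.X > Properties.Settings.Default.NunchukDeadX ||
   ws.NunchukState.X < -Properties.Settings.Default.NunchukDeadX)
    this.Execute(new AxisEventData(new EventKey(this, (int)WiimoteEvent.NunchukX), ws.NunchukState.X));

if(ws.NunchukState.Y > Properties.Settings.Default.NunchukDeadY ||
   ws.NunchukState.Y < -Properties.Settings.Default.NunchukDeadY)
    this.Execute(new AxisEventData(new EventKey(this, (int)WiimoteEvent.NunchukY), ws.NunchukState.Y));

// the C and Z buttons are simple toggles, so ButtonEventData is used
this.Execute(new ButtonEventData(new EventKey(this, (int)WiimoteEvent.NunchukC), ws.NunchukState.C));
this.Execute(new ButtonEventData(new EventKey(this, (int)WiimoteEvent.NunchukZ), ws.NunchukState.Z));
```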

Finally, the button events need to be activated. A helper method that compares the current button state to the previous one will be used to determine which of the Wiimote buttons are pressed. For those that are, the appropriate event is activated with a call
to Execute.
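One plausible shape for such a helper (hypothetical; it compares the current state to the saved previous state so each press fires exactly once):

```csharp
// hypothetical helper: activates a button event only when the button's state
// changed since the last report, so a held button does not fire repeatedly
private void CheckButton(bool current, bool last, WiimoteEvent ev)
{
    if(current != last)
        this.Execute(new ButtonEventData(new EventKey(this, (int)ev), current));
}
```

It would be called once per button, e.g. CheckButton(ws.ButtonState.Home, _lastBS.Home, WiimoteEvent.Home).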

' save off the current button state for next time
_lastBS = ws.ButtonState
_lastNunchuk = ws.NunchukState

The current button values are then stored away and checked on the next event so that button events are only fired once.

Now that the event source object is written, it needs to be hooked up to the
globeControl so it can be used. This can be done by creating an instance of the
WiimoteEventSource object, passing in the VE3D's ActionSystem from the
BindingsManager object. Then, the event source instance is passed to the
ActionSystem's EventSourceManager and registered using the RegisterEventSource method. Event sources should be registered before the control is added to the form.

C#

// wiimote events
private WiimoteEventSource _wiimoteEventSource;

...

// create a new instance of the Wiimote event handler
this._wiimoteEventSource = new WiimoteEventSource(this.globeControl.Host.BindingsManager.ActionSystem, this);

// register it in the event source list
this.globeControl.Host.BindingsManager.ActionSystem.EventSourceManager.RegisterEventSource(this._wiimoteEventSource);

VB

' wiimote events
Private _wiimoteEventSource As WiimoteEventSource

...

' create a new instance of the Wiimote event handler
Me._wiimoteEventSource = New WiimoteEventSource(Me.globeControl.Host.BindingsManager.ActionSystem, Me)

' register it in the event source list
Me.globeControl.Host.BindingsManager.ActionSystem.EventSourceManager.RegisterEventSource(Me._wiimoteEventSource)

Our binding list contains four action types that are not defined by the default VE3D actions:
ToggleRoads, Locations, LocationsMove, and ToggleUI. These actions and their handlers must be registered with the VE3D control. After the
WiimoteEventSource is registered, the four actions can be registered as follows:

C#

this.globeControl.Host.BindingsManager.RegisterAction(asmName, "ToggleRoads", new Action(this.ToggleRoadsHandler));
this.globeControl.Host.BindingsManager.RegisterAction(asmName, "Locations", new Action(this.LocationsHandler));
this.globeControl.Host.BindingsManager.RegisterAction(asmName, "LocationsMove", new Action(this.LocationsMoveHandler));
this.globeControl.Host.BindingsManager.RegisterAction(asmName, "ToggleUI", new Action(this.ToggleUIHandler));

VB

Me.globeControl.Host.BindingsManager.RegisterAction(asmName, "ToggleRoads", New Action(AddressOf Me.ToggleRoadsHandler))
Me.globeControl.Host.BindingsManager.RegisterAction(asmName, "Locations", New Action(AddressOf Me.LocationsHandler))
Me.globeControl.Host.BindingsManager.RegisterAction(asmName, "LocationsMove", New Action(AddressOf Me.LocationsMoveHandler))
Me.globeControl.Host.BindingsManager.RegisterAction(asmName, "ToggleUI", New Action(AddressOf Me.ToggleUIHandler))

With the actions registered and handlers associated with them, the actual handlers need to be implemented. All event handler methods must be of the following signature:

The implementations above simply check to see if the event is being activated and, if so, remove the current road layer and add the next one in the series. The one thing to keep in mind is that the event handler methods are called inside the rendering
thread of the VE3D control, which is not the thread the Windows Forms UI lives on. Therefore, if the form UI needs to be updated in any way, use the
BeginInvoke method and a delegate to update any UI controls.
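A minimal cross-thread update following that pattern (the control and field names here are illustrative, not from the demo source):

```csharp
// called on the VE3D render thread; marshal the UI change to the form's thread
_form.BeginInvoke(new MethodInvoker(delegate
{
    // any control updates are safe inside this delegate
    _form.Text = "UI visible: " + _showUI;
}));
```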

This code simply changes the boolean values of several UI items. The WorldEngine object contains several other UI elements. The
ShowUI property overrides all other properties and determines if anything is shown at all. This property also determines whether the globe in the lower-left corner is displayed.

Be sure to check the source code for the full demo linked above for the location handler methods. I omitted them here since it is just more of the same type of code above.

Running the Demo

Pair the Wiimote to the computer. See the WiimoteLib article for more information on how to do that.

Run the executable.

Conclusion

With the above code, we have written a Wiimote-driven interface for Virtual Earth 3D. The demo and source code linked above contain a few more features and bindings which enhance the application a bit more. Be sure to give the full demo a try and check
out the full source code for a few more implementation details.

Additional Information

Thanks

Thanks to Michelle Leavitt and Giovanni Montrone for testing the control scheme and helping to determine the best feel using the Wiimote.

Bio

Brian is a Microsoft C# MVP and a recognized .NET expert with over 6 years of experience developing .NET solutions, and over 9 years of professional experience architecting and developing solutions using Microsoft technologies and platforms, although he has
been "coding for fun" for as long as he can remember. Outside the world of .NET and business applications, Brian enjoys developing both hardware and software projects in the areas of gaming, robotics, and whatever else strikes his fancy for the next ten minutes.
He rarely passes up an opportunity to dive into a C/C++ or assembly language project. You can reach Brian via his blog at
http://www.brianpeek.com/.

The recent Virtual Earth update has broken this demo. The executable will not run, saying it cannot find certain resources. Checking the Global Assembly Cache shows the Microsoft.MapPoint assemblies have been updated from 2.0 to 2.5.

The project will no longer compile, showing errors with Microsoft.MapPoint.Binding.BindingsManager.RegisterAction, Microsoft.MapPoint.Rendering3D.Host, etc.