I don't think I can ever go back to 2008/2010. That one single button makes sifting through a 48-project solution (which Maestro currently is) an absolute breeze! Navigating through 48 projects worth of source code in previous versions of Visual Studio was a complete chore. In VS 2012 it's dead simple.

As an aside, the Windows Desktop Express edition of VS 2012 is quite an impressive piece of free (as in beer) software. I've been hacking on Maestro trunk with VS 2012 Express ever since it was released and I've had some of the most productive coding sessions with it.

I was expecting a bare-bones front-end to csc/vbc/cl.exe like previous Express editions with all sorts of SKU-related hurdles to overcome, but this is something out of the ordinary:

This was stuff I was expecting from a Professional-level SKU in previous versions of Visual Studio! I don't really care about MSTest (I use NUnit/xUnit externally), nor do I care about TFS integration (I use svn/git/mercurial externally). Actually, I don't care much about whatever integration Microsoft is marketing for their higher-level SKUs, as I'm happy to get at such functionality through external tools. Refactoring support is very basic, but still workable (ReSharper is nice, but I don't need it like a crutch in order to crank out good code).

So what Microsoft has offered in their 2012 Express editions actually meets most of my needs, which is especially surprising in light of their initial snub to desktop application developers (you know ... the people who make their platform). In their efforts to appease the many pissed-off developers (like me), they punched well above their weight and did much more than I would've personally expected them to do.

So kudos to Microsoft for delivering a fine set of free developer tools.

But if you think this is an olive branch manoeuvre that will get me to code for Windows 8? Dream on! :P

* Found a little hitch in the round-tripping: it breaks if you add a new project from VS 2012, as this upgrades the solution file to the VS 2012 format. But the actual changes are minute and can be safely reverted (you just need to revert the header portion of the solution file) to allow that solution file to be opened back in VS 2010.
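If you hit this, the revert is mechanical enough to script. A minimal sketch, assuming the standard header strings (only the format-version lines differ between the two solution formats):

```python
def downgrade_sln_header(sln_text):
    """Revert a VS 2012 solution file header back to the VS 2010 format.

    Only the two header lines change; all the project entries below
    them are left untouched.
    """
    return (sln_text
            .replace("Microsoft Visual Studio Solution File, Format Version 12.00",
                     "Microsoft Visual Studio Solution File, Format Version 11.00")
            .replace("# Visual Studio 2012", "# Visual Studio 2010"))

vs2012_header = ("Microsoft Visual Studio Solution File, Format Version 12.00\n"
                 "# Visual Studio 2012\n")
reverted = downgrade_sln_header(vs2012_header + "Project entries unchanged...")
```

Running this over the upgraded .sln restores a file that VS 2010 will open again.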

If you look at Maestro from a high-level view, it shares many qualities with an IDE (minus the bits that edit and compile source code). As a matter of fact, Maestro re-uses a lot of code from the premier open source .net IDE: SharpDevelop.

In fact, I would imagine that if you stripped all the MapGuide authoring stuff out of Maestro and replaced it with source code editing and build tools, you would end up back with SharpDevelop itself. The design/architecture is (intentionally) that similar!

Of particular focus in this post is Maestro's use of the text editor component from SharpDevelop.

Currently this text editor is only used by our generic XML editor, giving us nice syntax highlighting and other things you would expect from an industrial-strength text editor. The problem, however, is that there are many other areas in Maestro that could benefit from this text editor. In addition, there are many IDE-isms present in SharpDevelop that would be of great use in the context of Maestro.

So for the next beta we are going all the way and using this component wherever we possibly can, due to its powerful syntax highlighting, auto-completion and other assorted features you expect from an industrial-strength text editing component.

IronPython Console

We've thrown out the existing IronPython REPL console and replaced it with one grafted from SharpDevelop's python addin.

We lose some of the built-in commands from the old one, but because we're using the SharpDevelop text editor, we now have things like Python syntax highlighting and, more importantly, auto-complete.

And we get nice, colorful feedback about any errors that IronPython gives us.

Also, the design of this REPL means that Python snippets are executed on a background thread, ensuring the main UI is not clogged up. However, if you are executing Python snippets that create or interact with Windows Forms controls, that code has to run on the UI thread, so we've provided a UIInvoke function in the host application to let you do this.
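Conceptually, UIInvoke just marshals a callable from the worker thread over to the UI thread. A minimal sketch of that pattern, using a plain queue in place of the Windows Forms message loop (UIInvoke itself is provided by the host application; everything here is an illustration, not Maestro's actual implementation):

```python
import queue
import threading

# Callables posted here are drained by the "UI thread" - the same idea
# behind the UIInvoke function exposed by the host application.
ui_queue = queue.Queue()

def ui_invoke(action):
    """Marshal a callable onto the UI thread instead of running it here."""
    ui_queue.put(action)

results = []

def background_snippet():
    # Runs on a worker thread; anything touching UI state goes via ui_invoke.
    ui_invoke(lambda: results.append("label updated"))

worker = threading.Thread(target=background_snippet)
worker.start()
worker.join()

# The "UI thread" (here, the main thread) drains the queue and runs the actions.
while not ui_queue.empty():
    ui_queue.get()()
```

The real thing dispatches through the Windows Forms synchronization context, but the contract is the same: the snippet never touches UI controls directly from the background thread.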

Personally, I'm loving this new REPL. It's a wonderful way not only to explore the Maestro API from within Maestro itself, but also a convenient way for me to test new functionality and/or newly implemented APIs.

Generic XML Editor
Generic XML Editor
The generic XML editor has been retro-fitted with the SharpDevelop text editor component since beta 2; for the next beta, it also supports code folding (grafted from SharpDevelop's XML addin). This allows you to collapse vast swaths of XML content to make editing and navigation that much simpler.

Expression Editor
Expression Editor
The Expression Editor basically did what the SharpDevelop text component does, but much more crudely. So we've replaced the text component in the Expression Editor as well. We now get proper auto-complete behavior not just for properties and functions, but also for operators (LIKE, IN, etc.).

One small caveat with the current implementation is that auto-complete is case-sensitive and will only trigger on a partial case-sensitive match against a potential auto-complete result. We'll look to make this case-insensitive eventually.
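The intended behaviour is straightforward prefix matching. A small sketch contrasting the current case-sensitive trigger with the planned case-insensitive one (the candidate names here are hypothetical stand-ins, not the editor's actual completion source):

```python
def complete(prefix, candidates, ignore_case=True):
    """Return the candidates that start with the given prefix.

    With ignore_case=False this mirrors the current behaviour: completion
    only triggers on a case-sensitive prefix match.
    """
    if ignore_case:
        lowered = prefix.lower()
        return [c for c in candidates if c.lower().startswith(lowered)]
    return [c for c in candidates if c.startswith(prefix)]

# Hypothetical completion candidates (property names, functions, operators).
candidates = ["RLANDVC", "Concat", "LIKE"]
```

Typing "rland" would match RLANDVC under the planned behaviour, but matches nothing under the current case-sensitive rule.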

The next beta that packs these new features (among other things) will hopefully be done by the end of this week (no guarantees).

All these files are combined into fusionSF.js. fusionSF-compressed.js is this fusionSF.js file run through the YUI compressor to strip out whitespace and other extraneous characters to give us a minified, production-ready version of fusion.

Why do we do this? Why don't the templates just link fusion.js and demand load the remaining scripts?

Download size. Minified scripts (combined with web server compression) result in the smallest possible amount of data that needs to travel over the wire. Compare a 515KB download of fusionSF-compressed.js vs a 1MB download of fusion.js and all of its supporting scripts.

Reduced number of HTTP requests. Doing a single request for a minified and compressed version of fusion beats doing 30-odd requests for fusion.js and all of its supporting scripts, especially once you multiply by the number of concurrent users and factor in network latency.
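The build step itself is conceptually simple: concatenate, then minify. A toy sketch of the idea (the real pipeline uses the YUI compressor, which does proper JS-aware minification; the script contents here are hypothetical):

```python
import re

def combine_and_minify(sources):
    """Concatenate script sources, then crudely strip comments and
    collapse whitespace - a toy stand-in for what the YUI compressor
    does (the real tool is far more sophisticated than this).
    """
    combined = "\n".join(sources)
    # Naive: drops // comments and squeezes whitespace. A real minifier
    # parses the JS so it never mangles string literals.
    no_comments = re.sub(r"//[^\n]*", "", combined)
    minified = re.sub(r"\s+", " ", no_comments).strip()
    return combined, minified

# Hypothetical stand-ins for fusion.js and one of its supporting scripts.
scripts = [
    "// fusion core\nfunction init() {  return  1; }",
    "// a widget script\nfunction widget() { return 2; }",
]
combined, minified = combine_and_minify(scripts)
```

One combined, minified file instead of many raw ones: that is the whole point of fusionSF-compressed.js.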

So it's clear that fusionSF-compressed.js is a good thing. So what's the problem?

What if we want to modify fusion? Suppose we want to roll in some enhancements or fixes for fusion. What files would we actually edit?

fusionSF-compressed.js?

fusionSF.js?

fusion.js and/or its supporting js files?

I hope you all answered 3. If you answered 1, just take a look at fusionSF-compressed.js.

If the gibberish nature of the file contents didn't give it away: you're not supposed to edit this file. It is effectively the "compiled" version of fusion. For those who answered 2, yes, the file in question is no longer gibberish, but it is nothing more than an intermediate file that gets turned into the final fusionSF-compressed.js. Even if your templates were linked against fusionSF.js, you would still be serving out an un-minified version of fusion. Remember why the templates are linked to fusionSF-compressed.js by default: minimal requests and minimal download size.

You wouldn't edit the raw bits of a program's executable by hand, would you? No. You want to edit the source code and then re-compile a new executable, which is the same thing you want to be doing with fusion. So the correct answer is indeed 3. You want to edit fusion.js and/or its supporting js files with your enhancements and/or fixes (temporarily switching your template to use fusion.js to identify bugs, fixing the appropriate files, and testing your changes) and, when everything's all good, "compile" a new fusionSF-compressed.js that incorporates your changes.

So how would you go about compiling such changes? Well, this is why this post is long overdue: before the 2.4 release of MapGuide Open Source, there was no simple way to re-compile a new compressed version of fusion on a production installation. You would've had to:

Switch over to a development checkout of fusion, or have one running side-by-side with your production fusion.

Try to reproduce the problem there, or to test your enhancements there.

Build a new compressed version of fusion.

Re-test your changes against the compressed version of fusion.

Deploy this new build of fusion to your production installation.

For people with strong backgrounds in web/software development, this is a run-of-the-mill build process that's not overly fancy. For those without, you'd probably be confused by what I just listed.

So for the 2.4 release of MapGuide, we've provided the fusion build tools in a zip file. Extract this zip file into your production fusion installation and you have all the build scripts and utilities needed to "re-compile" a new version of fusionSF-compressed.js that incorporates any customizations you have made. All the build tools require is a copy of Apache Ant. Please do read up on the Ant documentation so you know how this tool works.

So with these build tools extracted to your fusion installation and Apache Ant installed, the workflow for patching/modifying fusion is as follows:

Switch your template(s) over to fusion.js so you are working against the raw source files

I haven't been in the Autodesk partner game for quite some time now, so my knowledge of their commercial products (besides AIMS) has atrophied somewhat over time. Nevertheless, I do keep in touch with the Autodesk blogosphere, and one particular blog post today caught my attention.

The reason it caught my eye is that it turns out we can do pretty much the same thing in MapGuide through the combination of Google Earth and MapGuide's KML support. For this to work, you will need the recently released 2.4 version of MapGuide Open Source, as it contains some important KML-related fixes. This post will use the Sheboygan dataset to demonstrate.

Here's the Sheboygan Parcels layer in Maestro:

Notice the KML Elevation button. It is disabled because the Layer Definition is based on an XML schema that pre-dates KML support. To enable it, just upgrade the resource to the latest supported schema version. Clicking the KML Elevation button gives you this dialog:

This dialog has 4 simple settings:

The Z offset: Defines the vertical offset distance

The Z extrusion: Defines the associated "height" of each feature

The Z offset type: Defines how to apply Z offset distance

The units: Defines what units to interpret the Z extrusion as

So for example, if we wanted to "chart" the land value of each parcel, we'd apply the settings like so:

1. Enable the elevation/extrusion settings

2. Click the [...] button beside the Z Extrusion to bring up the FDO expression editor

3. The Z extrusion can be any FDO expression that evaluates to a number, so we put in the land value (RLANDVC), but for practical display purposes inside Google Earth, we divide this value by 1000, because if we used the default elevation units of "Meters", a million-dollar parcel of land would extrude into Low Earth Orbit :D

4. For the Z offset type, we can leave this unchanged. You only want to use Absolute if you want the extrusion to be interpreted from sea level. But since Sheboygan looks to be a pretty flat area, we'll leave it as RelativeToGround

5. Finally, the units we'll keep as-is
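To see why the division in step 3 matters, just run the numbers for the RLANDVC / 1000 expression with "Meters" as the units:

```python
def extrusion_height_m(land_value, scale=1000.0):
    """Z extrusion per the RLANDVC / 1000 expression, in meters."""
    return land_value / scale

# Un-scaled, a million-dollar parcel extrudes 1,000,000 m (about 1,000 km),
# which really is Low Earth Orbit territory. Scaled, it's a readable 1 km.
unscaled = extrusion_height_m(1_000_000, scale=1.0)
scaled = extrusion_height_m(1_000_000)
```

Any divisor that brings your largest attribute value down to a few hundred meters will do; 1000 just happens to suit Sheboygan's land values.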

The only other thing we'll change is the scale range, back to the default of [0 to Infinity]. In my experience, Google Earth has a bit of trouble zooming to the region if the layer is constrained to a discrete scale range.

We'll also change the layer's tooltip to show the land value (RLANDVC). Notice the use of the refined concat expression, which eliminates the need to construct what I used to call "concat pyramids".
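If memory serves, the older concat only took two arguments, which is what forced those nested "pyramids" for multi-part tooltips, while the refined form takes all the arguments at once. A sketch of the difference, building the expression text in Python (the exact function signatures should be checked against your FDO provider; the literal prefix below is illustrative):

```python
from functools import reduce

def concat_pyramid(parts):
    """Old style: a two-argument Concat forces nested calls."""
    return reduce(lambda acc, p: "Concat({0}, {1})".format(acc, p), parts)

def concat_flat(parts):
    """Refined style: a single Concat taking all the arguments."""
    return "Concat({0})".format(", ".join(parts))

# RLANDVC is the land value column used for this post's tooltip.
parts = ["'Land Value: '", "RLANDVC"]
```

With more than a handful of parts, the flat form stays readable while the pyramid grows a level of nesting per part.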

Layer Definition tooltips are transformed into popups when you select such features in Google Earth.

Once that's done, save the Layer Definition and then house it within a Map Definition. We do this so that the Map Definition acts as a "funnel" into the WGS84 coordinate system (which features must be in for them to line up properly in Google Earth); any non-WGS84 layers we add will be automatically re-projected to WGS84 via MapGuide's powerful coordinate transformation facilities.

Once it's housed within a Map Definition, we can invoke GetMapKml on the Map Definition through the mapagent to produce a KML document for that map. Opening the KML document in Google Earth and clicking the Parcels node takes you to that familiar region.
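For illustration, here's roughly how such a request could be constructed against a default local install (the endpoint path and parameter names are my assumptions; verify them against the mapagent documentation for your MapGuide version before relying on them):

```python
from urllib.parse import urlencode

# Assumed default local mapagent endpoint; adjust for your install.
MAPAGENT = "http://localhost/mapguide/mapagent/mapagent.fcgi"

def get_map_kml_url(map_definition, username="Anonymous", password=""):
    """Build a GetMapKml request URL for a Map Definition resource id."""
    params = {
        "OPERATION": "GETMAPKML",
        "VERSION": "1.0.0",
        "MAPDEFINITION": map_definition,
        "USERNAME": username,
        "PASSWORD": password,
    }
    return "{0}?{1}".format(MAPAGENT, urlencode(params))

url = get_map_kml_url("Library://Samples/Sheboygan/Maps/Sheboygan.MapDefinition")
```

Pointing Google Earth at the resulting URL (or saving the response as a .kml file and opening it) gets you the document described above.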

You'll notice it's not as flat as the data you're used to seeing :) That's because these parcels have now been extruded based on the FDO expression we provided earlier (RLANDVC / 1000). Notice how dividing by 1000 has indeed capped the extrusion at a reasonable level for display purposes.

And if we tilt the view a bit, we can get a better view of the land value.

And if we select the tallest feature there, we find out that Leon B is sitting on a gold mine and that, in general, land owners in Voting District 4 are filthy rich :)

The popup content is the same content that would be in a MapGuide tooltip if we were looking at this data via the normal AJAX viewer.

One thing to keep in mind here is that the KML support in MapGuide is still somewhat basic. There isn't much ability to fine-tune the KML content that MapGuide generates. If you have such requirements, you are better off serving KML from the GeoREST extension for MapGuide instead.

Speaking of which, we could certainly do with a 2.4-compatible release of GeoREST.

Wednesday, 10 October 2012

Consider this fragment of code to create an MgMap from a Map Definition resource id:

Here's a sample set of numbers from this code on a first run.

Notice how subsequent MgMap initializations from the same Map Definition after the first one are near instantaneous? This is because the Create() call caches frequently accessed information for future calls, like:

Feature Schemas

Class Definitions

Identity Properties

Spatial Contexts

Feature Source documents

Now for a map like the Sheboygan sample, these numbers don't mean much, but for a really chunky map that has hundreds of layers, the effects of this cached information will be much more apparent.
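The effect is the same one you'd get from any keyed cache: the first Create() pays for the expensive fetches, and later ones get cache hits. A toy sketch of the pattern (an illustration of the caching idea only, not the actual MapGuide Server internals):

```python
class FeatureServiceCache:
    """Toy sketch of the caching MgMap.Create() benefits from: the first
    lookup does the expensive fetch, later lookups are cache hits.
    """
    def __init__(self, fetch):
        self._fetch = fetch
        self._cache = {}
        self.misses = 0

    def get_class_definition(self, feature_source, class_name):
        key = (feature_source, class_name)
        if key not in self._cache:
            self.misses += 1  # only the first lookup pays the fetch cost
            self._cache[key] = self._fetch(feature_source, class_name)
        return self._cache[key]

# A stand-in fetch; the real one would round-trip to the FDO provider.
cache = FeatureServiceCache(lambda fs, cls: {"source": fs, "class": cls})
parcels_fs = "Library://Samples/Sheboygan/Data/Parcels.FeatureSource"
first = cache.get_class_definition(parcels_fs, "Parcels")
second = cache.get_class_definition(parcels_fs, "Parcels")
```

Multiply that saved fetch by every schema, class definition, and spatial context in a hundreds-of-layers map and the near-instantaneous second Create() makes sense.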

Now you may be wondering: is there a way, like with map tiles, to pre-cache such information (i.e. basically have MgMap.Create() called automatically on any Map Definitions you specify)?

Yes there is, and it's called the PreCacheMap configuration property in serverconfig.ini.

Set this property to a comma-separated list of Map Definitions, and the MapGuide Server will MgMap.Create() each one in the list as part of service startup, pre-caching all the associated information that comes with each Map Definition and hopefully reducing that "cold start" lag.
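As a sketch, the entry might look like this in serverconfig.ini (the section placement and the second Map Definition path are illustrative; verify the exact property spelling and location against the comments in your shipped serverconfig.ini):

```ini
; serverconfig.ini (illustrative placement)
; Pre-create these maps at service startup to reduce the "cold start" lag.
PreCacheMap = Library://Samples/Sheboygan/Maps/Sheboygan.MapDefinition,Library://MyProject/Maps/BigMap.MapDefinition
```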

Tuesday, 9 October 2012

If you're a .net developer, you'll probably be interested in the expanded array of .net development options for MapGuide.

For the final release of MapGuide Open Source 2.4, we've taken the NuGet packaging work that we did for mg-desktop and extended it to also include the official MapGuide API. As a result, we had to make some modifications to the existing NuGet packages to accommodate the official MapGuide API and to stay under the 30MB package limit of the NuGet gallery.

There are now 5 different nuget packages (in x86 and x64 flavors, suffixed by -x86 and -x64 respectively):

mapguide-api-base

mapguide-api-web

mg-desktop-net40

mg-desktop-viewer-net40

cs-map-dictionaries (this package is CPU-agnostic and is not suffixed)

Their dependency chain looks like this:
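Working from the package descriptions that follow, the chain can be rendered roughly like so (my rendering, since the original diagram is not reproduced here):

```
mapguide-api-web --> mapguide-api-base
mg-desktop-viewer-net40 --> mg-desktop-net40 --> mapguide-api-base
cs-map-dictionaries (optional; no dependencies)
```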

So based on the type of application you're trying to build, you have the following package configurations.

Building a normal MapGuide .net web application? Install the mapguide-api-web package, which will automatically install the mapguide-api-base pre-requisite. The full set of files in these two packages is the same set of files under the mapviewernet/bin directory that you've always been asked to copy over to your .net application in the past.

Building a desktop-based MapGuide application? Install the mg-desktop-net40 package, which will automatically install the mapguide-api-base pre-requisite. You will also need the CS-Map coordinate system dictionaries on hand in order to use any of the MgCoordinateSystem classes in the MapGuide API, or you can install the optional cs-map-dictionaries package, which contains a subset of the coordinate system dictionary files.

Building a desktop-based MapGuide Windows application? Install the mg-desktop-viewer-net40 package, which will automatically install any upstream pre-requisites. Again, install the cs-map-dictionaries package if you require coordinate system dictionaries.

Building an application that only uses the shared MapGuide components (e.g. MgCoordinateSystem)? Install the mapguide-api-base package, and optionally the cs-map-dictionaries package.

The main benefit of the NuGet approach is that you no longer fall prey to the rookie mistake of forgetting to copy the unmanaged dlls over to your application's output directory, as all these NuGet packages insert the appropriate post-build events into your project files to do this for you automatically!

The only manual step left is to add a call to MapGuideApi.MgInitializeWebTier with the path to the webconfig.ini in the startup routine of your web application.

Three error-prone steps (and probably the cause of most .net newbie questions) have been eliminated by NuGet.

These NuGet packages also include the relevant IntelliSense files to hopefully keep you away from the main API reference as much as possible :)

Now although NuGet greatly simplifies things, there may be some un-tested scenarios (e.g. continuous integration) where this type of setup may not work perfectly. Another thing to note is that the .net assemblies in these NuGet packages are all signed, whereas the equivalent assemblies that come with MapGuide Open Source 2.4 are not. If you cannot use signed MapGuide assemblies (though I can't think of a reason why not), then NuGet probably isn't the choice for you and you are better off sticking with the old-fashioned way.

Coinciding with the recently released MapGuide Open Source 2.4 is a new release of mg-desktop.

This release "solves" the one remaining shortcoming of mg-desktop: the inability to display tiled maps. It turns out that in the process of racking my brain trying to figure out how to fetch the required tiles for the current view extents and then stitch the resulting images together, I sort of lost sight of the overall objective.

Ultimately, we just want to see the image of a map with tiled layers (don't we?). We don't necessarily have to do things exactly how the AJAX and Fusion viewers do it if that is our main objective (which it is).

So in this respect, the solution has been sitting in front of me all this time: simply use the RenderMap API instead of RenderDynamicOverlay. RenderMap includes the tiled layers in the final image, at the small cost of the background color being pre-filled into it. This makes the resulting map image fully opaque, and it makes the pre-rendering hook feature of the viewer useless, as the map image will completely obscure whatever custom rendering you've done beforehand. Still, this is not the ideal solution (hence the quotes around "solves"), as we should really be using the Tile Service APIs for this, but it is workable for now.

There are some new viewer properties to control this behaviour. All the relevant properties are listed here:

ConvertTiledGroupsToNonTiled - This is the existing viewer property that will convert all tiled layers in the map to dynamic ones, allowing these layers to be visible under default viewer parameters. This property remains for compatibility purposes.

UseRenderMapIfTiledLayersExists - This is a new viewer property that instructs the viewer to use the RenderMap API instead of RenderDynamicOverlay if, and only if, the map contains tiled layers. If you have no need for pre-map rendering, this is the property you should set to true.

RespectFiniteDisplayScales - This is a new viewer property that will make all zooms "snap" to the nearest finite display scale in the runtime map, just like it does in the AJAX and Fusion viewers.

Because we can now finally view tiled maps, the legend component has been updated to properly omit checkboxes for such layers.

The second major change is that this release marks the first availability of mg-desktop in 64-bit. However, this is only available for the .net 4.0 (VC10) build of mg-desktop; the .net 2.0 (VC9) build remains 32-bit. For the .net 4.0 builds of mg-desktop in this release, we are using the same dlls as the official release of MapGuide Open Source 2.4, as the same build process that produced the MGOS 2.4 installer also produced this release of mg-desktop. The only difference to note is that the .net assemblies in mg-desktop are signed, whereas the ones in MapGuide Open Source 2.4 are not.

On the NuGet front, we've tweaked the structure of the NuGet packages, which I'll go into in more detail in a future dedicated post (as it relates to the release of MapGuide Open Source 2.4 as well). The brief gist for this post: the CS-Map coordinate system dictionaries are now an optional content-only NuGet package that you need to install in addition to the mg-desktop NuGet package if you require these files. Also, the NuGet packages are split into x86 and x64 variants, so be sure to install the package that matches the bitness of your application.

Finally, there are also several performance improvements to startup time. You can find all the gory details in the changelog.txt that's now included with the zip distributions and NuGet packages.

Having addressed the tiled map issue, this will probably be the last significant release of mg-desktop for a while. But should there be any new developments in mg-desktop, you'll be the first to hear about it right here.

The main change from RC2 is a long-overdue fix in the AGG Renderer to properly preserve PNG8 transparency (with a revised PNG8 quantization algorithm borrowed from MapServer), meaning you can finally have transparent PNG8 tiles and dynamic overlays! Many thanks to Bruno Scott for this patch.

The second main change is the introduction of NuGet packages for the MapGuide .net API, which will be a detailed topic for another post (see here).

The third main change is with regards to Fusion. For the longest time, if you needed to modify or fix a production Fusion installation, there was no easy way to rebuild the fusionSF.js and fusionSF-compressed.js files to incorporate your fixes and modifications. With the 2.4 release, we now offer a zip file containing the build.xml and supporting tools needed to "re-compile" a new fusionSF.js and fusionSF-compressed.js. All that is required on your end is to have Apache Ant installed.

It's been a long journey to get to this point (a bit too long!). Here's hoping the road to 2.5 is a much shorter one.