Exchange 2003 – ActiveSync reporting

We have received a lot of feedback recently about your need to better understand the usage of your Exchange 2003 ActiveSync (EAS) deployment. With all of the great features built into EAS, it’s important for you as an IT Professional to know the volume and usage patterns of your deployment. This information can help you more effectively manage your EAS deployment, better understand user productivity, and help you calculate ROI.

Today, the EAS protocol logs a lot of data in IIS about what’s going on, but parsing out that data can be a daunting task. We understand your pain, so to help you we have created two SQL scripts that will get you started in evaluating your EAS deployment. These scripts work in conjunction with the Microsoft Log Parser 2.2 tool to produce informative reports.

DISCLAIMER: These scripts are samples only and are not officially supported by Microsoft.

The first script produces a ‘Hits by User’ report, which shows you which users are hitting your EAS servers. This report provides many of the salient statistics that can help you better understand how your users are interacting with your EAS deployment. It returns information for each individual user, such as device type used, number of syncs, emails sent, attachments downloaded, meeting responses, etc.

The second script produces a ‘Hits by DeviceType’ report. This report gives you a breakdown of the device types (SmartPhone, PocketPC, etc.) that are hitting your EAS server. It can give you a better idea of which types of devices your users prefer, which can help you make more informed planning decisions in the future. This script can also produce a very helpful chart, such as a 3-D pie chart built with Microsoft Office Web Components, that you can use for reporting or presentations to your management.

Here are the details for running Log Parser with these SQL scripts…

NOTE: Your IIS machine should be set up to produce “W3C Extended Log File Format” logs. IIS uses this format by default, but if you have changed it, please set it back so that the SQL queries can find the correct field names. You can change this on the properties of your website in IIS Manager.

By default, Log Parser is installed in C:\Program Files\Log Parser 2.2. Run LogParser.exe from the Log Parser install directory. Examples of the full Log Parser syntax are included in the last section of this blog – “Running Log Parser with the Supplied SQL Scripts”.

Log File Location

You can either specify the UNC location of your Logs, or copy them locally to the machine running Log Parser. Mapped drives do not currently work in Log Parser 2.2.

SQL Queries (samples)

Hits by User Query

Create a file and name it Hits_by_User.sql and place this code block into the newly created file.
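Here is a sketch of what such a query can look like in Log Parser 2.2 SQL. Treat it as a sample only: the log path in the FROM clause, the output file name, and the Microsoft-Server-ActiveSync filter in the WHERE clause are example values you should adjust for your environment, and the exact column list is illustrative rather than the full supported script.

```sql
/* Hits_by_User.sql -- sample only; adjust the FROM path to point at your IIS logs */
SELECT cs-username AS UserID,
       /* value between 'DeviceType=' and the next '&' in cs-uri-query */
       EXTRACT_PREFIX(EXTRACT_TOKEN(cs-uri-query, 1, 'DeviceType='), 0, '&') AS DeviceType,
       COUNT(*) AS Hits,
       SUM(CASE MyCmd WHEN 'Sync'            THEN 1 ELSE 0 END) AS Syncs,
       SUM(CASE MyCmd WHEN 'SendMail'        THEN 1 ELSE 0 END) AS Emails_Sent,
       SUM(CASE MyCmd WHEN 'GetAttachment'   THEN 1 ELSE 0 END) AS Attachments_Downloaded,
       SUM(CASE MyCmd WHEN 'MeetingResponse' THEN 1 ELSE 0 END) AS Meeting_Responses
USING  EXTRACT_PREFIX(EXTRACT_TOKEN(cs-uri-query, 1, 'Cmd='), 0, '&') AS MyCmd
INTO   Hits_by_User.csv
FROM   \\MyServer\Logs\ex*.log
WHERE  cs-uri-stem LIKE '%Microsoft-Server-ActiveSync%'
GROUP BY UserID, DeviceType
```

Since only authenticated EAS requests are interesting here, you may also want to append AND cs-username IS NOT NULL to the WHERE clause to drop anonymous hits.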

With EAS, most of the salient data that yields valuable reports is stored in the cs-uri-query field. The EAS protocol logs each HTTP POST request and response, such as sending email, downloading attachments, etc. In this example, we first parse the cs-uri-query field to look for the ‘DeviceType=’ section and retrieve the value between ‘DeviceType=’ and the ‘&’ (single quotes excluded). Next we go after the different types of sync commands using the same mechanism we used to get the DeviceType: we look for the ‘Cmd=’ section and take the value that follows. The parsed-out commands are stored in the MyCmd variable. To display the various sync commands stored in MyCmd as individual fields, we place each one into its own column and SUM it up to get the total for each sync command, using the CASE function to evaluate the incoming data in the MyCmd field. In our example we returned some of the more interesting commands, but there are more commands to report on if you’re interested – see the list of sync commands below, any of which you could add to this SQL statement.

Finishing up the details of this first SQL statement, we select the output format. In this case we place the results into a CSV file, which you can view quickly, or you could import the CSV into a SQL reporting solution. In the next query example, we’ll show you how to create a 3-D pie chart of the results using Microsoft Office Web Components.

Modifying/Customizing this SQL statement

We know that you may have unique reporting requirements and would like a report that returns only certain data points. You can use the above sample as a starting point, adding or removing fields to build the most appropriate report for your needs. For example, if you wanted a report that showed just the top 10 users who have downloaded the most attachments, you could modify the query to something like this…
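One way to sketch that top-10 variant: instead of summing a CASE over every command, filter directly on Cmd=GetAttachment in the WHERE clause and count the matching requests. As before, the log path, output file name, and virtual-directory filter are placeholder values.

```sql
/* Top 10 users by attachments downloaded -- sample only */
SELECT TOP 10 cs-username AS UserID,
       COUNT(*) AS Attachments_Downloaded
INTO   Top10_Attachments.csv
FROM   \\MyServer\Logs\ex*.log
WHERE  cs-uri-stem LIKE '%Microsoft-Server-ActiveSync%'
  AND  cs-uri-query LIKE '%Cmd=GetAttachment%'
GROUP BY UserID
ORDER BY Attachments_Downloaded DESC
```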

Below is the comprehensive list of sync commands. Depending on your Exchange Server version and the devices that connect to your server, some of these commands may not return data. For example, the Ping command was introduced in the Exchange 2003 SP2 release, so if you are running an earlier version of Exchange you may not get this information returned in your report.

Sync

FolderSync

GetItemEstimate

Ping

Search

GetAttachment

SmartReply

SendMail

MeetingResponse

ItemOperations

Notify

MoveItems

Settings

SmartForward

GetHierarchy

Provision

Hits by DeviceType Query

Create a file and name it Hits_by_DeviceType.sql and place this code block into the newly created file.
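A sketch of such a query is below; as with the first sample, the log path, file names, and WHERE filter are examples to adjust. Two INTO targets are shown – keep the one you want and leave the other wrapped in a /* COMMENT */ block, since a query can only have one output.

```sql
/* Hits_by_DeviceType.sql -- sample only; adjust the FROM path to your IIS logs */
SELECT EXTRACT_PREFIX(EXTRACT_TOKEN(cs-uri-query, 1, 'DeviceType='), 0, '&') AS DeviceType,
       COUNT(*) AS Hits
/* CSV output -- comment out this INTO line if you want the chart instead */
INTO   Hits_by_DeviceType.csv
/* Chart output (requires Microsoft Office Web Components; run Log Parser
   with -o:CHART -chartType:Pie3D):
INTO   Hits_by_DeviceType.gif
*/
FROM   \\MyServer\Logs\ex*.log
WHERE  cs-uri-stem LIKE '%Microsoft-Server-ActiveSync%'
GROUP BY DeviceType
ORDER BY Hits DESC
```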

In this example we use the same mechanism to parse the cs-uri-query field and get the DeviceType value. We then group and count the different device types. The interesting thing about this example is that the results fit nicely into a chart, such as a pie chart. Log Parser has hooks into the Microsoft Office Web Components API that allow you to produce relatively sophisticated charts directly from the Log Parser tool. We’ve provided the option to produce either a CSV file or a chart; just comment out (using /* COMMENT */) the output type that you do not want.

Running Log Parser with the Supplied SQL Scripts

To generate reports using the sample SQL files that you created above, open a command prompt, change to the Log Parser install directory, and type in (or copy & paste) the appropriate command.
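For example, assuming the .sql files from the samples above sit in the Log Parser install directory (file names are the sample names used earlier):

```
C:\Program Files\Log Parser 2.2>LogParser.exe file:Hits_by_User.sql -i:IISW3C

C:\Program Files\Log Parser 2.2>LogParser.exe file:Hits_by_DeviceType.sql -i:IISW3C -o:CHART -chartType:Pie3D
```

The -i:IISW3C switch tells Log Parser the input is W3C Extended format IIS logs. If you use the chart output, make sure the INTO clause in the .sql file points at the image file (the .gif line in the sample) rather than the CSV.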

This is exactly what I’ve been looking for. Although there is one aspect I would love to add if possible which is the time of last sync. Since many of the users in this list could be gone from the company now, it would be good to know the time of the last good sync of each user. Is this easily added to the .sql file?

You can add something like MAX(TO_TIMESTAMP(date, time)) AS LastSync to the list of columns returned. That will give you the date of the last request sent, which you could use to figure out who left the company (or threw their phones in a lake). It gets a little trickier to do for the last "good" sync. I would probably revert to some Perl script to do that, although I’ve seen some pretty clever uses of SQL statements, so maybe it would be possible to do it all in one statement.