Over the past few weeks, people have been demonstrating some cool games built entirely using PowerApps, thanks to a contest from ThoseDynamicsGuys. If you are interested in finding out about some of the interesting games from this contest, watch this video from Mr. Dang himself, where he reviews them: https://www.youtube.com/watch?v=0-ZWqs_emQA

I was especially fascinated by what Scott and Makato accomplished with the side-scrolling nature of their games, how Geetha managed to rotate the roulette wheel even though there is no rotation angle property on images, and how Nagao managed to calculate the trajectory and resistance of the ball. The first step to learning is to understand how other people do it, so I spent 4-5 days understanding how these apps were developed.

I then wanted to develop something of my own, using the concepts that I had learnt. Spirograph was the first thing that I thought of. The easiest part was to get the equations to calculate x and y. I had to then learn the basics of starting and stopping a timer, and how to create a sense that the pattern was being “drawn”.
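The x and y values for a Spirograph pattern come from the standard hypotrochoid equations. Here is a small sketch of those equations (the parameter names are my own, not the app's):

```python
import math

def spiro_point(R, r, d, t):
    """Standard hypotrochoid: a circle of radius r rolls inside a
    fixed ring of radius R, with the pen at distance d from the
    rolling circle's centre."""
    x = (R - r) * math.cos(t) + d * math.cos((R - r) / r * t)
    y = (R - r) * math.sin(t) - d * math.sin((R - r) / r * t)
    return x, y

# Sample a series of points along the curve
points = [spiro_point(100, 55, 70, i * 0.1) for i in range(500)]
```

Varying R, r and d (in the app, via sliders) is what produces the different patterns.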

Since the image control can render SVG, I decided to try this approach. My first challenge was figuring out how to calculate x and y on every tick of the timer. So, I decided to follow the approach demonstrated by Brian Dang, which involves checking and unchecking a toggle control.

Every time the toggle is checked, I can increment the iterator variable, calculate x and y for the line to be drawn, and build the string for the SVG path with all the lines. In SVG you can draw a line by moving to a location with "M x y" and then drawing the line with "L x y". Below is the formula in the OnCheck event of the toggle.
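As a rough sketch of that OnCheck logic — all control names, variable names and the step size here are assumptions:

```
// Rough sketch — slider, variable, and step-size names are all assumed
UpdateContext({ Iter: Iter + 1 });
UpdateContext({
    X: (SliderR.Value - Sliderr.Value) * Cos(Iter * 0.1) +
       Sliderd.Value * Cos(((SliderR.Value - Sliderr.Value) / Sliderr.Value) * Iter * 0.1),
    Y: (SliderR.Value - Sliderr.Value) * Sin(Iter * 0.1) -
       Sliderd.Value * Sin(((SliderR.Value - Sliderr.Value) / Sliderr.Value) * Iter * 0.1)
});
UpdateContext({
    PathVar: If(Iter = 1,
        "M " & X & " " & Y,
        PathVar & " L " & X & " " & Y)
});
// Un-check the toggle so the next timer tick can check it again
UpdateContext({ TogglePressed: false })
```

The key idea is that each tick appends one more "L x y" segment to PathVar, which grows into the full drawing.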

I can then simply set the SVG path's d attribute from the PathVar variable, which has all the line coordinates. Below is the value for the Image property of the Image control that renders the Spirograph.
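This follows the usual SVG-in-image-control pattern; a sketch, with the viewBox and styling assumed:

```
// Sketch — viewBox and stroke settings are assumptions
"data:image/svg+xml;utf8, " & EncodeUrl(
    "<svg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 400 400'>
         <path d='" & PathVar & "' stroke='purple' stroke-width='1' fill='none'/>
     </svg>"
)
```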

Below is a video of the app in action. You can play around with the sliders and be fascinated by the patterns it generates.

Since MVPs are now announced every month, I have found it hard to track down the new awardees in the Business Applications area before the 3rd of each month. So, I thought I would build a notifier using Flow and Functions. This is the logic:

1. Seed the initial MVP data from mvp.microsoft.com into Azure Tables, so that the new MVPs can be identified each month.

2. Schedule the Flow to run on the 1st of every month at 5 p.m. PDT. Hopefully by this time, everyone has filled out at least their name in their MVP profile.

3. Retrieve the MVP data again from mvp.microsoft.com.

4. Figure out the new MVPs and post a message to Slack. Add the new MVP details to Azure Tables.

The logic that retrieves the details from mvp.microsoft.com uses Azure Functions. Below is the project.json for that Function App.
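For an Azure Functions v1 app, a project.json that pulls in AngleSharp looks something like this (the version number is an assumption):

```json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "AngleSharp": "0.9.11"
      }
    }
  }
}
```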

It uses the AngleSharp library to parse the response returned by mvp.microsoft.com search.

Below is the Flow that uses this Function, to populate the Azure Table.

As you can see below, the Flow runs at 5 p.m. on the 1st of every month, and gets the current MVP list from the Azure Table, MVPAwards.

The partition key for the table is the month and year, and the row key is the actual MVP Id.

Since this Flow and the associated Function were built for my personal use, I call the Function directly without using a custom connector, passing the award category on the URL of the Function itself. If you want to use a custom connector, refer to one of my earlier posts.

After parsing the JSON returned by the Azure Function, I can use the "Select" action to map the returned data, so that I can compare the data based on the MVP Id and insert the new MVPs into the Azure Table, if needed. The partition key will be the current year and month (yyyyMM).

Next is the crucial bit, where I do the actual comparison.

The "Filter array" action can be used to see whether there are any matches in the MVP list that was retrieved from the Azure Table, i.e. last month's MVPs. If no results are returned, that person was not an MVP last month. So they are a new MVP, and can be stored in Azure Tables and posted to Slack.
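The comparison boils down to a set difference on MVP Ids. A small sketch in Python (the field names are assumptions):

```python
def find_new_mvps(current, previous):
    """Return the entries in `current` whose MVP Id is not present in
    `previous` -- i.e. people who were not MVPs last month."""
    previous_ids = {m["MvpId"] for m in previous}
    return [m for m in current if m["MvpId"] not in previous_ids]

last_month = [{"MvpId": "1001", "Name": "Alice"}]
this_month = [{"MvpId": "1001", "Name": "Alice"},
              {"MvpId": "2002", "Name": "Bob"}]

new_mvps = find_new_mvps(this_month, last_month)  # only Bob is new
```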

I have been looking into scenarios where PowerApps and Flow can improve the Dynamics 365 Customer Engagement user experience. One scenario that can add value right away is on entity forms. A PowerApp can be embedded as an IFrame on a normal entity form, and can be used similar to Dialogs to offload some processing to PowerApps and Flow.

Here is the finished product.

This works without any JavaScript at all in UCI, using the normal IFrame control on the form. Make sure to tick the option that passes the record id and object type code, and untick the cross-site scripting restriction.

No scripts are needed on the form to embed the PowerApp.

Once the PowerApp is in place, the current form context can be inferred from the "id" parameter that is passed to the form.

I use a known Guid during the design phase to assist with the app design process, as the PowerApp calls the Flow during the OnStart event and sets the ProblemDetails variable.
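The OnStart pattern can be sketched like this — the Flow name and the placeholder Guid are assumptions:

```
// Sketch — Flow name and design-time Guid are placeholders
Set(CaseId,
    If(IsBlank(Param("id")),
        "00000000-0000-0000-0000-000000000000",  // known test Guid used while designing
        Param("id")));
Set(ProblemDetails, GetCaseDetails.Run(CaseId))
```

When the app runs inside the IFrame, Param("id") carries the actual record id, so the design-time Guid is only hit in the studio.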

A Flow can be associated to an event, from the Action->Flows area.

When the PowerApp loads, it calls the Flow with the Guid to retrieve the case details. The Flow responds to PowerApps with these details from the case: Title, Customer Name, and Type of Customer (Account or Contact).

In this Flow, I just use the "Respond to PowerApps" action and return the three outputs.

I used variables to store the Client, which could be the Account's Name or the Contact's FullName, depending on what is on the case. The Client Type could be either Account or Contact. Account details and Contact details are retrieved based on the result of the Client Type branch.

For the second Flow, the user presses the "Check" button, which performs some additional checks based on business criteria. For this Flow, I used the "Response" action, which allows me to return JSON results. I stored the cases I am interested in within an array variable.

From the variable, I used the Select action to grab only the properties I am interested in.

I can then use the “Response” action to return these to PowerApps.

One weird thing that I encountered in the PowerApps/Flow integration is that I would simply see the result "True" from Flow when I tried to return the response straight from the variable.

When I used Parse JSON and then Select to reduce the properties, it started working. This can happen when there is something wrong with the schema validation, but I am not sure how that can be the case, since I copy-pasted the JSON response from the previous steps to auto-generate the schema.

One more thing: when the Flow that is associated with the PowerApp changes, make sure to disassociate and reassociate the Flow. I had issues when I did not do this, due to PowerApps caching the Flow definition.

After my previous post, I continued to explore virtual entities to see what real-life problems I could solve with them. One problem I could think of was metadata. How awesome would it be if I could use Advanced Find to query entity and attribute metadata, or visualise it like a normal entity! It is not a dream anymore. I have developed an open-source solution to do this.

Here are some sample queries:

Query all attributes of type customer

Query all mandatory attributes

Query by Attribute Type and Entity

Query all Many to Many intersect entities

Query entities that have quick create enabled

You can open the entity and see more details.

You can open the attribute and view more details as well.

All this awesomeness is possible using the power of virtual entities. There are two virtual entities that you can query: Entity and Attribute.

After importing the managed solution, change the data source for the Attribute entity from "Entity Datasource" to "Attribute Datasource". You have to do this from the Customization area, not from the managed solution.

This is because, by default, the system does not allow relationships between two virtual entities that have different datasources. This exception is shown when you try to do so.

To work around this exception, you keep the datasource the same for the "Entity" (parent) and "Attribute" (child) virtual entities, create the relationship, and then change to the right datasource. Hence, the managed solution has the datasource set to "Entity Datasource" for the "Attribute" virtual entity, which has to be changed manually after importing the solution.

I hope this solution will be really useful for administrators. Please let me know any feedback in the comments below or on GitHub issues.

Gotcha 2: If you don’t want the user to open up an individual record, you don’t have to implement the Retrieve message. It is optional. Since I just wanted a collated entity, I did not register any plugin for Retrieve.

Gotcha 3: You have to open up the newly created Data Provider entity and enter the external name. If you don’t, you will be unable to create the data source, as it will always error out.

Objective: MRU items should be accessible from Advanced Find. As an Administrator, I would like to query this data, and see metrics around user participation, entity usage, activity by week/month/year etc.

This is the Most Recently Used area.

This is the Advanced Find on the virtual entity, which is driven by the same data.

As you can see, the data matches up. All the heavy lifting is done by the plugin, which retrieves the records from the "UserEntityUISettings" entity, parses the XML, sorts by user and accessed-on date, and then populates the virtual entity "Recent Items".

You can query by “Type Equals”, “User Equals” and “User Equals Current User”.

I can also build a Power BI report that is driven by the same virtual entity.

In the previous post I described how easy it is to use Microsoft Flow to interact with Dynamics 365 Customer Engagement, by letting Azure Functions handle the core logic. In this post, I will show how to integrate Slack to Dynamics 365 Customer Engagement using Flow and Functions.

This is the objective: from my Slack channel, I want to quickly query the record count using a slash command, without having to jump into XrmToolBox or the Dynamics 365 Customer Engagement application itself. I took record count as a simple use case. You can create multiple slash commands, with each one performing a different targeted action in Dynamics 365.

Since this is an internal app that I won’t be distributing, I am choosing a simple name. If you plan to distribute this app, choose a more appropriate name.

Now you will be taken to the app’s initial config screen.

We will be creating a new slash command that will return the record count of the entity from Dynamics 365 Customer Engagement. Click on “Create a new command”

Choose the name for the slash command. I am just going with “/count”.

The critical part here is the Request URL. This is the URL that Slack will POST to with some information. What is that information and what does it look like? I used RequestBin* (see footnote) to find out.

Note the two relevant parameters:

command – This is the actual slash command the user executed

text – This is the text that comes after the slash command

For example, if I typed "/count account" into the Slack chat window, the command parameter's value will be "/count" and the text parameter's value will be "account". During the development phase, I put the RequestBin URL in the Request URL. We will come back later, once the Flow is complete, and replace this placeholder URL with the actual Flow URL.
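Slack sends the slash-command payload as a form-encoded POST body. A quick sketch of pulling out the relevant parameters (this is a simplified body, not the full set of fields Slack sends):

```python
from urllib.parse import parse_qs

# Simplified example of the body Slack POSTs for "/count account"
body = "command=%2Fcount&text=account&channel_name=general"

params = parse_qs(body)
command = params["command"][0]        # the slash command itself
text = params["text"][0]              # everything typed after the command
channel = params["channel_name"][0]   # where to post the reply back to
```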

Now you can see the list of slash commands in this app.

Now click the “Basic Information” screen on the left, and then on “Install your app to the workspace”. This should expand the section, and you can now actually install the app into your workspace by clicking on “Install App to Workspace”.

Grant the required permissions for the app.

Now it is time to develop the Flow, which looks very similar to the one in my previous post about Flow and Functions. The difference here is that the Flow is triggered by an HTTP POST, not manually using a Flow button. Flow will receive the slash command from Slack. Here is what the Flow looks like.

Here is what the Flow does:

When the HTTP POST request is received from Slack, the Flow posts a message back to Slack asking the user to wait while the record count is retrieved.

Checks if the slash command is “count”

If the slash command is "count", call the Azure Function using the custom connector (refer to the previous post on how to create a custom connector for an Azure Function that you can use in Flow)

Parse the response received from Azure Function, which queries Dynamics 365 Customer Engagement for the entity’s record count

Send a mobile notification that shows up if the user has the Flow app installed

Send a message back to the channel that the slash command was executed on, with the record count

There are three important bits in the Flow:

The first is getting the slash command from the POST message.

The second is posting into the right Slack channel i.e. the channel that was the source of the slash command. You can get the channel from the “channel_name” parameter.

The third is parsing the JSON returned by the Azure Function. This is the schema of the JSON returned.
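As an illustration only — the actual property names depend on what the Function returns — the Parse JSON schema would be shaped along these lines:

```json
{
  "type": "object",
  "properties": {
    "entityName": { "type": "string" },
    "recordCount": { "type": "integer" }
  }
}
```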

You can get the Flow URL by clicking on the HTTP step that is the first step of the Flow.

Grab the whole HTTP URL and plug it in on the slash command’s request URL.

Now, you can use the slash command on your workspace to get the record count.

Note: when I worked on this post last month, RequestBin had the capability to create private bins. But when I looked into it again this week, it appears they have taken that capability away due to abuse -> https://github.com/Runscope/requestbin.

You would have to self-host RequestBin to inspect the POST message from Slack. The other option is to create the Flow with just the HTTP request step and look into the execution log to see what was posted, like below.

I hadn’t paid much attention to what is happening in the Azure space (Functions, Flow, Logic Apps, etc.), because I was under the impression that setting up the integration would be a daunting task, i.e. Azure AD registration, getting tokens, the Auth header, and the whole shebang.

As a beginner trying to understand the stack and how to integrate the various applications, I had been postponing this exploration due to the boilerplate involved in setting it up. But then I read this post from Yaniv Arditi: Execute a Recurring Job in Microsoft Dynamics 365 with Azure Scheduler. Things started clicking, and I decided to spend a few days exploring Functions and Flow.

I started with a simple use case: as a Dynamics 365 Customer Engagement administrator, I need the ability to do some simple tasks from my mobile during my commute. A Flow button fits this requirement perfectly. The scenario I looked into is how to manage the Dynamics 365 Customer Engagement trace log settings from the Flow app on my mobile, in case I get a call about a plugin error on my way to work and need the logs waiting for me when I get to the office.

As I wanted to get a working application as fast as possible, I did not start writing the Function code in Visual Studio. Instead, I tested my code from LINQPad, as it is easier to import NuGet packages and you also get IntelliSense (Premium version). If you want to execute Azure Functions locally, read Azure Functions Tools for Visual Studio on the docs site. I installed and played with it once I had completed the Flow + Function integration; it gives you the ability to run and debug the functions locally. How awesome is that ❤️!

There are two minor annoyances that I encountered with local Visual Studio development:

There is no IntelliSense for csx files. Hopefully this will be fixed soon. The suggested approach in the meantime appears to be "pre-compiled Azure Functions", but I did not try it during this exploration phase. Pre-compilation also improves the Function execution time from a cold start.

I had to install the NuGet packages locally using Install-Package, even though they were specified in project.json. I could not debug the Azure Functions locally without this, as the NuGet restore did not seem to happen automatically on build.

I will now walk through the steps involved in creating the Flow button that updates the trace log setting in Dynamics 365 Customer Engagement.

Step 6: Choose the Platform as 64-bit. I got compilation errors with the CrmSdk NuGet packages when this was set to 32-bit. You will also have to add the connection string to your CRM instance. The connection string name that I have specified is “CRM”. You may want to make this a bit more descriptive.

Step 7: Now is the exciting part. Click on the “+” button and then click on the “Custom Function” link.

Step 8: This new function will execute on an HTTP trigger and is coded in C#.

Step 9: After this, I sporadically experienced a blank right-hand pane with nothing in it. If this happens, simply refresh the page and repeat steps 6-8. If everything goes well, you should see this screen. I left the Authorization level as “Function”, which means that the auth key needs to be in the URL for invocation.

Step 10: You are now presented with some quick-start code. Click on the “View Files” pane, which is collapsed on the right-hand side.

Step 11: Click on “Add” and enter the file name as “project.json”.

Step 12: Paste the below JSON, which retrieves the CRM SDK assemblies from NuGet, into the “project.json” file and press “Save”. The NuGet packages should begin to download.
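A minimal project.json that pulls the CRM SDK assemblies from NuGet might look like this (the package choices and versions are assumptions):

```json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.CrmSdk.CoreAssemblies": "9.0.0.7",
        "Microsoft.CrmSdk.XrmTooling.CoreAssembly": "9.0.0.7"
      }
    }
  }
}
```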

Step 14: You can now execute this function by clicking “Run” and using the JSON in the screenshot for the POST body. The “traceloglevel” can be one of three values: Off, Exception, and All.
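The POST body is a small JSON document along these lines:

```json
{
  "traceloglevel": "All"
}
```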

As you can see, the function:

Connected to the organization specified in the Application Settings using the connection string

Retrieved the current trace setting and updated it, if there is a change, using the SDK

Returned the response as text/plain.

If you want to execute the same using Postman or Fiddler, you can grab the Function URL as well. Note that the AuthToken is in the URL.

Step 15: Since this function performs an update, I don’t want a simple GET to trigger the change. So, just turn off “GET” and save. This means that the “traceloglevel” will only be updated on a POST, not on a GET with a query string.

Step 16: Now it is time to export the API definition JSON for consumption by Flow.

Step 18: Now click on the “Authenticate” button, enter the function auth key (see Step 14) in the API Key textbox, and click the “Authenticate” button in the dialog box.

You should see a green tick next to the apiKeyQuery. This means that the key has been accepted.

Step 19: Now it is time to add the POST body structure to the Swagger JSON. I used the Swagger editor to play around with the schema and understand how it works. Thank you, Nishant Rana, for this tip.
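In Swagger 2.0, describing the POST body means adding a body parameter to the operation. A fragment along these lines (the property name is assumed to match the function's input):

```json
"parameters": [
  {
    "name": "body",
    "in": "body",
    "required": true,
    "schema": {
      "type": "object",
      "properties": {
        "traceloglevel": { "type": "string" }
      }
    }
  }
]
```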

You should now be able to POST to this function easily and inspect the responses.

Step 20: Now click on the “Export to PowerApps+Flow” button and then on the “Download” button. You should now be prompted to save ApiDef.json into your file system.

Step 21: Now it is time to navigate to Flow.

Step 22: You can now create a custom connector to hookup Function and Flow.

Step 23: It is now time to import the Swagger JSON file from Step 20. Choose “Create custom connector” and then “Import an OpenAPI file”. In this dialog box, choose the Swagger JSON file from Step 20.

Step 24: Specify the details about the custom connector. This will be used to later search the connector when you build the Flow.

Step 25: Just click Next, as the API key will be specified on the connection, not on the connector. The URL query string parameter is “code”.

Step 26: Since I have only the “ModifyTraceLogSetting” action, this is the only one that shows up. If you have multiple functions in the Function App, multiple operations should be displayed on this screen.

Step 27: If you scroll down, you can see that the connector has picked up the message body that is to be sent with the POST.

Step 28: If you click on the “traceloglevel” parameter, you can see details about the POST body.

Step 29: This is the time to create the connection that will be used by the connector.

Step 30: Enter the Function API key that you got from Step 14. This will be used to invoke the Function.

Step 31: The connection does not show up straight away. You will have to click the little refresh icon to the right of the Connections section. You can now test the connection by clicking the “Test Operation” button and choosing the parameter value for “traceloglevel” that will be sent with the POST body. You can also see the live response from the Function on this screen.

Step 32: Once you have saved your connector, you will see something like this below, on the list of custom connectors.

Step 33: Now it is time to create the Flow. Choose My Flows -> Create from blank -> Search hundreds of connectors and triggers.

Step 34: Enter the Flow name and since this will be invoked from the Flow app on mobile, choose “Flow button for mobile” as the connector.

Step 35: The Flow button will obviously be triggered manually.

Step 36: When the user clicks the Flow button, it is time to grab the input, which in this case will be the Trace Log Level setting. Choose “Add a list of options” and also a name for the input.

Step 37: You don’t want the user to enter free text or numbers, so you present a list of options from which the user will choose one.

Step 38: After clicking “Add an action”, you can now choose the custom connector that you created. Search and locate your custom connector.

Step 39: Flow has magically populated the actions that are exposed by this connector. In this case there is only one action to modify the Trace Log setting.

Step 40: In this step you don’t want to choose a value at design time, but rather map the user-entered value to the custom connector. So, choose “Enter custom value”.

Step 41: The name of the input in Step 37 is “Trace Level”, so choose this value as the binding that will be used in the custom connector.

Step 42: In this case, I have a simple final action: a mobile notification.

Step 43: I just want to receive a notification on my mobile, since I have the Flow app installed. When my custom connector calls the function that updates the trace log level, the response text returned by the function comes through in the body in the Flow app.

This text is displayed as a notification. If the Function returns JSON, you have to use Parse JSON to grab the right property. In this case it is not required, as the response is plain text.

Step 44: When the Flow design is complete, it should look like this.

Step 45: You can run the Flow from either the Flow app on mobile or right from here. I click “Run Now” to check that everything is OK. You can also specify here the “Trace Level” that will be passed to the Function.

Step 46: I can check the status of the Flow easily. The cool thing about this screen is that it logs so much information that is useful when troubleshooting what went wrong.

I can also invoke this Flow on my mobile, using the Flow App. I get a native notification when the Flow completes.

What’s next

While I was experimenting with Flow and Functions, I wanted to test the integration between Slack and Dynamics 365. As a proof of concept, I am running a custom command (“/recordcount”) on a Slack channel to retrieve records from Dynamics 365.

I will blog about this next.

Conclusion: I am really excited about the future of Flow and Functions, and what they bring to the table both for developers who want to get their hands dirty, and for power users who want something they can hook up easily without writing any code.

If you have any feedback, suggestions or errors in this post, please comment below, so that I can learn and improve.