Consider a scenario where a data migration package is running that creates, updates, or deletes a large number of records in Dynamics 365, and we want to get the count of records created, updated, or deleted in the last x hours.

With views, we are limited to a count of just 5,000 records.

One option is to write a console app using the Dynamics 365 SDK, with the required QueryExpression or FetchXML condition, to get the count.
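As a minimal sketch of the FetchXML route (the entity and attribute names here are illustrative, assuming the standard contact entity and its createdon attribute), an aggregate count query for the last x hours could be built like this:

```python
from datetime import datetime, timedelta, timezone

def build_count_fetchxml(entity: str, date_attr: str, hours: int) -> str:
    # Aggregate FetchXML that counts records whose date_attr falls
    # within the last `hours` hours (UTC).
    since = (datetime.now(timezone.utc) - timedelta(hours=hours)).strftime("%Y-%m-%dT%H:%M:%SZ")
    return (
        f'<fetch aggregate="true">'
        f'<entity name="{entity}">'
        f'<attribute name="{entity}id" alias="record_count" aggregate="count" />'
        f'<filter>'
        f'<condition attribute="{date_attr}" operator="ge" value="{since}" />'
        f'</filter>'
        f'</entity>'
        f'</fetch>'
    )

# Count contacts created in the last 2 hours; the resulting XML can be run
# via RetrieveMultiple in the SDK console app, or pasted into a tool such as
# FetchXML Builder.
print(build_count_fetchxml("contact", "createdon", 2))
```

To count updated or deleted records instead, the same query can be pointed at modifiedon, or at the audit/deleted-record data your environment tracks.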

We can also make use of SSRS reports here.

Create a report using the Report Wizard and specify the criteria.

In the Lay Out Fields window, specify Count as the summary type for the grouping.

Run the report to get the count.

Another option, and the one we have used the most, is to use the FetchXML Builder plugin to build the query and copy it.

Add the Data Spawner component to the Data Flow, along with the CDS Destination component, in the Integration Services project.

Double-click the Data Spawner to open the editor.

Click the Add (+) button to specify the columns; here we have specified four different columns.

We have kept the name of each column the same as the schema name so that it is easy to map them in the CDS Destination.

For the First Name column, we have specified the Data Type as nvarchar and the Spawn Type as First Name, which will generate strings similar to first-name values.

In the Gender property for the First Name column, we can specify whether to generate male or female first names.

Random will generate both male and female first names.

For the email address field, we have selected Email (Personal) as the Spawn Type; the other option is Email (Business).

For our option set field, Preferred Contact Method, we have selected the Data Type as integer and the Spawn Type as Custom, which allows us to specify the list of available values (1 to 5 in our case).

We have specified the total number of records to be generated as 100,000.

Lastly, we have connected the output of the Data Spawner to the CDS Destination component. (Use Map Unmapped Fields to auto-map the fields, as we have set each column name to the schema name of the corresponding attribute.)
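The spawning logic configured above can be pictured with a short sketch. This is not the KingswaySoft component itself, just an illustration of the configuration: small name pools stand in for the First Name generator, and the schema names (firstname, emailaddress1, preferredcontactmethodcode) match the contact attributes we mapped:

```python
import random

# Illustrative name pools standing in for the Data Spawner's First Name generator.
FIRST_NAMES = {"Male": ["James", "John", "Robert"], "Female": ["Mary", "Linda", "Susan"]}

def spawn_record(gender: str = "Random") -> dict:
    # "Random" mirrors the Gender property generating both male and female names.
    g = random.choice(["Male", "Female"]) if gender == "Random" else gender
    first = random.choice(FIRST_NAMES[g])
    return {
        "firstname": first,                                   # nvarchar, First Name spawn type
        "emailaddress1": f"{first.lower()}{random.randint(1, 999)}@example.com",  # Email (Personal)
        "preferredcontactmethodcode": random.randint(1, 5),   # Custom spawn type, values 1 to 5
    }

def spawn_records(n: int) -> list:
    return [spawn_record() for _ in range(n)]

sample = spawn_records(100)  # scaled down from the 100,000 used above
print(sample[0])
```

Each generated dictionary corresponds to one row the Data Spawner would push to the CDS Destination, with the keys matching the attribute schema names for auto-mapping.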

We can get different metrics about the usage of the platform, like active user usage, operations performed, entities used, plugin and API statistics, etc., through Common Data Service Analytics (formerly Organization Insights).

The Customer 360 view is, more often than not, the starting point of data migration discussions. Storing all the data in Dynamics 365 might not be a good idea, as it could impact performance and, more importantly, the storage has a cost associated with it.

Setting up an Azure subscription and storage account (for which the customer needs to pay, along with the Dynamics 365 subscriptions) and provisioning a staging sandbox environment.

Taking a backup of the CRM on-premises database and uploading it to an Azure Blob container.

The backup is restored to the same version on an Azure-hosted virtual machine, i.e. a CRM 2015 database will be restored on a CRM 2015 VM.

The PowerApps Checker service runs as part of the next phase to validate the solution.

Finally, the upgrade process starts, which upgrades the restored CRM database to a CRM 9.0 database. For example, a CRM 2015 backup will be restored on a CRM 2015 VM, which will then be upgraded to a CRM 2016 VM, and finally to a CRM 9.0 VM.

This is followed by user mapping and, as the last step, the migration to online.

Here, the customer/partner needs to make sure to transform all SQL-based reports to FetchXML, update plugins to run in sandbox mode, update JavaScript to the new client object model, check for any 3rd-party tools and reconfigure integrations, make users familiar with the new Unified Interface client, clean up the database (e.g. audit data), etc., to make the solution online-compatible. This can be done before restoring to Azure Storage or during the staging phase. (This is the customer/partner's responsibility, assisted by the FastTrack architect and the Microsoft technical support team for any upgrade issues.)