If you didn’t catch the first two parts of this series, you can do that here and here. In this part, we’ll get a little more technical and use Microsoft Flow to do some pretty cool things.

Remember when we talked about the size and quality of the images we take with our PowerApp and store as the entity image? When saved as the Entity Image for a CDS/D365 item, the image loses quality and is no longer good for an advertisement photo. This conversion happens automatically and, as far as I can tell, the high-res image is gone once it takes place (someone please correct me if I’m wrong on that!). On the flip side, it doesn’t make a whole lot of sense to put all this tech together only to require my end users to take two pictures of every item, one high-res and one low-res. And we don’t want to store high-res images in a relational database for 10,000-plus items, because the database could bloat immensely.

Microsoft Flow and SharePoint to the rescue!

PRO TIP: Dynamics 365 will crop and resize the image before saving it as the entity image. All entity images are displayed in a 144 x 144 pixel square. You can read more about this here. Make sure to save/retain your original image files. We’re going to stick ours in a SharePoint Picture Gallery App.

Objective

Create a Microsoft Flow that handles…

Pulling the original image off the Dynamics record and storing it in SharePoint.

Setting the patch image as the Entity Image for the Dynamics record.

Creating an advertisement list item for the patch.

Saving the URLs for the ad and image back to the patch record.

Create the Flow

We’re going to write this Flow so that it’s triggered by a Note record being created.

Create a new Flow from blank; on the next page, click “Search hundreds of connectors and triggers” at the bottom of the page.

Select Dynamics 365 on the All tab for connectors and triggers.

Select the “When a record is created” trigger.

Set the properties for Organization Name and Entity Name. Entity Name should be “Notes”.

Save the Flow and give it a name.

Verifying a Few Things

Add a new step and select the Condition item.

The Condition should check to see if the Note has an attachment. We do this using the “Is Document” field.

In the “Yes” side of the conditional we want to check if the Object Type is a Patch (ogs_patch in this case).

At this point, if the Flow has made it through both conditionals with a “Yes”, we know we are dealing with a new Note record that has an Attachment and belongs to a Patch record.
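Expressed as workflow expressions, the two checks amount to something like the following sketch. I’m using the underlying Note field names isdocument and objecttypecode here purely for illustration; in the designer you’d pick the equivalent fields from the Dynamic content pane instead of typing these out.

```
// Conditional 1: the Note must carry an attachment ("Is Document")
equals(triggerBody()?['isdocument'], true)

// Conditional 2: the Note must belong to a Patch record
equals(triggerBody()?['objecttypecode'], 'ogs_patch')
```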

Update the Patch Record

Now we want to update the Patch record’s Entity Image field with the attachment. First we need to get a handle on the Patch record. We’ll do that by adding an Action to the Yes branch of our new Conditional.

Add a Dynamics 365 Update a Record Action.

Set the Organization Name, Entity Name, and Record identifier accordingly. For our Patch Record identifier, we’ll use the Regarding field in the Dynamic content window.

Click on Show advanced options and find the Picture of Patch field.

For the Picture of Patch field we need to get the document body of the attachment and convert it from Base-64 encoding to binary. We do this using the “Expression” area again. Use the “base64ToBinary” function to convert the document body like so.
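The expression itself is short; documentbody is the attachment field on the triggering Note record:

```
base64ToBinary(triggerBody()?['documentbody'])
```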

Save your work! I can’t tell you how many times I had to retype that function.

Create Our SharePoint Items & Clean-up

Now that we’ve updated our entity image with the uploaded patch picture we want to do a couple of things, but not necessarily in sequence. This is where we’ll use a parallel branch in our Flow.

Dealing with a Parallel Branch

Under the last Update a Record action, add a Conditional. After adding this Conditional hover over the line between the Update action and the new conditional. You should see a plus sign that you can hover over and select “Add a parallel branch.”

Select this and add a Compose action. You may need to search for the Compose action.

PRO TIP: With Modern Sites in SharePoint, we now have three solid options for displaying images: the Modern Document Library, which can display images as tiles and thumbnails; the Picture Library, which was often the place to store images prior to the Modern Document Library; and simply displaying an image, or images, directly on a page.

Saving the Attachment as an Image in SharePoint

Let’s deal with the Compose branch first. For the Input field, our Compose uses the same function we used for the Picture of Patch field above: base64ToBinary(triggerBody()?['documentbody'])

After the Compose, we’ll add a Create File Action for SharePoint and use the name from our Patch record as the name for our image in SharePoint. I’m using a Picture Gallery App in SharePoint and for now, only using the .JPG file type. The File Content should use the Output from our Compose Action.
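For the file name, a concat() expression is one way to tack the extension onto the patch name. A sketch, with the caveat that ogs_name and the action name are my assumptions; use whatever the name field on your Patch record and your Update action are actually called:

```
concat(body('Update_a_record')?['ogs_name'], '.jpg')
```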

Delete the Note

Finally, we want to delete that Note from Dynamics (and the Common Data Service) so that the image attachment is no longer taking up space in our Common Data Service. Add a Dynamics Delete a Record Action after the SharePoint Create file action. Set the Organization Name and Entity Name, and use the Dynamic content for Note as the Item identifier.

Creating Our Advertisement

Let’s jump back to the new Conditional we added after the Update a record Action where we set the entity image.

Set the conditional to check for the Generate Advertisement field being set to true.

If this is true, add a SharePoint Create Item Action and let’s set some values. What we’re doing here is creating a new SharePoint List Item that will contain some starter HTML for a Patch advertisement.
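As a rough sketch, the starter HTML could be built up with a concat() expression along these lines. The markup and the subject field are illustrative only, not the exact values from my Flow:

```
concat('<h2>', triggerBody()?['subject'], '</h2>',
       '<p>Fire department patch available. Contact us for details.</p>')
```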

Save our work!

Updating Our Patch Record With Our URLs From SharePoint

Under the SharePoint Create Item Action for creating the Ad, AND after the SharePoint Create file action for creating the picture in the Picture Gallery, we’re going to add Dynamics Update record Actions that will be identical with one difference.

The Organization Name, Entity Name, and Record Identifier (set to the Dynamic content "Regarding") should be the same in both.

On the Ad side, the Update record should set the SharePoint Ad for Patch field to "Link to Item".

On the image side, the Update record should set the SharePoint Image for Patch field to the "Path".

Seeing It In Action

Of course, I’ve been saving my work so let’s go ahead and give this a whirl.

At the top right of your Flow you’ll see a Test button. We’re going to click that and select “I’ll perform the trigger action.”

To make this more interesting, I’m going to run this from SharePoint! I’ll update a patch and kick off my Flow from the embedded PowerApps Canvas App on my SharePoint home page.

I select the patch, then I click the edit button (pencil icon at the top right).

Notice the Attach file link and the Generate Advertisement switch. We’ll use the first for our image and the second for generating our ad item in SharePoint.

Finally, I click the checkmark at the top right to save my changes. This kicks off our Flow, and in less than a minute, when we navigate back over to it, we can see that it completed successfully.

I’ll hop back over to SharePoint to make sure that my ad was created and my entity image was set. I’ll also make sure the high-quality image made it to the SharePoint Picture Library and the Note was deleted from the Patch record in Dynamics. I also want to make sure the URLs for the ad and image in SharePoint were set back to the Patch record.

One last thing: When we store the image in a SharePoint Picture Gallery App we can retain the dimensions, size, and quality of the original image, unlike when storing the image as a Dynamics 365 entity image. Check out the properties in the next screen shot and compare that to the properties on the SharePoint page in the same screen shot.

Conclusion

I hope you are enjoying this series and continue to tune in as the solution for our dad’s beloved patch collection grows. I constantly see updates and upgrades to the Power Platform so I know Microsoft is working hard on making it even better.

Part One: Identify, Define, Build, Migrate

My dad passed away in 2015, leaving behind an extensive collection of fire trucks, patches, and other fire department (FD) memorabilia. Before he passed, he gave us instructions to sell them and some direction on what to do with the money. After a few years of not really wanting to deal with it, my family decided to make a project out of it. My mom, sister, wife, two daughters, and I are working our way through thousands of patches, hundreds of fire trucks, and who knows how many pendants and other trinket like items, all while working full-time jobs (school for the kiddos) and from different locations.

Dad was great about logging his patches into a Microsoft Access database, but not so good about taking pictures of them, and even worse at logging his fire trucks and other items. The objective and high-level steps for this project were quickly identified.

Sister – Low-to-Medium – Arguably more advanced than mom, works on a Mac. Enough said.

Wife – Medium – Works around Excel with ease, understands what I do from a high level.

Kids – Low-to-Medium – two daughters, ages 12 and 10. Both are geniuses on any touch device but have no clue how to work around Excel.

Me – High – developer and technology enthusiast!

I’ve spent the better part of my career as a .NET developer working in SharePoint and Dynamics, among other things, so it was easy for me to decide on a path forward. Let’s get rolling!

Configure Data Schema and Migrate Microsoft Access Data

Just so no one thinks I’m lying here for the sake of this blog, let’s see what my dad was working with back in the day. Yes, he was an ND alum.

Side note: You see that column named “Patch Locator” highlighted in that last screen shot? My dad kept his patches in old-school photo albums that he then stored in boxes. This ‘locator’ field was his way of finding the patch once a box was full and stored away. Genius dad!

As you can see, defining the schema for patches was pretty much done. If we run into anything along the way, we can certainly add it.

In Dynamics I created an unmanaged solution named “Fire Department Items Solution” and added two custom entities, “Patch” and “Fire Truck.”

I added all the fields my dad had in his Access database, and then I made sure that the out-of-the-box field “EntityImage” was available for displaying an image of the patch.

PRO TIP: Dynamics 365 only allows you to have one image field on an entity and it is not configured out of the box. To use this field, create a new field on your entity and use the data type “Image”. This will automatically set the name of your field to “EntityImage” and the image you set there will be used as your entity image at the top of the entity form.

Before we save and publish, we need to enable Notes functionality for our entities. To do this select the entity from the left pane in the solution explorer, then make sure the “Notes (includes attachments)” checkbox is selected.

PRO TIP: When you save an image to the EntityImage field, it loses a lot of its quality. Because we are using this data for inventory, including creating ads, we don’t want to lose the quality of our images. For this reason, we will use the attachments collection for our entity to capture the actual high-quality image. We will then use Microsoft Flow to take that image and store it as the EntityImage (which will lose quality) but also store the high-quality version in a SharePoint library.

Finally, be sure to publish your customizations.

Migrating the Data

Now it’s time to migrate the data. Since this was such a simple schema, I opted to use the out-of-box data import functionality that Dynamics 365 provides. With that said, however, there are a few different ways to accomplish this migration. For me it was easy to simply export the Microsoft Access database to Excel, then use that file to import into Dynamics 365.

Export your data into an Excel file from Microsoft Access.

In Excel you’ll want to Save a Copy and save it as a CSV file.

Open the Patch View in Dynamics and use the out-of-box Import from Excel functionality to load our data.

Choose the CSV file we just created when we saved the copy in Excel.

On this next screen, let’s click the button to Review our Field Mappings.

Here you’ll see some of my fields are mapped and some aren’t. Let’s get those shored up before we proceed.

Now that I’ve resolved all the field mappings, you’ll see we have green check marks across the board and we’re ready to import. Click the Finish Import button and you’re off.

You can check out the progress of the import by navigating to Settings > Data Management.

Summary & Next Steps

Let’s look at what we’ve done here. On the surface it would appear we’ve simply gone into Dynamics 365 and configured a couple of entities. But as we know, Dynamics 365 v9 was built on the Common Data Service (CDS) and that means our Dynamics data is now available to any other application that can connect to the CDS. Why is this important for this project you might ask? That answer will become clear in the next part of this blog. For now, here are some screen shots on how things look now that we have our patch data migrated.

Keep in mind, come end of January 2019 everyone will need to switch over to Microsoft’s Unified Interface and that’s what we’re using here for our patches. This is an example of a model-driven PowerApp which we’ll discuss in our next entry to this blog.

If you log in to your PowerApps environment using the same credentials as your Dynamics 365 environment, you should see your entities and the data migrated in this environment too. Remember, once it’s in Dynamics, it’s available through the CDS.

One thing to note, if you have 10,000-plus records like I do for patches, CDS in the browser may freeze trying to display them all. I would hope MS resolves this at some point so that it handles paging and displaying of data as gracefully as the D365 web client does.

Stay tuned for my next entry where we’ll set up our SharePoint Online site, create a simple canvas PowerApp for inventory management on our mobile devices, and then set up a Flow to help move some things around and automate the creation of our online advertisements.

2017 was another great year overall here at AIS, and also marked the fifth anniversary of our blog! We hope you enjoyed reading and found our posts helpful and interesting. We’re all pretty passionate about what we do here, and look forward to sharing more thoughts, insights and solutions in 2018 and beyond!

As we close out the year, here are the top 10 most read and shared blog posts of 2017:

My decision to join AIS six years ago was a revelation. After almost seven years spent working as an embedded IT analyst for various government customers, I joined AIS to support a customer who was implementing SharePoint. I soaked up everything I could about this (at the time) brave new world of SharePoint. I loved it.

SharePoint 2003 had been available for use in my previous office where I had initially set up out-of-the-box team sites for working groups to support a large department-wide initiative. I found it empowering to quickly set up sites, lists and libraries without any fuss (or custom coding) to get people working together. Working with my new team, I gained insight into what we could do with this tool in terms of workflow, integration and branding. It got even better when we migrated to SharePoint 2007. We made great strides in consolidating our websites and communicating to those who were interested exactly what the tools could do in terms of collaboration and knowledge management.

This ability for a power user to quickly create a variety of new capabilities exposed a deeper customer need – easier communications with IT. While we had all this great expertise and firepower to create and maintain IT tools and services, our core customer base did not have an easy way to quickly and reliably communicate their needs in a manner that matched their high operational tempo. It was a problem. We needed a way for our customers to quickly and easily communicate with us in order to really hear what they needed to meet their mission goals and work more effectively.

The mission was critical, and the task complex: ushering a print publication like Rolling Stone, a Bondi publication, into the digital age by providing them with a turnkey solution to present their print magazine archives online, for viewing on high-resolution connected devices of all shapes and sizes. Unlike other digital versions of magazines on the market, the new platform would allow the publisher to monetize its own unique brand through the years.

The Challenge

AIS needed to address multiple technical areas to provide the most viable solution for Rolling Stone and other archives.

Speed

Scale

Security

Continuous Delivery

High-Definition Presentation

Rolling Stone’s existing solution (‘print to digital’) only supported 100 issues, and AIS was challenged to multiply the output tenfold while reducing management costs and allowing Rolling Stone the freedom to design their interface in a way that matched and enhanced their other digital presences.

Solution

AIS implemented a multi-tier solution in the creation of the Bondi Archive Platform. The platform we built consists of a flexible, scalable website architecture using HTML5, JavaScript, ASP.NET MVC and SQL Azure. The system allows viewers to see exact replicas of the original print issues, something not offered by any other platform. Users can navigate using a mouse, keyboard or touch and can zoom in or conduct complex searches. Bookmarking inside the archive is available as is print, based on the publisher’s preference.

Results

The new archive platform has allowed Rolling Stone to join the online revolution and bring their print content online in its original format and context, thereby retaining copyrights. By serving the content directly from their own website or from the cloud, the company can avoid content restrictions and fees imposed by third-party aggregation platforms and app stores. The project and platform have been an unequivocal success. Rolling Stone chose to enhance its print subscriptions by offering the full digital archive at no extra charge. By tightly integrating their online content with the digital archive, a deeper level of interaction with readership has been realized.

I vividly remember the iconic scene from the 1995 box office hit Apollo 13 where a team of NASA engineers gathered around a table with a collection of mishmash spaceship junk. From this collection, the team had to create a square air filter to fit in a round receptacle so that the astronauts would not asphyxiate on CO2 in space. It’s an intense, life-or-death scenario of literally making a square peg fit in a round hole, where “failure is not an option.”

Working as a business analyst for our federal government clients means that budget, time, and resource constraints almost always play a major role in any development effort. This challenge requires our team to use a bit of ingenuity and a mixed bag of tools to create a solution for our customers.

Workflow in SharePoint 2013 has undergone quite the architectural change from its SharePoint 2010 ancestor. I documented many of the major changes in a previous blog post, “What Changed in SharePoint 2013 Workflow? Pretty Much Everything.” While SharePoint 2013 is backwards-compatible with SharePoint 2010 workflows, you may decide that the benefits of the new design are needed. The purpose of this post is to illustrate the new considerations you’ll need to keep in mind when targeting SharePoint 2013 workflows. The SharePoint 2010 project we’ll use for this example is the one from my very first AIS blog post, “Developing Multi-Tiered Solutions for SharePoint.”
Figure 1 - Sample WebAPI methods for Section Document Merge and Post-Merge Actions.

In our example project there are actually two workflows, SectionDocumentApprovalState (SDAS) and MasterDocumentApproval (MDA). The MDA checks if the various SDAS-related sections have been merged and finalized, then notifies specific users for approval of the final document. An instance of SDAS is created for each section generated from the Master Document, and it monitors the editing and approval of that specific section. We’ll focus on just the SDAS workflow. In the previous post, I referred to the workflows as being part of the Presentation Layer and the custom code called into the Business Layer. Both of these layers will change in a SharePoint 2013 workflow solution.

Our work with Rolling Stone and Bondi Digital Publishing is yet another example of how AIS can develop technology that creates new revenue streams for publishers. We built a digital distribution platform to usher print publications like Rolling Stone into the digital age – by providing them with a turnkey solution to deploy print magazine archives online for viewing on desktops, laptops and mobile devices. For Rolling Stone, the initial launch included more than 1,000 issues from 1967 to the present.

In this blog I’ll discuss some post-release reporting issues that we faced for one of our projects and the solutions we implemented. On the technology side, we had SQL Server 2008 R2 and an MVC 4.0 application (both hosted in Amazon Web Services) in our production environment.

The Problem

The post-production release reporting system was not responding as per user expectations. For most of the high-volume reports (50K to 200K rows in the report output), we were getting request timeout errors. The client SLA for response time was two minutes; hence any report (big or small) had to return data within two minutes. All the reports were designed using SQL Server Reporting Services 2008 R2. In all, there were close to 40 reports with such timeout issues.

I recently completed a large document management system on SharePoint 2010 that used FAST Search and claims-based authentication. The client wanted to secure and limit access to customer-specific documents based on data coming from their CRM system.

We decided to implement a custom claim provider that would query the CRM system at login for customer claims based on the user ID. On upload (based on the customer that was assigned to the document), we used the content organizer to route the document to the correct site, library and folder based on the organization and security rules that we had. Each library had a claim for the customer assigned to it so only users with that claim could view the documents in the library. We would use search for the UI so that the users had a single place to find and view the documents. Sounds simple, right?

It should’ve been.

Unfortunately, the implementation was anything but simple. From the beginning, we hit the core limits of SharePoint 2010, FAST and Claims. Now that we’ve made it to the end, I want to talk about the limits we ran into and the steps you can take in your design to avoid them.