Using MFT Cloud Service to Automate HCM Batch Uploads – Part 1

In this blog, we will use Oracle Managed File Transfer Cloud Service to perform batch loads into Oracle HCM Cloud.

Use cases for this include updating employee data from external systems (e.g. recruitment), uploading employee pictures from a badge-creation system, or syncing data while running a hybrid deployment with PeopleSoft.

About MFT

MFT is Oracle’s strategic platform for batch file transfer across all our SaaS services (and more). It provides capabilities such as zip/unzip, splitting of large files, PGP encryption, retries, and embedded FTP/SFTP services that are HA-clustered, with user credentials that can be stored in your external Identity Management system. All of this is offered through a dashboard interface.

In short, no more messing around with command-line, OS-level scripting tools to manage your file transfers!

MFT also provides a flexible callout extension framework to trigger downstream activities after a file has been transferred. This is important in the case of HCM.

About HCM Data Loading

HCM uses a tool called HCM Data Loader to perform batch loads. This can be invoked either through the user front-end via a function called “Import and Load Data”, or programmatically through the WebCenter Content and SOAP interfaces.
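To illustrate the SOAP route, here is a minimal sketch of the request a caller would send to ask HCM Data Loader to import a file already staged in WebCenter Content. The host name is a placeholder, and the importAndLoadData operation and its namespace are based on the HCM Data Loader integration service; verify the exact WSDL for your release:

```python
# Sketch: build (but do not send) a SOAP request asking HCM Data Loader
# to import the file staged in WebCenter Content under the given ContentId.
# The host and namespace below are assumptions -- check your environment's WSDL.

HDL_ENDPOINT = "https://your-hcm-host/hcmService/HCMDataLoader"  # placeholder

def build_import_request(content_id: str) -> str:
    """Return a SOAP envelope invoking importAndLoadData for content_id."""
    return f"""<soapenv:Envelope
    xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:typ="http://xmlns.oracle.com/apps/hcm/common/dataLoader/core/dataLoaderIntegrationService/types/">
  <soapenv:Body>
    <typ:importAndLoadData>
      <typ:ContentId>{content_id}</typ:ContentId>
    </typ:importAndLoadData>
  </soapenv:Body>
</soapenv:Envelope>"""

envelope = build_import_request("Worker.zip")
print(envelope)
```

In our setup, this is exactly the kind of call the MFT post-processing callout makes on our behalf, so we never construct it by hand.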

Configuring the MFT Source

Select [Sources] from the left menu and create one using the SFTP Embedded type.

Note: in a situation where you are performing an HCM Extract, you would instead use WebCenter as a source, then poll the WebCenter system for new files to retrieve and drop into an external system.

Browse for Available Folders. We are using the /hcmload folder, which has been assigned to the hcmload user. Multiple users can share the same folder, and a single user can be assigned multiple folders.

You can also add a polling schedule to control how often the folder is checked for new files, configure pre-processing actions such as encryption or zipping, and set filters so that only certain files are monitored. After this, select [Deploy].

If you’re interested in the design and registration of the Post-Processing Callout, head over to Part 2 of this series. In the meantime, you can grab this sample code, developed by my esteemed colleague Pandurang Parwatikar, for reference.

When a file is uploaded using MFT’s WebCenter method, the name of the file becomes its ContentID. This is automatically picked up from the environment parameters by the callout.

In addition, we would like to upload photos for those employees. Place them in a subfolder of the zip called /BlobFiles, with the file names of those pictures referenced in the PersonImage fields of Worker.dat.
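The layout described above can be assembled programmatically. A minimal sketch follows; the .dat lines are abridged placeholders rather than a complete Worker.dat, which follows the full HCM Data Loader file format:

```python
import io
import zipfile

# Sketch: assemble an HCM Data Loader zip with Worker.dat at the root and
# the referenced employee photos under the BlobFiles subfolder. The .dat
# content here is an abridged placeholder, not a complete Worker.dat.
worker_dat = (
    "MERGE|Worker|...|TEST01|...\n"           # worker record (abridged)
    "MERGE|PersonImage|...|TEST01.png|...\n"  # photo referenced by file name
)

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("Worker.dat", worker_dat)
    # Photos named in the PersonImage records go in the BlobFiles/ subfolder
    zf.writestr("BlobFiles/TEST01.png", b"\x89PNG placeholder")

names = zipfile.ZipFile(buf).namelist()
print(names)  # ['Worker.dat', 'BlobFiles/TEST01.png']
```

The key point is simply that the photo file name inside BlobFiles must match what the PersonImage record in Worker.dat refers to.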

For ease of Person Search later on, modify all the “TEST01” values to something that suits you.
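That substitution can be scripted as well; a trivial sketch, where “EMP042” is an arbitrary example value rather than a required format:

```python
# Sketch: swap the sample "TEST01" identifier for your own value before
# zipping the load file. "EMP042" is an arbitrary example.
worker_dat = (
    "MERGE|Worker|...|TEST01|...\n"
    "MERGE|PersonImage|...|TEST01.png|...\n"
)
customised = worker_dat.replace("TEST01", "EMP042")
print(customised)
```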

You can go to [My Workforce] -> [Person Management], or search for “Person Management”, to make sure the employees don’t already exist.

Performing the Transfer

Now let’s fire up an SFTP client to perform the transfer.

You will notice that the moment the file drops in, it disappears as the MFT server picks it up.

Quickly navigate over to the WebCenter system and perform a search for all content. You will see that the file has been uploaded. It will disappear once the HCM server has processed it.

Navigate to the MFT Dashboard and you will see the transfer in progress.

Click on [Transfer Instances] -> [SFTP to HCM Load] -> [Instances] tab. This shows all transfers that have occurred; in this case we only have one.

Click on the “Id” to look at the details. You can see a successful visual flow.

Click on the delivery target for more details. You can see further information about the file that was uploaded and the endpoint that was triggered.

Note: in our sample we are doing a fire-and-forget, so we are not polling the HCM system to check whether the file has been imported successfully, only that the trigger succeeded. We can extend this code to add a further check, or, for complex orchestrations, trigger a Post-Processing call to Integration Cloud Service / SOA Cloud Service instead of calling the HCM Data Loader directly.
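If you did want more than fire-and-forget, the callout (or an ICS/SOACS flow) could poll the load status until it reaches a terminal state before declaring success. Here is a sketch of that loop with a stubbed status check; the stub stands in for a real call to the HCM Data Loader status web service, and the status values shown are illustrative:

```python
import time

def poll_load_status(content_id, fetch_status, attempts=10, delay=30):
    """Poll until the data set reaches a terminal state or we give up.

    fetch_status is a callable standing in for the real SOAP status call
    to the HCM Data Loader service -- verify the operation name and the
    status values against your environment's WSDL.
    """
    terminal = {"COMPLETED", "ERROR", "CANCELLED"}  # illustrative states
    for _ in range(attempts):
        status = fetch_status(content_id)
        if status in terminal:
            return status
        time.sleep(delay)
    return "TIMED_OUT"

# Stub: pretend the load finishes on the third check (delay=0 for the demo).
responses = iter(["IN_PROGRESS", "IN_PROGRESS", "COMPLETED"])
result = poll_load_status("Worker.zip", lambda cid: next(responses), delay=0)
print(result)  # COMPLETED
```

In an ICS/SOACS orchestration the same loop would typically be modelled as a wait/retry pattern in the integration flow rather than in code.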

Now let’s navigate over to the HCM system to see what happened. Go to [Data Exchange] -> [Import and Load Data].

You can see that the process has run successfully with no messages and that 4 objects have been updated.

Moving back to WebCenter, you will also notice that the file has been consumed.

Note: if the job had not been successful, the file would sit in WebCenter and the load could be run again manually from [Import and Load Data] by selecting [Import File]. A list of available files from WebCenter will be displayed.

Note also: if you attempt to re-upload a file with the same name, it will fail because the file already exists in WebCenter, until you purge it!

Synchronizing and Indexing Person Data

Now, at this point the records have been inserted into the HCM system. However, they will not show up yet: two more functions have to be called before we can see them.

In this blog, we are performing this manually. Typically, we would have this configured in HCM as a scheduled job that updates person info periodically.

We could utilise SOA Cloud Service or Integration Cloud Service to orchestrate this post-processing callout. Simply reuse the same code base provided above and connect it to an ICS or SOACS SOAP service.

Navigate to [Tools] -> [Schedule Processes].

Click on [Schedule New Process] and search for [Synchronize Person Records].

Run the “Synchronize Person Records” process with the following parameters:

From Date = [Enter the earliest date for which the load could have been run]

RedThunder.blog and contributors. All Rights Reserved. The views expressed in this blog are our own and do not necessarily reflect the views of Oracle Corporation. All content is provided on an ‘as is’ basis, without warranties or conditions of any kind, either express or implied, including, without limitation, any warranties or conditions of title, non-infringement, merchantability, or fitness for a particular purpose. You are solely responsible for determining the appropriateness of using or redistributing this content, and you assume any associated risks.
