From BBC Basic to Force.com and beyond…


It's been nearly 9 years since I created my first Salesforce developer account. Back then I was leading a group of architects building on-premise enterprise applications with Java/J2EE and Microsoft .NET. It is fair to say my decision to refocus my career not only on building the first Accounting application in the cloud, but to do so on an emerging and comparatively prescriptive platform, was a risk. Although it has not been an easy ride (leading and innovating rarely is), it is a journey that has inspired me and shaped my perspective on successfully delivering enterprise applications.

Clearly since 2008 things have changed a lot! For me though, it was in 2014, when Lightning struck, that the platform really started to evolve in a significant way. It has continued to evolve at an increasingly rapid pace since: not just the front-end architecture, but the backend and, latterly, the developer tooling as well.

Component and Container Driven UI Development

Decomposing code into smaller reusable units is not exactly new, but it has arguably taken time to find its feet in the browser. By making Lightning Components the heart of their next-generation framework, Salesforce made decomposition and reuse the primary consideration and moved us away from monolithic, page-centric thinking. Components need a place to live! With the increase of usability features in the various Lightning containers, namely Experience, Mobile and Community, we are further encouraged to build once and run everywhere. The Lightning Design System has not only taken the legwork out of creating great UIs, but also brings with it often-forgotten aspects such as keyboard navigation and support for accessibility.

Metadata Driven Solutions

Metadata has been at the heart of Salesforce since the beginning, driving it forward as a low-code or zero-code, high-productivity platform for creating solutions and applying customisations. When Salesforce created Custom Metadata, it enabled developers to also deliver solutions that harness the same strengths that have made the platform so successful, driving productivity up and implementation effort and timescales down.

Event Driven Architecture

Decomposition of processing is key to scalability and resiliency. While we often get frustrated with the governors, especially in an interactive/synchronous context, the reality is that safeguarding the server resources responsible for delivering a responsive UI to the user is critical. Batch Apex, Future and Queueables have long been ways to manage async processing. With Platform Events, Salesforce has delivered a much more open and extensible approach to orchestrating processing within the platform as well as off platform. With a wealth of APIs for developers on and off platform, and tooling integration, EDA is now firmly integrated into the platform. Retry semantics with Platform Events are also a welcome addition to what has previously been left to the developer when utilising the aforementioned technologies.

Industry Standards and Integrations

Salesforce has always been strong in terms of its own APIs, the Enterprise and Partner APIs being the classic go-to APIs, now available in REST form. With External Objects and External Services supporting the OData and Swagger industry standards, off-platform data sources and external APIs can be consumed with much less implementation overhead, and without the user having to leave behind the value of the various platform tools or the latest Lightning user experience.

Open Tools and Source Driven Development

The tooling ecosystem has been a rich tapestry of storytelling and is still emerging. The main focus and desire has been to leverage other industry-standard approaches such as Continuous Integration and Deployment, with varying degrees of success. With SalesforceDX going GA, the first wave of change is now with us: the ability to define, create, manage and destroy development environments at will, along with more APIs and services allowing richer IDE experiences to be built in a more open and IDE-agnostic way. I am very much looking forward to the future of DX, especially the upcoming improvements around packaging.

Hybrid Architectures

Last but not least, many of the above advancements provide more secure, responsive and integrated options for leveraging services and capabilities of other cloud application platforms. Heroku is the natural choice for those of us wanting to stay within the Salesforce ecosystem. With both Heroku Connect and Salesforce Connect (aka External Objects), integrating and synchronising data is now possible with much greater ease and reliability. Platform Events and External Services also provide additional means for developers to connect the two platforms and take advantage of broader languages, libraries and additional compute resources. FinancialForce has open sourced an exciting new library, Orizuru, to assist in integrating Force.com and Heroku, which will be showcased for the first time at Dreamforce.

The above list is certainly not exhaustive when you consider Big Data (Big Objects), Analytics (Einstein/Wave Analytics) and of course AI/ML (Einstein Platform). It's a great time to be heading into my 5th Dreamforce, and I am sure the list will grow even further!

Salesforce are on a mission to make accessing off-platform data and web services as easy as possible. This helps keep the user experience optimal and consistent for the user, and also allows admins to continue to leverage the platform's tools such as Process Builder and Flow, even if the data or logic is not on the platform.

Starting with External Objects, Salesforce added the ability to see and also update data stored in external databases. Once set up, users can manipulate external records without leaving Salesforce, staying within the familiar UIs. With External Services, currently in Beta, this concept has been extended to external API services.

UPDATE: The ASCIIArt Service covered in this blog has since been updated to use the Swagger schema standard. However this blog is still a very useful introduction to External Services. Once you have read it, head on over to this blog!

In this blog let's first focus on the clicks-not-code steps you can repeat in your own org to consume a live ASCII Art web service API I have exposed publicly. The API is simple: it takes a message and returns it in ASCII art format. The following steps result in a working UI to call the API and update a record.

After the clicks-not-code part I will share how the API was built, what's required for compatibility with this feature, and how insanely easy it is to develop Web Services in Heroku using Node.js. So let's dive into External Services!

Building an ASCII Art Converter in Lightning Experience and Flow

The above solution was built with the following configurations / components, all of which are accessible under the LEX Setup menu (required for External Services), and takes around 5 minutes maximum to get up and running.

Named Credential for the URL of the Web Service

External Service for the URL, referencing the Named Credential

Visual Flow to present a UI, call the External Service and update a record

Lightning Record Page customisation to embed the Flow in the UI

I created a Custom Object called Message, but you can easily adapt the following to any object you want; you just need a Rich Text field to store the result in. The only other thing you need to know, of course, is the web service URL.

https://createasciiart.herokuapp.com

Can I use External Services with any Web Service then?

In order for External Services to simplify what developers would normally have to interpret and code manually, Web Service APIs must be documented in a way that External Services can understand. In this Beta release that means the Interagent schema standard (created by Heroku, as it happens). Support for the more broadly adopted Swagger / OpenAPI standard will be added in the Winter release (Safe Harbor).

For my ASCII Art service above, I authored the Interagent schema based on a sample the Salesforce PM for this feature kindly shared; more on this later. When creating the External Service in a moment, we will provide the schema for this service.

https://createasciiart.herokuapp.com/schema
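For orientation, an Interagent schema is essentially JSON Hyper-Schema with service operations described as links. A minimal fragment for a service like this might look as follows; note the path and property names here (message, art, /createasciiart) are illustrative, not necessarily the actual schema served at the URL above:

```json
{
  "$schema": "http://interagent.github.io/interagent-hyper-schema",
  "title": "ASCII Art Service",
  "definitions": {
    "asciiart": {
      "title": "ASCII Art",
      "type": "object",
      "links": [
        {
          "title": "Create",
          "description": "Converts a message into ASCII art",
          "href": "/createasciiart",
          "method": "POST",
          "schema": {
            "type": "object",
            "properties": { "message": { "type": "string" } }
          },
          "targetSchema": {
            "type": "object",
            "properties": { "art": { "type": "string" } }
          }
        }
      ]
    }
  }
}
```

The wizard reads the links array to discover the operations, their parameters and their return types.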

Creating a Named Credential

From the Setup menu search for Named Credential and click New. This is a simple Web Service that requires no authentication. Simply provide the part of the above URL that points to the Web Service endpoint.

Creating the External Service

Now for the magic! Under the Setup menu (only in Lightning Experience) search for Integrations and start the wizard. It's a pretty straightforward process: select the above Named Credential, then tell it the URL for the schema. If that's not exposed by the service you want to use, you can paste a schema in directly (which lets you define a schema yourself if one does not already exist).

Once you have created the External Service you can review the operations it has discovered. Salesforce uses the documentation embedded in the given schema to display a rather pleasing summary, actually.

So what just happened? Well… internally the wizard wrote some Apex code on your behalf and applied the Invocable Method annotations to enable that Apex code to appear in tools like Process Builder (not supported in Beta) and Flow. Pretty cool!

What's more interesting, for those wondering, is that you cannot actually see this Apex code; it's there but somehow magically managed by the platform. Though I've not confirmed it, I would assume it does not require code coverage.

Update: According to the PM, in Winter'18 it will be possible to "see" the generated class from other Apex classes and thus reuse the generated code from Apex as well. Kind of like an API stub generator.

Creating a UI to call the External Service via Flow

This simple Flow prompts the user for a message to convert, calls the External Service and updates a Rich Text field on the record with the response. You will see in the Flow sidebar that the Apex class generated by the External Service appears.

The following screenshots show some of the key steps involved in setting up the Flow and its three steps, including making a Flow variable for the record Id. This is later used when embedding the Flow in Lightning Experience in the next step.

RecordId used by Flow Lightning Component

Assign the message service parameter

Assign the response to variable

Update the Rich Text field

TIP: When setting the ASCII Art service response into the field, I wrapped the value in the HTML elements pre and code to ensure the use of a monospaced font when the Rich Text field displays the value.
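The tip above amounts to one line of string handling; a minimal sketch (the function name is mine, not part of the tool):

```javascript
// Wrap the ASCII art response in <pre><code> before writing it to the
// Rich Text field, so it renders in a monospaced font.
const wrapForRichText = (asciiArt) => '<pre><code>' + asciiArt + '</code></pre>';

console.log(wrapForRichText('Hello')); // → <pre><code>Hello</code></pre>
```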

Creating your own API for use with External Services

Belinda, the PM for this feature, was also kind enough to share the sample code for the example shown at TrailheaDX, on which the service in this blog is based. However, I wanted to build my own version to do something different from the credit example, and also to extend my personal experience with Heroku and Node.js.

The Node.js code for this solution is only 41 lines long. It runs up a web server (using the very easy to use hapi library) and registers a couple of handlers. One handler returns the statically defined schema.json file; the other implements the service itself. As a side note, the joi library is an easy way to add validation to the service parameters.

I decided I wanted to explore the diversity of what's available in the Node.js space through npm. To keep things light I chose to have a bit of fun and quickly found an ASCII art library called figlet. Though I soon discovered that npm had a library for pretty much every other use case I came up with!

Finally, the hand-written Interagent schema is also shown below and is reasonably short and easy to understand for this example. The standard is not all that well documented in layman's terms as far as I can see. See my thoughts on this and the upcoming Swagger support below.

Other Observations and Thoughts…

Error Handling.
You can handle errors from the service in the usual way by using the Fault path from the Flow element. The error shown is not all that pretty, but in fairness there is not really much of a standard to follow here.

Can a Web Service called this way talk back to Salesforce?
Flow provides various system variables, one of which is the Session Id. Thus you could pass this as an argument to your Web Service. Be careful though, as the running user may not have Salesforce API access, and this will be a UI session and thus short-lived. You may want to explore other means to obtain a renewable OAuth token for more advanced uses.
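As a sketch of the idea: if the Flow passes both the Session Id and the org's instance URL as parameters, the service can use the Session Id as a Bearer token against the Salesforce REST API. The helper name, API version and endpoint below are illustrative:

```javascript
// Build request options for calling back into the Salesforce REST API
// using a Session Id passed from Flow as the Bearer token.
const buildSalesforceRequest = (instanceUrl, sessionId, path) => ({
  url: instanceUrl + '/services/data/v40.0' + path,
  headers: { Authorization: 'Bearer ' + sessionId }
});

const req = buildSalesforceRequest(
  'https://na1.salesforce.com',  // illustrative instance URL
  '00Dxx0000000000!FAKE',        // placeholder Session Id from Flow
  '/sobjects/Account'
);
```

The caveats above still apply: the token expires with the UI session and fails outright if the running user lacks API access.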

Web Service Callbacks.
Currently in the Beta the Flow is blocked until the Web Service returns, so it's good practice to make your service short and sweet. Salesforce are planning async support as part of the roadmap, however.

Complex Parameters.
It's unclear at this stage how complex a web service can be supported, given Flow's limitations around Invocable Methods, which this feature depends on.

Summary

Both External Objects and External Services reflect the reality of the continued need for integration tools, and for making this process simpler and thus cheaper. Separate services and data repositories are for now here to stay. I'm really pleased to see Salesforce doing what it does best, making complex things easier for the masses. Or as Einstein would say… “Everything should be made as simple as possible, but no simpler.”

Finally, you can read more about External Objects here and here through Agustina's and latterly Alba's excellent blogs.

Since I created the Apex Metadata API I get to help all sorts of folks building many different and cool things with it. One quite common use case is to automate post-package-installation tasks such as updating layouts or picklist values, which are not auto-upgraded by the platform.

What makes these use cases a little awkward and less user friendly is that the user installing the package has to perform at least one manual post-install step: creating the Remote Site Setting that, ironically, allows the native Apex code to call a native SOAP or REST platform API (see the Idea here to remove this need).

What makes it more tricky is that the URL the package install user has to provide for the Remote Site Setting is quite specialised: it has to contain the org's instance name, the package namespace (if you're using the Apex Metadata API from a Visualforce page) and potentially the org's custom domain name.

As those that follow my blog will know, my Declarative Rollup Summary Tool uses the Apex Metadata API and has the same requirement. Since I have defined the Apex Exception User on my package, I get an email notification each time an uncaught exception occurs in the tool. Despite putting instructions on a Welcome page in the tool to set up the Remote Site Setting, I can see by the frequency of System.CalloutException notifications I receive that this manual post-install step is often initially missed.

Around the same time I decided to raise an enhancement to remind me to look for a solution, or at least a better way to surface this step to the user, I found this question, Accessing Metadata API from JavaScript, on StackExchange, referencing this earlier one, Dynamically set Remote Site in Apex. The infamous Mr. Fox had found the answer: call the Metadata API initially via JavaScript (which is not bound by the Remote Site Setting check). Thankfully, since Summer'13 Salesforce supports making API calls from the Visualforce domains, otherwise this would not have been possible due to the browser's cross-domain checking. Phew!

I had already configured a Welcome Visualforce page on the landing tab of the tool, so I decided to put the code to auto-create the Remote Site Setting there. Note that since Summer'14 you can also configure a post-install page on your package; this would also be a good place. Here is my implementation of the answer given, with a few tweaks, UI and error handling.

You can see that I’ve offered the user two ways, manual and pressing the button. I chose to do this, so that admins would still be aware of what the tool was doing to the configuration in the org, which i feel when using the Metadata API is quite important. Once they press the button it makes the JavaScript callout to the Metadata API and the results are passed back to an apex:actionFunction to refresh the page.

First I wrote some code to test the API connection, which basically made a callout to the Apex Metadata API and caught any exception. I chose to use the listMetadata API call, but it could have been any. It's not the response we are interested in here, but whether it throws an exception or not. Unfortunately there is no language-neutral way of checking for the lack of a Remote Site Setting, so for now the assumption is that any CalloutException is a result of this.

Next I wrote a Visualforce Controller to call this and also calculate the URL needed by the Remote Site Setting. I did this via the Host HTTP Header, as in a subscriber org this will include all of the above attributes.

Finally, the Visualforce page with the JavaScript callout to the Metadata API in it! The code constructs the SOAP XML, makes the call and parses the result for any errors, before calling an apex:actionFunction to refresh the page (only the key parts of the full page are shown below).
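For a feel of the SOAP XML involved, here is a simplified sketch of the envelope construction; element names follow the Metadata API WSDL, only a subset of the RemoteSiteSetting fields is shown, and the real page also handles the response parsing and (depending on API version) the async result checking:

```javascript
// Build a Metadata API SOAP envelope that creates a RemoteSiteSetting.
// sessionId and siteUrl would be supplied by the Visualforce controller.
const buildCreateRemoteSiteEnvelope = (sessionId, siteName, siteUrl) =>
  '<?xml version="1.0" encoding="utf-8"?>' +
  '<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/"' +
  ' xmlns:m="http://soap.sforce.com/2006/04/metadata"' +
  ' xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">' +
    '<env:Header>' +
      '<m:SessionHeader><m:sessionId>' + sessionId + '</m:sessionId></m:SessionHeader>' +
    '</env:Header>' +
    '<env:Body>' +
      '<m:createMetadata>' +
        // xsi:type tells the API which concrete Metadata type follows
        '<m:metadata xsi:type="m:RemoteSiteSetting">' +
          '<m:fullName>' + siteName + '</m:fullName>' +
          '<m:isActive>true</m:isActive>' +
          '<m:url>' + siteUrl + '</m:url>' +
        '</m:metadata>' +
      '</m:createMetadata>' +
    '</env:Body>' +
  '</env:Envelope>';
```

The page then POSTs this envelope to the org's Metadata API endpoint via XMLHttpRequest, which is permitted from the Visualforce domain without a Remote Site Setting.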

It's worth noting that this solution would also apply to configuring a Remote Site for other types of Salesforce API callouts, such as the Apex Tooling API, since it's just the domain part of the URL that needs to be added. Note you may need to add more Remote Sites if you're also planning on calling from a Batch Apex context, for example.

I’m really pleased with this implementation, but it is a little two baked into the rollup tool at present. This would make a great addition to the Apex Metadata API library (or maybe something more standalone) as a Visualforce Component for example, so that all you need to do is include the component on your welcome or post install pages to use it. Something for another day or a fellow open source developer to think about perhaps!?!

Security Review Notes: This approach has not gone through Security Review as yet. My feeling is that it should pass, as there is precedent for calling Salesforce SOAP/REST APIs from JavaScript already, and indeed this was the main reason for Salesforce enabling the API endpoints from a VF domain in Summer'13, as noted above. Nor does this approach bypass the CRUD, FLS or permissions needed by such APIs; e.g. you still need Author Apex to permit Metadata API calls regardless of where they are called from. Finally, the design approach here is user-driven (they have to press a button) rather than automated on page load (which Salesforce generally discourage of course).


It seems like an absolute age since I first blogged about the new features Salesforce have added to the Spring'14 version of the Metadata API. This weekend my development org was finally upgraded and I was able to continue the work to upgrade the Apex wrapper around this API. The work is now complete, and it is easily the hardest and most time-consuming upgrade yet, though ultimately the most worthwhile in terms of the way the API can now be used in Apex. It really feels like it has finally unlocked the keys to the kingdom for Apex developers!

The last blog presented an example of reading and updating a Layout. With this release I've now extended this to all Metadata types. The introduction of the real-time variants of the CRUD operations is a significant step forward in the usability of this library, particularly if you want to update rather than just create things.

Prior to Spring’14 you had to jump through quite a few hoops to read metadata, involving zip file handling in JavaScript and AJAX or Batch Apex code for handling asynchronous responses, as described in this blog. The async API variants still exist and are still supported, so all your code that used them before still works. There is also some reasons to consider still to use them, which i’ll cover later. Lets get into some new examples, all of which are available in MetadataServiceExamples.cls

More possibilities…

Salesforce are doing an amazing job with Metadata API coverage these days. If you can see it in the Setup menu, I'd say there is a good chance you can find it in the Metadata API list of supported Metadata types. Here are a few interesting use cases that you might want to think about…

Copy or merge tools. Save admins a potentially significant number of clicks by writing tools to copy or merge Setup configuration.

Setup Wizards. Script the creation of more complex Setup config, such as Force.com Sites, or template the growing set of org Settings.

Field usage reporting. Read layouts, validation rules and workflows, and search for field references.

Writing your own Change Sets. You could write your own scripts to move specific metadata and/or sync things like picklists between orgs.

Post install upgrades. A configuration UI to perform post-install upgrades to non-upgradable components, such as picklists and layouts in the subscriber org.

NOTE: One thing still not possible via the above CRUD methods is the creation of Apex Classes or Apex Triggers; the only way (including via the Tooling API) of doing this in a Production org is still the Metadata API deploy operation, thus the sample code for this in the library is still valid.

Upgrading to the new MetadataService.cls

For the first time since I've been refreshing the Apex wrapper MetadataService.cls, some of the metadata type classes and method signatures have changed. This is due to Salesforce changing the Metadata API of course, which they can do because they version at the endpoint, not at the WSDL level. I'm open to feedback on ways to avoid this in the future, such as putting the version on the end of the class name, e.g. MetadataService30. The changes I have observed will require small changes to your use of the deploy method…

MetadataService.checkDeployStatus now takes a boolean argument, called includeDetails; read more here.

The MetadataService.DeployResult class has a new details property replacing messages; read more here.

NOTE: The deploy example has been updated in the library with the above changes.

Scripting the Release Upgrade Process

I first described in my previous blog that I have invested in developing a patcher script, a piece of Apex code I have written that parses the default Apex code output from the 'Generate Apex from WSDL' button and automatically applies the usual changes I have been making manually to date to get things to work, plus some new changes to support the new readMetadata operation. I have included the patcher code in the library if you want to take a look at it, and have updated the steps in the readme to create your own MetadataService.cls should you want to (in the event I don't update it in the future, for example). You do not need to worry about this if you plan to take the pre-patched MetadataService.cls from the repository; the above is really just an FYI.

Unsupported Metadata Types

The following Metadata types are not supported, as they utilise some specific features the above patcher script does not support in relation to the use of shared Metadata type base classes. I'm confident these can be supported in the future and will likely be working on them as part of the next release. I'll be giving priority to those that have been the subject of recent questions, such as Flow and Sharing Rules.

Flow

Sharing Rules

Role

Custom Shortcut

Email Folder

Document Folder

Summary

Since its creation, I have been getting a steady flow of questions and requests for this library (evidenced by the large examples class these days), so I can see there is demand for it in general. Personally, I'd still like to see it as a native library without the need to make the callouts, but until that time comes I'll continue to support the community. Please do let me know what you've been getting up to with it and join in creating great community tools to help admins and developers be more productive!


The upcoming Spring’14 Metadata API brings with it some great new architectural features.

Ability to create, update and delete immediately, without having to call back for the result

Ability to read Metadata without having to use retrieve and unzip the response!

Ability to rename Metadata components

Plus a host of new Metadata types and extensions to existing ones

The most exciting for me are the first two, starting with the ability to greatly simplify the use of the API, particularly in Apex. In a previous blog post I covered the requirement to poll for the results of certain API operations via the checkSync and checkDeploy methods, which involved the use of either Batch Apex or apex:actionPoller. Many posters on this blog and the GitHub repo found this awkward to code.

These original operations still exist and I'll continue to support them, but we now have some much easier variants that simply return the result immediately. Here is an example showing the new readMetadata operation in action!

With a true read facility in the API, imagine the possibilities with Layouts alone, such as merging layouts, or dynamically adding or removing buttons, fields and sections from a selection of layouts the user selects. If you're developing a packaged application, it provides a means to upgrade your users' layouts and picklists automatically!

No Batch Apex, no apex:actionPoller to be seen! It's still early days and the official docs have not been announced, but from how the readMetadata operation is designed, I'd expect you're going to be able to retrieve other types as well, such as ListView, Reports etc. Suddenly creating Setup wizards has become much easier, as well as custom Setup reporting tools, for example retrieving all the layouts and dumping the fields used in a Custom Object for reporting!

No More Zip File Handling!

The other really exciting thing here is we finally don't need the help of the JSZip JavaScript library to extract Metadata information from a retrieve operation call. While I'm still quite pleased with how I managed to abstract this, I'm very pleased that we can now read Metadata in non-Visualforce contexts, such as Batch Apex or Apex Triggers.

Automating the Changes to the Generated Code

As always, the code generated by the platform via the WSDL2Apex 'Generate Apex from WSDL' button didn't quite generate the Apex needed. Each time I refresh the Apex wrapper I have had to manually apply the changes to get it working again. This time I invested some time to write a script, itself written in Apex, that will parse the generated code and patch it with the required changes. It uses a custom Iterator to read the lines of code and the Tooling API to update the Apex class when it's done.
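The core idea of the patcher, reading the generated class line by line and applying a table of textual substitutions, can be sketched briefly (here in JavaScript for compactness; the real script is Apex and uses the Tooling API, and the pattern shown is illustrative rather than one of the actual fixes):

```javascript
// Table of textual patches to apply to each line of the generated code.
// The example pattern is hypothetical, for illustration only.
const patches = [
  { find: /\btrigger\b/g, replace: 'trigger_x' }
];

// Apply every patch to every line of the generated source.
const patchGeneratedCode = (source) =>
  source
    .split('\n')
    .map((line) => patches.reduce((l, p) => l.replace(p.find, p.replace), line))
    .join('\n');
```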

Working around xsi:type

Once again, the use of xsd:extension in the Metadata API proved to be a challenge in getting the readMetadata operation to work, since, as I found before, this XML Schema construct is not readily supported by WSDL2Apex or the Apex XML bindings at runtime. The main weakness is the lack of interpretation of the xsi:type attribute, which identifies which Metadata type (e.g. CustomObject, Layout etc.) the XML represents. As soon as I tried the readMetadata operation it gave an error, as it tried to de-serialise the Layout into the base Metadata type (which only contains fullName).
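To illustrate the shape of the problem, a simplified, hypothetical fragment of a readMetadata response: the concrete type of each record is carried only in the xsi:type attribute, which the generated Apex bindings ignore.

```xml
<readMetadataResponse xmlns="http://soap.sforce.com/2006/04/metadata"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <result>
    <!-- Without honouring xsi:type, this de-serialises into the base
         Metadata type and everything but fullName is lost -->
    <records xsi:type="Layout">
      <fullName>Account-Account Layout</fullName>
      <layoutSections>...</layoutSections>
    </records>
  </result>
</readMetadataResponse>
```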

To get things to work I had to create some new Apex classes representing the responses from the readMetadata operation for each of the Metadata types. Fortunately the new script I developed can be taught to generate these for me! I'll eventually post more on the changes to get the readMetadata operation working in the GitHub repo Readme.

Spring’14 is around the corner!

I’ll formally update the library once Spring’14 has been rolled out to the production servers in February. Until then you can download the MetadataService.cls and MetadataServiceExamples.cls classes from this Gist here. You can try it out in your pre-release orgs if you like. I’ll also post a few more fancy examples as well.


Force.com provides a means to generate Apex classes that allow the calling of a given Web Service as described by a WSDL (Web Service Definition Language). This tool is often referred to as the WSDL2Apex tool. Despite not having any real "logic" in them, these classes also need code coverage in order to be deployed to Production or included in a Package.

While the tests for your Apex code that calls the Web Service indirectly ensure you obtain a certain amount of coverage of these classes, it may not be enough, since you may only use a subset of the types and methods generated. The solution is often to comment out the bits not needed; however this is less than ideal if you plan on regenerating the Apex classes when the Web Service is updated.

This short blog illustrates a way to generically cover the types and methods generated by the WSDL2Apex tool, such that you don't need to modify the generated code and can freely update it as desired, adding or removing types or methods in your test class accordingly. It utilises the UPS Street Address Web Service as per this Stack Exchange question; it requires a small tweak to the WSDL before doing so.

Step 1. Covering Generated Types. Each of the inner classes generated represents a data type from the WSDL (sometimes these are split into separate Apex classes). While they don't have any methods in them, they do have static initialisation code. Constructing each of these classes will execute this logic and generate coverage.

Step 2. Covering Generated Methods. Each of the methods on the Port inner class represents the operations described in the WSDL. Fortunately these methods do not care much about the data flowing in or out of them, which makes it easier to create a generic Web Service mock implementation.

For each of these in the generated class methods, observe the types used on lines 9 and 10.

Create a test method to cover the generated methods, repeating line 6 for each method in the generated code. Note that you don't need to worry about the values being provided to the methods, as the Web Service mock does nothing with them at all. Also note that the test context still limits the number of callouts to 10 per test method, so you may need to split the method calls across two test methods.

Summary. If you want to see a full example of this type of test, check out this test class based on the Salesforce Metadata API Web Service. This approach may not be for everyone, certainly if you are already covering a large portion of the generated code or prefer to just delete / comment out the code you don't need. However, if you're providing some kind of connector library, or you just want to retain the ability to upgrade the Web Service more easily, or you're just determined to keep your 100% code coverage, this might help!