Whether you're using System.debug(someObject); or logging with a home-rolled logger (I do), it's difficult to read the output of some sObjects when too many of the fields look something like "a_field__c=null, another_field__c=null, yet_another_field__c=null..."

Get rid of all that nonsense using aString.replaceAll('\\w+=null','');.
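In Apex that might look like the following (a minimal sketch; the Account and its field values are just for illustration):

```apex
// Log an sObject without the wall of "field=null" noise
Account a = new Account(Name = 'Acme');
String s = String.valueOf(a);  // e.g. "Account:{Name=Acme, Fax=null, ...}"
System.debug(s.replaceAll('\\w+=null', ''));
```

Note the pattern leaves the separating commas behind; if those bother you, a second replaceAll can tidy them up too.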

Wednesday, September 16, 2015

Before computers existed, English professor William Strunk gave sage advice on how to develop software.

"Vigorous writing is concise. A sentence should contain no unnecessary words, a paragraph no unnecessary sentences, for the same reason that a drawing should have no unnecessary lines and a machine no unnecessary parts."

Frankly, it was the "...and a machine no unnecessary parts" that I thought especially relevant.

While preparing a presentation for Dreamforce 15, I was reminded of this advice by a slide recommending that Salesforce developers clean up their change sets and deployments before moving them into production.

Honestly, during the development of even a single method or subroutine, code can accumulate that had meaning early on but, after multiple iterations, has been orphaned or no longer has any effect. It is the developer's responsibility to identify and eliminate unnecessary code and unused variables, just as unused methods should be eliminated from a class (no matter our affection for them). Why should it surprise us, then, that unused classes, pages, components, reports, report folders, flows, images, etc. should also be eliminated before being immortalized in a production system?

"...and a machine no unnecessary parts."

Some integrated development environments (IDEs) have code browsers that are able to identify unreachable code. I'm not yet aware of one for Salesforce development, but if you know of one please share it.

Until then, it is the developer's responsibility to eliminate unused code and other artifacts from their projects and repositories--remembering to pay as much attention to static resources, permission sets, profiles, and other configuration elements as to Visualforce and Apex.

Salesforce haystacks are large enough without the code and components that do nothing getting in the way of maintenance, documentation, code coverage, and ultimately process improvement.

Thursday, September 3, 2015

I've been doing a lot of reading the last week or so, learning how to mix AngularJS with Visualforce. I've watched videos, read articles, read documentation, but none of them were simple. It's as though developers couldn't resist showing off something else, and that something else buried the simplicity of simple JSRemoting calls inside Visualforce.

All I wanted to do was call some already-existing Remote Methods from inside an Angular page, and try to make sure it played nice with the other Angular features, like "promises."

We're going to start with a simple Apex controller with two methods. The first, Divide(), simply divides its first argument by its second and returns the result. As simple as it is, it will be valuable later when we test our Angular Javascript to see how exceptions are handled--all we need to do is pass 0 as the second argument.

The second method, Xyzzy(), simply returns a string. All remote and REST classes should have some simple methods that do very little to simplify testing.
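The controller itself isn't reproduced here, but a reconstruction consistent with the post (the class and method names come from the text; the exact signatures are my guess) might look like:

```apex
global with sharing class TomTestController {

    // Divides a by b; pass b = 0 later to see how remoting surfaces exceptions
    @RemoteAction
    global static Decimal Divide(Decimal a, Decimal b) {
        return a / b;
    }

    // A do-almost-nothing method that keeps testing the plumbing simple
    @RemoteAction
    global static String Xyzzy() {
        return 'Nothing happens.';
    }
}
```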

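A minimal "Hello, world!" page in that spirit might look like this (a sketch; the CDN URL and the app and controller names are my choices):

```html
<apex:page controller="TomTestController" showHeader="false" standardStylesheets="false">
  <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.4.5/angular.min.js"></script>

  <!-- Define an app and give it a controller that does (almost) nothing -->
  <div ng-app="myApp" ng-controller="MyController">Hello, world!</div>

  <script>
    var app = angular.module('myApp', []);
    app.controller('MyController', function($scope) {});
  </script>
</apex:page>
```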
The page above outputs the obligatory "Hello, world!" but functionally does nothing Angular-ish, short of defining an app and giving it a controller. You should make certain the page does very little by inspecting it from your browser to see what's written to the console. Knowing what "nothing" looks like is the first step to recognizing when "something" happens--and knowing whether it was something you intended.

The best thing about the page above is it doesn't include anything that distracts from our purpose. There are no stylesheets to wonder about and no other Javascript libraries you might think are required to get a simple example working.

The next thing we're going to do is add our Divide() method. But before we drop it into the Javascript let's look at what it normally looks like inside our non-Angular Javascript.
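A sketch of the conventional form (assuming the TomTestController from earlier is the page's controller, so JSRemoting exposes it as a global):

```javascript
// Plain JSRemoting, no Angular: call Divide(1, 1) and handle the event status
TomTestController.Divide(1, 1, function(result, event) {
    if (event.status) {
        console.log('Divide returned ' + result);        // event.status === true
    } else {
        console.log('Divide failed: ' + event.message);  // event.status === false
    }
});
```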

This is about as simple as JSRemoting code gets. The browser calls the Divide() method on the TomTestController class, passing the numbers 1 and 1. When the callout finishes, event.status tells us whether it worked (true) or failed (false).

In fact, we can put that call into our Javascript right now and run it to see what happens. Update your page so it contains:
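Something along these lines (the same page sketch as before, with the bare JSRemoting call dropped into the controller as-is):

```html
<apex:page controller="TomTestController" showHeader="false" standardStylesheets="false">
  <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.4.5/angular.min.js"></script>

  <div ng-app="myApp" ng-controller="MyController">Hello, world!</div>

  <script>
    var app = angular.module('myApp', []);
    app.controller('MyController', function($scope) {
      // The bare JSRemoting call, no promises yet
      TomTestController.Divide(1, 1, function(result, event) {
        console.log(event.status ? 'result: ' + result : 'error: ' + event.message);
      });
    });
  </script>
</apex:page>
```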

To make our remote call work with Angular, we need to wrap it inside a function that Angular-izes it with promises, so developers can use the .then().then().catch() style we've been reading so much about.
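A sketch of that wrapper, assuming an Angular module named app as before and injecting Angular's $q service into the controller:

```javascript
app.controller('MyController', function($scope, $q) {
  // Wrap the remote call so callers get a promise instead of a callback
  $scope.divide = function(a, b) {
    var deferred = $q.defer();
    TomTestController.Divide(a, b, function(result, event) {
      if (event.status) {
        deferred.resolve(result);  // success: the result flows to .then()
      } else {
        deferred.reject(event);    // failure: the event flows to .catch()
      }
    });
    return deferred.promise;
  };
});
```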

Our callout is still recognizable, but it has a few new features. Principally, it creates a promise and calls either deferred.resolve() or deferred.reject() depending on the call's success or failure respectively.

Once our function is defined inside Angular's controller we can call it with (1, 1) to see how it works, and how it looks when it works inside the inspector.
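Assuming the wrapper is exposed as $scope.divide, the call might look like:

```javascript
// Call the wrapper and watch the promise chain fire in the inspector
$scope.divide(1, 1)
  .then(function(result) { console.log('1 / 1 = ' + result); })
  .catch(function(event) { console.log('error: ' + event.message); });
```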

But what about errors? What happens when we get an exception? If you haven't already tried it, change the code to Divide(1, 0). What did you get? I got an error warning me, "Visualforce Remoting Exception: Divide by 0" followed by "bad >Object...". When you look at the object sent to the second anonymous function, notice that it's the "event" object passed when the code called deferred.reject(event).

Now that you have JSRemoting working inside Angular with promises, it's a good time to play around with it. Below is my addition of Xyzzy(). But sometime tomorrow I think I'll create a remote for Echo() that simply returns its argument, or maybe a quick [ select ... from something ... limit 10 ]; to see what that looks like.
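The same pattern, applied to Xyzzy() (a sketch, following the divide wrapper; Xyzzy takes no arguments and resolves with whatever string it returns):

```javascript
$scope.xyzzy = function() {
  var deferred = $q.defer();
  TomTestController.Xyzzy(function(result, event) {
    if (event.status) {
      deferred.resolve(result);
    } else {
      deferred.reject(event);
    }
  });
  return deferred.promise;
};
```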

Step 2 - Include your Javascript inside another page

Use <apex:includeScript value="{!$Page.TomTestJS}" /> if the Javascript needs to be loaded at the top of the page and <script src="{!$Page.TomTestJS}"></script> if it needs to be loaded later, perhaps after some content has been rendered to the DOM.

I can think of a few reasons why programmers may want to do this. Coincidentally, they're the reasons I've wanted to do this.

It's easier to track the source code in a repository if the files exist as independent entities and not part of a zip file.

It's easier to see when a specific Javascript was last modified.

It allows the Visualforce preprocessor to resolve merge fields in the Javascript before it's loaded into the browser (for assets that may exist in a static resource or as another page).

It allows what would normally live inside <script /> tags inside a Visualforce page to exist independently, change independently, etc.

There are other reasons, too. Today I had to port some Javascript and HTML from a media team into a sandbox, and the team had taken liberties with their templates and other references that required "fixing" to work inside Salesforce. Moving one of these Javascript files into a page and letting the preprocessor take care of a merge field to resolve the location of a static resource worked like a charm.

Visualforce requires ng-app to have a value to pass its own syntax checker. If a value is passed to ng-app to get past Visualforce then that value is interpreted by Angular as a module used to bootstrap your page and must be defined.
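A sketch of what satisfies both parsers (the module name "noop" comes from the post; the {{ 5 + 5 }} expression is the usual smoke test):

```html
<apex:page showHeader="false" standardStylesheets="false">
  <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.4.5/angular.min.js"></script>

  <!-- Visualforce insists ng-app has a value; "noop" keeps Angular happy too -->
  <div ng-app="noop">{{ 5 + 5 }}</div>

  <script>
    // The module exists solely so Angular can bootstrap with ng-app="noop"
    angular.module('noop', []);
  </script>
</apex:page>
```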

In the example above I created a module called "noop" that literally does nothing but take up space to make something else work.

Now my page behaved just like W3Schools said it should.

Having Googled around some more, I found multiple tutorials and videos introducing the neat things people have done with Visualforce and Angular, but all of them were too complicated for the absolute novice. The search pages did alert me, though, that Salesforce is so geeked about the combination of Angular and Visualforce that they've created an app for the AppExchange that installs Angular, Underscore, Bootstrap, and several other JS libraries. The app is called Angilar MP [sic]. The page gives instructions for installing it into your org and includes some demo pages showing how to put more complicated examples together.

Since the app loads all those Javascript libraries into a static resource, we can re-write our application to look just a tad more Visualforce-like.
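Something like this (the static resource name and the path inside it are assumptions based on the package's contents):

```html
<apex:page showHeader="false" standardStylesheets="false">
  <!-- Angular now comes from the package's static resource instead of a CDN -->
  <apex:includeScript value="{!URLFOR($Resource.AngularJS, 'js/angular.min.js')}" />

  <div ng-app="noop">{{ 5 + 5 }}</div>

  <script>
    angular.module('noop', []);
  </script>
</apex:page>
```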

All it really does is replace the <script src="..."></script> with <apex:includeScript value="..." /> and use the static resource's Angular JS source.

PS If you're not already familiar with it, another of the cool resources included in the package is UnderscoreJS. There are lots of cool Javascript utility functions in there I wish I'd known about years ago. Regardless, they'll make my current pages easier to write.

Tuesday, May 5, 2015

If you've ever had a need to remove a bunch of custom objects, fields, pages, classes, etc. from an org--or from multiple orgs--you've probably come across documentation about destructiveChanges.xml. If you're familiar with developing on the Salesforce platform using MavensMate or Eclipse, you're probably already familiar with package.xml. The two files have nearly identical formats. The difference between them is that package.xml enumerates the items you want to synchronize between your org and your development environment, while destructiveChanges.xml enumerates the items you want to obliterate (or delete) from whatever org you point it at.

The easiest way to see how they're identical is to look at what each of them looks like empty.
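Empty, both files are just a Package manifest; only the filename differs. A sketch (the API version is my guess for the period):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <version>34.0</version>
</Package>
```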

The repo includes a directory named "del" (not very imaginative) and inside it are the files destructiveChanges.xml and package.xml. It seems odd to me, but the migration tool requires both the destructiveChanges.xml AND a package.xml to reside there.

The package.xml file is the same empty version as before. But the template's destructiveChanges.xml contains placeholders--but still basically does nothing.
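A destructiveChanges.xml with a placeholder might look like this (a reconstruction; the member name is illustrative, and asking to delete a class that doesn't exist does nothing, which is the point of a template):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>DoesNotExist</members>
        <name>ApexClass</name>
    </types>
</Package>
```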

Now that we have a directory with both files in it, and we have versions of those files that basically do nothing, let's get ready to run the tool.

There's one more file we need to create that's required by the tool: build.xml. If you're not already using it for deployments, you're likely not using it at all. My version of build.xml is in the parent of del/. You can see it above in the directory listing of xede-sf-template.
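A sketch of such a build.xml (the sf:deploy task and its attributes come from the Force.com Migration Tool; the project name, target name, and environment-variable names are my choices):

```xml
<project name="Remove unwanted metadata" default="undeploy"
         basedir="." xmlns:sf="antlib:com.salesforce">

    <!-- Make OS environment variables available as ${env.*} -->
    <property environment="env" />

    <!-- deployRoot is "del" because build.xml lives in its parent directory -->
    <target name="undeploy">
        <sf:deploy username="${env.SF_USERNAME}"
                   password="${env.SF_PASSWORD}"
                   serverurl="${env.SF_SERVERURL}"
                   deployRoot="del" />
    </target>
</project>
```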

Since build.xml is in the parent directory to del/ the "deployRoot" attribute is "del," the subdirectory.

The environment property (<property environment.../>) allows operating system environment variables to be substituted inside your build.xml. In the example above, the environment variables are about what you'd expect them to be (using the bash shell):
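For example (the variable names are my choices; they just need to match the ${env.*} properties your build.xml references):

```shell
# Credentials the migration tool will read via <property environment="env" />
export SF_USERNAME=me@example.com
export SF_PASSWORD=passwordPlusSecurityToken
export SF_SERVERURL=https://test.salesforce.com
```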

Right about now you may be thinking, "Who wants to set all those environment variables?" Truthfully, I don't. That's why I created a little script to do it for me called "build." But before we get into that let's just edit our build.xml file so it doesn't need environment variables.

Ant (with the migration tool plugin) is telling us it tried removing the Apex class "DoesNotExist," but the class didn't exist. If the class had existed before but had already been removed, this is the message it would display.

As a reader exercise, go ahead and create a class named "DoesNotExist" in your org. I went into Setup->Classes->New and entered "public class DoesNotExist{}". It's about as useless a class as you can create, though I've seen and perhaps written worse.

If you run ant again, you'll see it doesn't report an error.

Tuesday, November 18, 2014

Abstract

Problem

How can community users authenticate to Salesforce via the API without having to give their permission?

Answer

Use the JWT Bearer Token Flow

Disclaimer

I was going to wait a while longer before posting this to make sure it was beautifully formatted and brilliantly written--but that wouldn't have helped anyone trying to solve this problem in the meantime (like I was a few weeks back).

So in the spirit of both this blog's name and agile development, I'm publishing it early, perhaps not often, but hopefully in-time for another developer.

Attributions

Thanks to Jim Rae (@JimRae2009) for suggesting this approach, inspired by his work integrating canvas with node.js on the desktop, and his related Dreamforce 2014 presentation.

Background

A client of ours has an existing, non-Salesforce website with LOTS (tens of thousands) of users. The client also has a Salesforce ServiceCloud instance they use for all their customer support, and they wanted their customers to interact with the CRM through their website, without iframes or exposing the SF portal to their users.

The solution is to use the JWT Bearer Token Flow. Salesforce does not support username/password authorization for community-licensed users, and the other OAuth flows require a browser to intermediate between two domains.

Though the document above does a good job describing the flow, it's a little weak on specifics. Luckily, there's a reference Apex implementation on github (salesforceidentity/jwt), and below I'll provide a reference implementation using cURL.

Configuring your connected app

But before starting, there are a few things to know about your connected app.

Your connected app must be created inside the target Salesforce instance. You cannot re-use the same consumer and client values across orgs unless your app is part of a package.

Your connected app must also use digital signatures. This will require creating a certificate and private key. Openssl commands for doing this appear later in this article.

You must set the "admin approved users are pre-authorized" permitted users option to avoid login errors.

Configuring your community

The community profile must allow access to the Apex classes that implement your REST interface.

Each community user will require the "API Enabled" permission. This cannot be specified at the profile-level.

Creating the certificate

A single openssl command can create your private key and a self-signed certificate.
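A sketch of that command (file names and the -subj values are my choices):

```shell
# One command: a 2048-bit private key plus a self-signed cert valid for a year
openssl req -x509 -sha256 -nodes -days 365 -newkey rsa:2048 \
    -keyout private.pem -out public.crt \
    -subj "/C=US/ST=Michigan/L=Detroit/O=Example/CN=example.com"
```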

Substitute your own values for the -subj parameter; it's a self-signed certificate, so no one will believe you anyway. The benefit of using the -subj parameter is avoiding having to answer the certificate questions interactively.
The file "public.crt" is the certificate to load into your connected app on Salesforce.

Creating a community user

If you already have a community user you can skip to the next section. If you don't, you will need to create one to test with.

Make sure the user creating the community user (from a Contact, or from an Account if person-accounts are enabled) has a role. Salesforce will complain when the Contact is enabled for login if the creating user doesn't have a role.