Latest

Hey guys,
Long time no blog! Sorry about that, I've been kind of busy and honestly haven't had too many interesting tidbits to share. However, I think I have something kind of neat to show you. I had a project recently where the user wanted to be able to create a custom SOQL query and export the results as a CSV file. I don't know why they didn't want to use regular reports and exports (my guess is they figured the query might be too complex or something), but it sounded fun to write, so I didn't argue.

Breaking this requirement down into its individual parts revealed the challenges I'd have to figure out solutions for:
1) Allow a user to create a custom SOQL query through the standard interface
2) Extract and iterate over the fields queried for to create the column headings
3) Properly format the query results as a CSV file
4) Provide the proper MIME type from the visualforce page to prompt the browser to download the generated file

As it turns out, most of this was pretty easy. I decided to create a custom object called ‘SOQL_Query_Export__c’ where a user could create a record then specify the object to query against, the fields to get, the where condition, order by and limit statements. This would allow for many different queries to be easily created and saved, or shared between orgs. Obviously the user would have to know how to write SOQL in the first place, but in this requirement that seemed alright. The benefit as well is that an admin could pre-write a query, then users could just run it whenever.

With my data model/object created now I set about writing the apex controller. I’ll post it, and explain it after.

public class SOQL_Export
{
    public SOQL_Query_Export__c exporter {get; set;}
    public list<sobject> queryResults {get; set;}
    public list<string> queryFields {get; set;}
    public string queryString {get; set;}
    public string fileName {get; set;}

    public SOQL_Export(ApexPages.StandardController controller)
    {
        //Because the fields of the exporter object are not referenced on the visualforce page we need to explicitly tell the controller
        //to include them. Instead of hard coding in the names of the fields I want to reference, I simply describe the exporter object
        //and use the keyset of the fieldMap to include all the existing fields of the exporter object.

        //describe object
        Map<String, Schema.SObjectField> fieldMap = Schema.SOQL_Query_Export__c.sObjectType.getDescribe().fields.getMap();

        //create list of fields from fields map
        list<string> fields = new list<string>(fieldMap.keySet());

        //add fields to controller
        if (!Test.isRunningTest())
        {
            controller.addFields(fields);
        }

        //get the controller value
        exporter = (SOQL_Query_Export__c) controller.getRecord();

        //create a filename for this exported file
        fileName = exporter.name + ' ' + string.valueOf(dateTime.now());

        //get the proper SOQL order direction from the order direction on the exporter object (Ascending = asc, Descending = desc)
        string orderDirection = exporter.Order_Direction__c == 'Ascending' ? 'asc' : 'desc';

        //create a list of fields from the comma separated list the user entered in the config object
        queryFields = exporter.fields__c.split(',');

        //create the query string using string appending and some ternary logic
        queryString = 'select ' + exporter.fields__c + ' from ' + exporter.object_name__c;
        queryString += exporter.where_condition__c != null ? ' where ' + exporter.where_condition__c : '';
        queryString += exporter.Order_by__c != null ? ' order by ' + exporter.Order_by__c + ' ' + orderDirection : '';
        queryString += exporter.Limit__c != null ? ' limit ' + string.valueOf(exporter.Limit__c) : ' limit 10000';

        //run the query
        queryResults = database.query(queryString);
    }

    //creates and returns a newline character for the CSV export. Seems kind of hacky I know, but there does not seem to be a better
    //way to generate a newline character within visualforce itself.
    public static String getNewLine()
    {
        return '\n';
    }
}

Because I was going to use the SOQL_Query_Export__c object as the standard controller, my apex class would be an extension. This meant using the controller.addFields method (fields not explicitly added by the addFields method or referenced in the visualforce page are not available on the record passed into the controller; so if I had attempted to reference SOQL_Query_Export__c.Name without putting it in my addFields call or referencing it on the invoking page, it would not be available). Since my visualforce page was only going to be outputting CSV content, I had to manually add the fields I wanted to reference. Instead of hard coding that list, I decided to make it dynamic. I did this by describing the SOQL_Query_Export__c object and passing the fields.getMap() keyset to the controller.addFields method. Also, just as something to know: test classes cannot use the addFields method, so wrap that part in an if statement.

Next it’s just simple work of constructing a filename for the generated file, splitting the fields (so I can get an array I can loop over to generate the column headers for the CSV file). Then it’s just generating the actual query string. I used some ternary statements since things like order by and limit are not really required. I did include a hard limit of 10000 records if one isn’t specified since that is the largest a read only collection of sobjects can be. Finally we just run the query. That last method in the class is used by the visualforce page to generate proper CSV line breaks (since you can’t do it within the page itself. Weird I know).
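The page markup itself didn't survive in this post, but based on the description that follows, a minimal sketch of it could look something like this (attribute and property names match the controller above; the exact markup is an assumption, and it's deliberately run together so no stray whitespace ends up in the CSV):

```xml
<apex:page standardController="SOQL_Query_Export__c" extensions="SOQL_Export" readOnly="true" cache="true" showHeader="false" sidebar="false" standardStylesheets="false" contentType="application/octet-stream#{!fileName}.csv"><apex:repeat value="{!queryFields}" var="field">{!field},</apex:repeat>{!newLine}<apex:repeat value="{!queryResults}" var="record"><apex:repeat value="{!queryFields}" var="field">{!record[field]},</apex:repeat>{!newLine}</apex:repeat></apex:page>
```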

I know the code looks kind of run together. That is on purpose, to prevent unwanted line breaks and such in the generated CSV file. Anyway, the first line sets up the page itself, obviously: it removes the stylesheets, header, and footer, and turns on caching. Now, there are two reasonably important things here. The readOnly attribute allows a visualforce collection to hold 10000 records instead of only 1000, which is very useful for a query exporter. The second is the contentType="application/octet-stream#{!fileName}.csv" part. That tells the browser to treat the generated content as a CSV file, which in most browsers should prompt a download. You can also see that the filename is an Apex property that was generated by the class.

With the page setup, now we just need to construct the actual CSV values. To create the headers of the file, we simply iterate over that list of fields we split in the controller, putting a comma after each one (according to CSV spec trailing commas are not a problem so I didn’t worry about them). You can see I also invoke the {!newLine} method to create a proper CSV style newline after the header row. If anyone knows of a way to generate a newline character in pure visualforce I’d love to hear it, because I couldn’t find a way.

Lastly we iterate over the query results. For each record in the query, we then iterate over each field. Using bracket notation we can get the field value from the record dynamically. Again we create a newline at the end of each record. After this, on the SOQL Export object I simply created a button that invoked this page, passing in the record ID. The newly opened window would provide the download and the user could then close it (I'm experimenting with ways to automatically close the window once the download is done, but it's a low priority and any solution would be rather hacky).

There you have it, a simple SOQL query export tool. I have this packaged up, but I'm not 100% sure I can give that URL away right now. I'll update this entry if it turns out I'm allowed to share it. Anyway, hope this helps someone, or if nothing else shows a couple of neat techniques you might be able to use.

Just a little quick fix post here, about a silly little bug that took me a bit of time to hunt down (probably just because I hadn't had enough coffee yet). Anyway, the error happens when trying to merge two accounts together: I was getting the error 'entity is deleted'. The only thing that made my code any different from other examples was that the account I was trying to merge was selected by picking it from a lookup on the master. The basic code looked like this (masterAccount was being set by the constructor for the class, so it is already set up properly).
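The original snippet isn't in the post, but a minimal sketch of the pattern, assuming Merge_With__c is a lookup field on Account pointing at the record the user picked to merge in, would be:

```apex
// masterAccount is set up elsewhere (e.g. by the class constructor); its
// Merge_With__c lookup points at the account the user picked to merge in
Account accountToMerge = [select Id from Account where Id = :masterAccount.Merge_With__c];

// this throws 'entity is deleted': when the merge saves masterAccount, its
// Merge_With__c field still references the account being merged away (deleted)
merge masterAccount accountToMerge;
```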

Can you spot the problem here? Yup: because the Merge_With__c field on the master account would now be referencing an account that doesn't exist (since after a merge the child records get removed), it was throwing that error. So simple once you realize it. Of course, the fix for it is pretty easy as well. Just null out the lookup field before the merge call.
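With the same assumed field name as above, the fix is just one extra line before the merge:

```apex
// null out the lookup first so the master no longer references
// the soon-to-be-deleted record when the merge saves it
masterAccount.Merge_With__c = null;
merge masterAccount accountToMerge;
```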

There you have it. I realize this is probably kind of a ‘duh’ post but it had me stumped for a few minutes, and I’m mostly just trying to get back into the swing of blogging more regularly, so I figured I’d start with something easy. ‘Till next time!

Hey y'all.
Yes, it's Dreamforce. No, I'm not there. No, nobody is going to read this blog. Yes, it contains a ghetto hack. With that said, let's proceed with the madness.

It was a dark and stormy night. There I was, working on this visualforce component, which needed to have a button that would take a user to a new record. That record had to have some information pre-populated, so I used the classic URL hack of putting field IDs and their values into the URL to pass them along. (You know how you can pass in something like ?what_id=0036000001nOfOh on the end of a new task URL and the what_id field will be populated? Well, you can do it with custom fields too by passing in the ID of the field, which isn't the developer name, but a separate ID, much like the ID of any other sObject.) The problem, I then realized, was that since this code was for a package that could be installed in any org, those IDs would change, and would have to be fixed for every install of the package in every org. Some poor schmuck would have to go find the IDs of the fields and update the custom button code in every single org.

Having to modify field IDs in every org this package is installed in… absolutely unacceptable.

Sure, I could make it a little less painful by reading the value from a custom label or setting so the person fixing it wouldn't have to mess with code, but that still sucks. So, knowing that you need the custom field's ID to do the pre-populate hack, I set out looking for a way to retrieve the ID programmatically through apex. As it turns out, this isn't exactly easy. Because Salesforce hates the URL pre-populate hack (hey, we wouldn't use it if we had a better alternative) and there isn't much else you can do with the field's ID, they don't make it available. It's not in any describe information anywhere… except in the tooling API describe calls.

For those unaware, the tooling API is a new-ish API Salesforce released to allow you to do more system admin kinds of things programmatically: create objects and fields, check the status of batch jobs and code coverage test results, etc. It's mostly intended to let you build other cool applications that integrate with Salesforce even more, and maybe even some kind of cool alternative admin interface. But we are going to slap it around and make it give us the field ID we need.

I beat those APIs like they owe me money….. I have no idea what I am talking about anymore.

So anywho, first things first: if we intend to call out to an API, any API, we are going to need a remote site exception. So add your own instance URL as an exception if you haven't already. You know the drill: Security -> Remote Site Settings -> copy your org URL up until right after the .com. Save that. Now time for some code.

public static list<map<string,object>> getFieldMetaData(string fieldName)
{
    list<map<string,object>> results = new list<map<string,object>>();
    fieldName = fieldName.replace('__c','');
    string instanceURL = System.URL.getSalesforceBaseUrl().getHost().remove('-api');

    HttpRequest req = new HttpRequest();
    req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionID());
    req.setHeader('Content-Type', 'application/json');

    String toolingendpoint = 'https://' + instanceURL + '/services/data/v28.0/tooling/';

    //query for custom fields
    toolingendpoint += 'query/?q=Select+id,DeveloperName,FullName+from+CustomField+where+DeveloperName+=+\'' + fieldName + '\'';

    req.setEndpoint(toolingendpoint);
    req.setMethod('GET');

    Http h = new Http();
    HttpResponse res = h.send(req);

    //convert the original data structure into a map of objects. The data we want is in the records property of this object
    map<string,object> reqData = (map<string,object>) json.deserializeUntyped(res.getBody());

    //now create a list of objects from the records property. This serialize/deserialize trick is the only way I know to convert a generic object
    //into something else when the source data is 'salesforce Map encoded' (no quotes around field names or values, parenthesis to denote open and close, etc)
    list<object> fieldData = (list<object>) JSON.deserializeUntyped(JSON.serialize(reqData.get('records')));

    //iterate over each object in the list and create a map of string to object out of it and add it to the list
    for (object thisObj : fieldData)
    {
        map<string,object> thisFieldData = (map<string,object>) json.deserializeUntyped(JSON.serialize(thisObj));
        results.add(thisFieldData);
    }
    return results;
}

Stick this in a utilities class somewhere and give it a shot. Pass in the name of a field, and you should get back the field ID. Make sure it's a custom field though, since that is the table being queried.
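For example, a call might look like the sketch below (the field name is just an illustration, and I'm assuming the record keys come back capitalized the way the REST API normally returns them, i.e. Id and FullName):

```apex
// look up metadata for a hypothetical custom field named Merge_With__c
list<map<string,object>> fieldData = getFieldMetaData('Merge_With__c');

for (map<string,object> thisField : fieldData)
{
    // the Id key should hold the field ID needed for the URL pre-populate hack;
    // FullName includes the object name, useful for filtering in your own code
    system.debug(thisField.get('FullName') + ' => ' + thisField.get('Id'));
}
```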

Now, I'll go ahead and say that this method isn't exactly the cleanest thing in the world. There is a lot of serializing and deserializing going on, and I'm not really wild about it. However, in the spirit of keeping this call lightweight and easy to use (no additional classes for the JSON deserializing are required), it's a trade off that I am okay with. This way you can easily add fields to your query if you want, and they'll just show up in the resulting object. Also, the semi-complex nature of the returned JSON makes creating the proper Apex class for deserializing a bit more annoying, since you can't have nested inner classes (without the whole thing having its own file, which I hate because it feels like it clutters up my org with single-use class files). ANYWAY, the point is that you get back a nice data object that has the fields as its keys and their corresponding values. Because the tooling API doesn't seem to let you filter by sObject type when querying for custom fields, you'll have to do that part yourself. The FullName field does have the name of the object, but you can't run a query filter on it, so I figured I'd leave that part as an exercise for the reader: simply iterate over the resulting list, evaluate the FullName field to see if it contains the name of the object you want, and then read the field ID.

So now, with the field ID gettable, you can put it into the URL hacks on your visualforce pages. Simply call the method and use the resulting IDs to build your URL. Honestly, I'd recommend storing the IDs in a custom setting or something so you aren't waiting on an HTTP request and all this JSON parsing every time you want to construct the URL, but that's just me. Anyway, hope you guys like this and find it useful. Till next time.

-Kenji

(Also, I'd love to see a more efficient version of my function with less serializing and deserializing, if you think you can make it better.)

Kind of a quick yet cool post for you today. Have you ever wanted to iterate over the properties of a custom class/object? Maybe you wanted to read out all the values, or for some other reason (such as serializing the object, perhaps) wanted to figure out what properties an object contained but couldn't find a way? We all know Apex has come a long way, but it is still lacking a few core features, reflection being one of them. Recently I had a requirement where I wanted to take an object and serialize it into URL format. I didn't want to have to manually type out every property of the object, since it could change, and I'm lazy like that. Without reflection this seems impossible, but it's not!

Remember that the JSON deserialize methods Apex has are capable of creating an iterable version of an object by casting it into a list or a map, and suddenly this becomes much more viable. Check it out.
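The original snippet didn't make it into this post, but a minimal sketch of the trick (class and property names are just for illustration) would be:

```apex
// a simple custom class we'd like to iterate over
public class Pizza
{
    public string size = 'large';
    public integer toppingCount = 3;
}

Pizza myPizza = new Pizza();

// serialize the object to JSON, then deserialize it back into a generic map;
// now every property name is a key we can loop over
map<string,object> objectProperties = (map<string,object>) JSON.deserializeUntyped(JSON.serialize(myPizza));

for (string propertyName : objectProperties.keySet())
{
    system.debug(propertyName + ' = ' + objectProperties.get(propertyName));
}
```

From there, building a URL-encoded string is just a matter of appending EncodingUtil.urlEncode'd key/value pairs inside that loop.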

There you have it. By simply serializing an object, then deserializing it, we can now iterate over it. Pretty slick, eh? Not perfect, I know, and it doesn't work awesomely for complex objects, but it's better than nothing until Apex introduces some real reflection abilities.

Long time no post! I've been on vacation and in general just being kind of lazy, but today I've got a simple, fun project for us. You see, my girlfriend is always right, well, almost always. Very rarely I'll remember something correctly, but in general she's always correct (and not in the 'haha men are so dumb, women know everything' way; legitimately, she remembers way more stuff than me). This phenomenon has gotten so pervasive that, just for kicks, I wanted to create a live chart running in the house displaying how often either of us was right about stuff (I know I'll regret this eventually). So for my mini project I had a few goals:

1) Have a live chart that updates automatically on a TV in my house (we have an extra TV that we generally just use as a media center/music streaming box via a Chromecast)

2) Make an easy interface to add new data to the chart

3) Make the chart slick looking

4) Keep it simple. This is basically a hobby project so I don’t want to go too nuts.

Please close it when you are done though, my dev org only gets so many HTTP requests per day (note to self, add some kind of global request caching or something).

I was able to complete this project in about an hour and a half and meet all my goals. So now I’ll show you how.

Right off the bat I had a general idea of how I would do this (though the approach did morph a bit). From a previous project I knew it was possible to store and retrieve data in a Google spreadsheet. You can get the raw CSV data by using a special URL, and then import it via an HTTP request from an Apex controller. I figured this was easier than setting up a Salesforce object and creating a custom interface for adding data, and hell, it's cool to be able to utilize Google Forms data for something.

My basic form for collecting data

From there it's just a matter of passing the data to a chart system and making it poll the sheet occasionally. So anyway, first off we are going to need a Google form to collect our data. Head to Google Docs and create a new spreadsheet. Use the forms menu to create a new form for your page. In my case it's just a simple single-question multiple choice (with an 'other' option). Each time the form is submitted it puts the name and a timestamp into a sheet called 'Form Responses 1'. This data format works pretty well. I played around with trying to create another sheet that used queryIf to sum all the times various names appeared in the sheet, but that approach had a limiting factor of only working for names I pre-coded it for. It wasn't dynamic enough. So I decided to just let Google collect the data, and I'd handle the summing and formatting in my code.

Your form should be gathering data in a way that looks something like this

To actually get the data in a usable form for programming, we need a raw CSV version of it. Thankfully Google will provide this for you (though they aren't exactly forthcoming with it). As of this writing, to get the raw CSV of your sheet, go to File and hit Publish. Just publish the one sheet. You should be given a shareable URL with a long, unique-looking ID string. Take that and put it into this URL format

Just replace the word key with your document's unique ID. You should be able to put that URL in your browser, and it should automatically attempt to download your spreadsheet in CSV format. If so, you are in good shape. If not, make sure you published it, and it's shared and all that good stuff. Once you have that working we can move to the next step.

Publish your form results sheet and make note of that unique ID, you’ll need it!

So now that the data exists and is accessible, we need to GET it. I decided that, because it's the easiest publishing platform I know, I'd just use Salesforce sites. So that means Apex is going to be my back end. I'll need an Apex call to fetch the CSV data from the Google sheet, and some code to parse that CSV into some kind of logical structure. Again, thankfully, from past projects I had just such a class.

//gets CSV data from a given URL and parses it into a list of lists
global class RightChartController
{
    public String getDataSourceUrl()
    {
        return 'Your google document url here';
    }

    //gets CSV data from a given source
    @remoteAction
    global static List<List<String>> importCSV(string url)
    {
        List<List<String>> result = new List<List<String>>();
        try
        {
            string responseBody;

            //create http request to get import data from
            HttpRequest req = new HttpRequest();
            req.setEndpoint(url);
            req.setMethod('GET');
            Http http = new Http();

            //if this is not a test actually send the http request. if it is a test, hard code the returned results.
            if (!Test.isRunningTest())
            {
                HTTPResponse res = http.send(req);
                responseBody = res.getBody();
            }
            else
            {
                responseBody = 'Name,Count\ntammy,10\njoe,5\nFrank,0';
            }

            //the data should come back in CSV format, so hand it off to the parsing function which will make a list of a list of strings
            //(each list is one row, each item within that sub list is one column)
            result = RightChartController.parseCSV(responseBody, true);
        }
        catch (exception e)
        {
            system.debug('----------------------------- Error importing chart data. ' + e.getMessage() + ' on line ' + e.getLineNumber());
        }
        return result;
    }

    //parses a csv file. Returns a list of lists. Each main list is a row, and the list contained is all the columns.
    public static List<List<String>> parseCSV(String contents, Boolean skipHeaders)
    {
        List<List<String>> allFields = new List<List<String>>();

        // replace instances where a double quote begins a field containing a comma
        // in this case you get a double quote followed by a doubled double quote
        // do this for beginning and end of a field
        contents = contents.replaceAll(',"""', ',"DBLQT').replaceall('""",', 'DBLQT",');

        // now replace all remaining double quotes - we do this so that we can reconstruct
        // fields with commas inside assuming they begin and end with a double quote
        contents = contents.replaceAll('""', 'DBLQT');

        // we are not attempting to handle fields with a newline inside of them
        // so, split on newline to get the spreadsheet rows
        List<String> lines = new List<String>();
        try
        {
            lines = contents.split('\n');
        }
        catch (System.ListException e)
        {
            System.debug('Limits exceeded? ' + e.getMessage());
        }

        for (String line : lines)
        {
            // check for blank CSV lines (only commas)
            if (line.replaceAll(',', '').trim().length() == 0) break;

            List<String> fields = line.split(',');
            List<String> cleanFields = new List<String>();
            String compositeField;
            Boolean makeCompositeField = false;

            for (String field : fields)
            {
                if (field.startsWith('"') && field.endsWith('"'))
                {
                    cleanFields.add(field.replaceAll('DBLQT', '"'));
                }
                else if (field.startsWith('"'))
                {
                    makeCompositeField = true;
                    compositeField = field;
                }
                else if (field.endsWith('"'))
                {
                    compositeField += ',' + field;
                    cleanFields.add(compositeField.replaceAll('DBLQT', '"'));
                    makeCompositeField = false;
                }
                else if (makeCompositeField)
                {
                    compositeField += ',' + field;
                }
                else
                {
                    cleanFields.add(field.replaceAll('DBLQT', '"'));
                }
            }
            allFields.add(cleanFields);
        }
        if (skipHeaders) allFields.remove(0);
        return allFields;
    }
}

So now we've got the back end code required to both get the data and parse it (don't forget to add a remote site exception in your Salesforce security controls for docs.google.com!). Now we just need an interface to use that data and display it in a nifty chart. Using Highcharts this is pretty easy. Mine ended up looking something like this (you don't have to tell me the code is kind of sloppy; this was just a quick throw-together project).

<apex:page controller="RightChartController" sidebar="false" showHeader="false" standardStylesheets="false">
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
    <script src="https://code.highcharts.com/highcharts.js"></script>
    <script src="https://code.highcharts.com/highcharts-3d.js"></script>
    <script>
        //load the document source locally in case we want to let the user change it or something later
        var docSource = '{!dataSourceUrl}';
        var chart;

        //fetches the data from the google sheet
        function getData(docSource, callback)
        {
            Visualforce.remoting.Manager.invokeAction('{!$RemoteAction.RightChartController.importCSV}', docSource, function(result, event) {
                if (event.status) {
                    callback(result);
                }
            },
            {escape: true});
        }

        //massages the data from being an array of arrays (one line per form entry) into an array of objects with totals
        //should probably be refactored to make it more efficient, but whatever.
        function translateDataToHighChartFormat(csvData)
        {
            var chartData = new Array();
            var totals = new Object();

            for (var i = 0; i < csvData.length; i++)
            {
                var timestamp = csvData[i][0];
                var name = csvData[i][1];
                if (totals.hasOwnProperty(name)) {
                    totals[name]++;
                } else {
                    totals[name] = 1;
                }
            }

            for (key in totals)
            {
                var thisPoint = new Object();
                thisPoint.name = key;
                thisPoint.y = totals[key];
                chartData.push(thisPoint);
            }
            return chartData;
        }

        //create the chart on document load
        $(function() {
            chart = new Highcharts.Chart({
                chart: {
                    type: 'pie',
                    options3d: {
                        enabled: true,
                        alpha: 45,
                        beta: 0
                    },
                    renderTo: 'container'
                },
                title: { text: 'Told You So' },
                plotOptions: {
                    pie: { depth: 25 }
                },
                series: [{ data: [] }]
            });

            //set interval timer to poll the document every 10 seconds
            setInterval(function() {
                getData(docSource, function(result) {
                    chart.series[0].setData(translateDataToHighChartFormat(result));
                });
            }, 10000);

            //get the data once initially so we don't have to wait for the first delay to get data
            getData(docSource, function(result) {
                chart.series[0].setData(translateDataToHighChartFormat(result));
                $('#Loading').hide();
            });
        });
    </script>
    <div id="container" style="height:400px"></div>
    <div id="Loading" style="text-align:center; font-weight:bold; font-size:24px">Loading Chart Data Please Wait</div>
</apex:page>

If everything has gone smoothly, you should end up with something that looks like this

With our page alive, it's a simple matter to add it to a Salesforce site. Anyone can view it, and anyone you give the form link to will be able to add data to it. As data is added, the chart will automatically redraw itself every 10 seconds with the new data set. Then it was just a simple matter of having the chart open on some computer and using tab casting in Chrome to send it to my Chromecast. Now we can be reminded of how stupid I am all the time….. what have I done?

NOTE: If you don't want to read the wall of text/synopsis/description, just scroll to the bottom. The function you need is there.

I feel dirty. This is the grossest hack I have had to write in a while, but it is also too useful not to share (I think). Salesforce did us an awesome favor by introducing the JSON.serialize utility; it can take any object and serialize it into JSON, which is great! The only problem is that you have no control over the output JSON; the method takes no params except for the source object. Normally this wouldn't be a big deal. I mean, there isn't a lot to customize about JSON usually; it just is what it is. There is, however, one case where you may want to control the output, and that is the case of nulls. You see, most of the time when you are sending JSON to a remote service, if you have a param specified as null it will just get skipped over, as it should be. Some of the stupider APIs try to process that null as if it were a value. This is especially annoying when the API has optional parameters and you are using a language like Apex, which, being strongly typed, makes it very difficult to modify an object during run time to remove a property. For example, say I am ordering a pizza via some kind of awesome pizza ordering API. The API might take a size, some toppings, and a desired delivery time (for future deliveries). Their API documentation states that delivery time is an optional param, and if not specified the pizza will be delivered as soon as possible, which is nice. So I write my little class in apex.
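The class itself isn't in the post, but a sketch of the sort of thing described (names and values are illustrative, though prefferedDeliveryTime matches the property discussed below) would be:

```apex
public class PizzaOrder
{
    public string size;
    public list<string> toppings;
    public datetime prefferedDeliveryTime; // optional per the API docs
}

PizzaOrder myOrder = new PizzaOrder();
myOrder.size = 'large';
myOrder.toppings = new list<string>{'cheese', 'black olives', 'jalepenos'};
// prefferedDeliveryTime is intentionally left unset, so it serializes as null

// produces something like:
// {"toppings":["cheese","black olives","jalepenos"],"size":"large","prefferedDeliveryTime":null}
system.debug(JSON.serialize(myOrder));
```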

Which would work beautifully, unless the pizza API is set up to treat any key present in the JSON object as an actual value, which in this case would be null. The API would freak out saying that null isn't a valid datetime, and you'd be yelling at the screen trying to figure out why the stupid API can't figure out that if an optional param has a null value, it should just skip it instead of trying to evaluate it.

Now, in this little example you could easily work around the issue by just specifying prefferedDeliveryTime as the current datetime if the user didn't pass one in. Not a big deal. However, what if there is no valid default value to use? In my recent problem there is an optional account number I can pass in to the API. If I pass it in, it uses that. If I don't, it uses the account number set up in the system. So while I want to support the ability to pass in an account number, if the user doesn't enter one my app will blow up, because when the API encounters a null value for that optional param it explodes. I can't not have a property for the account number, because I might need it, but including it as a null (the user just wants to use the default, which Salesforce has no idea about) makes the API fail. Ok, whew, so now hopefully we all understand the problem. Now what the hell do we do about it?

While trying to solve this, I explored a few different options. At first I thought of deserializing the JSON object back into a generic object (map<string,object>), checking for nulls in any of the key/value pairs, removing them, then serializing the result. This failed due to difficulties with detecting the type of each value (tons of 'unable to convert list<any> to map<string,object>' errors that I wasn't able to resolve). Of course, you also have the recursion issue, since you'd need to look at every element in the entire object, which could be infinitely deep/complex, so that adds another layer of complexity. Not impossible, but probably not super efficient, and I couldn't even get it to work. Best of luck if anyone else tries.

The next solution I investigated was writing my own custom JSON generator that would just not put nulls in the object in the first place. This too quickly fell apart, because I needed a generic function that could take a string or an object (not both specifically, just a generic thing of some kind) and turn it into JSON, since this function would have to be used to strip nulls from about 15 different API calls. I didn't look super hard at this, because all the code I saw looked really messy and I just didn't like it.

The solution I finally decided to go with, while gross, dirty, hackish, and probably enough to earn me a spot in programmer hell, is also simple and efficient. Once I remembered that JSON is just a string, and can be manipulated as such, I started thinking about maybe using regex (yes, I am aware that when you solve one problem with regex, now you have two) to just strip out the nulls. Of course, then you have to worry about cleaning up the syntax (extra commas, commas against braces, etc.) when you just rip elements out of the JSON string, but I think I've got a little function here that will do the job, at least until Salesforce offers a 'don't serialize nulls' option in their JSON serializer.

public static string stripJsonNulls(string JsonString)
{
    if (JsonString != null)
    {
        JsonString = JsonString.replaceAll('\"[^\"]*\":null', ''); //basic removal of null values
        JsonString = JsonString.replaceAll(',{2,}', ',');          //remove duplicate/multiple commas
        JsonString = JsonString.replace('{,', '{');                //prevent opening brace from having a comma after it
        JsonString = JsonString.replace(',}', '}');                //prevent closing brace from having a comma before it
        JsonString = JsonString.replace('[,', '[');                //prevent opening bracket from having a comma after it
        JsonString = JsonString.replace(',]', ']');                //prevent closing bracket from having a comma before it
    }
    return JsonString;
}

Which, after running on our previously generated JSON, gives us

{"toppings":["cheese","black olives","jalepenos"],"size":"large"}

Notice: no null prefferedDeliveryTime key. It’s not null, it’s just nonexistent. So there you have it, six lines of find and replace to remove nulls from your JSON object. Yes, you could combine them and probably make it a tad more efficient; I went for readability here. So sue me. Anyway, hope this helps someone out there, and if you end up using this, I’m sure I’ll see you in programmer hell at some point. Also, if anyone can make my initial idea of recursively spidering the JSON object and rebuilding it as a map<string,object> without the nulls work, I’d be most impressed.
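For anyone tempted to take a run at that recursive idea, here is roughly what the walk looks like once the JSON has already been parsed into generic maps and lists. This is a Java sketch, not Apex (the type detection that kept failing for me in Apex is exactly what Java's instanceof makes easy here), and the parsing/serializing steps are left out:

```java
import java.util.*;

// Sketch of the recursive approach described above: given JSON already parsed
// into generic Maps and Lists (by whatever JSON library you like), walk the
// structure and drop null values before re-serializing.
public class NullWalker {
    @SuppressWarnings("unchecked")
    public static Object stripNulls(Object node) {
        if (node instanceof Map) {
            Map<String, Object> cleaned = new LinkedHashMap<>();
            for (Map.Entry<String, Object> e : ((Map<String, Object>) node).entrySet()) {
                if (e.getValue() != null) {
                    cleaned.put(e.getKey(), stripNulls(e.getValue())); // recurse into map values
                }
            }
            return cleaned;
        }
        if (node instanceof List) {
            List<Object> cleaned = new ArrayList<>();
            for (Object item : (List<Object>) node) {
                if (item != null) {
                    cleaned.add(stripNulls(item)); // recurse into list elements
                }
            }
            return cleaned;
        }
        return node; // primitives pass through untouched
    }
}
```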

So I know it has been a while. I’m not dead, I promise, just busy. Busy with trying to keep about a thousand orgs in sync: pushing code changes, layout changes, all kinds of junk from one source org to a ton of other orgs. I know you are saying ‘just use managed packages, or change sets’. Managed packages can be risky early in the dev process because you usually can’t remove components, and you get locked into a structure that you might not quite be settled on. Change sets are great, but many of these orgs are not linked; they are completely disparate orgs for different clients. Over the course of the last month or two it became apparent that just shuffling data around in Eclipse wasn’t going to do it anymore. I was going to have to break into using ANT and the Salesforce Migration Tool.

For those unaware, ANT is a command line build tool that the Salesforce Migration Tool plugs into, and together they let you script deployments, which can be pretty useful. Normally, though, actually setting up a deployment with ANT is a huge pain in the butt because you have to modify XML files, set up build files and so on; in general it’s kind of slow to do. However, if you could write a script to generate the files needed by the deployment script, now that would be handy. That is where this tool I wrote comes in. Don’t get me wrong, it’s nothing fancy; it just helps make generating deployments a little easier. It lets you specify a list of orgs, with their credentials, that you want to deploy to. In the deploy folder you place the package.xml file that defines what you want to deploy, along with the metadata itself (classes, triggers, objects, etc.). Then when you run the program it will, one by one, log into each org, back it up, then deploy your package contents. It’s a nice set-it-and-forget-it way of deploying to numerous orgs in one go.

So here is what we are going to do. First of all, you are going to need to make sure you have a Java Runtime Environment (JRE) and the Java Development Kit (JDK) installed. Make sure to set your JAVA_HOME environment variable to wherever the JDK is installed (for me it was C:\Program Files\Java\jdk1.8.0_05). Then grab ANT and follow its install guide. Then grab the Force.com Migration Tool and get that installed in your ANT setup. Last, grab my SF Deploy Tool from bitbucket (https://Daniel_Llewellyn@bitbucket.org/Daniel_Llewellyn/sf-deploy-tool.git).

Now we have all the tools we need to deploy some components, but we don’t have anything to deploy, and we haven’t set up who we are going to deploy it to. So let’s use Eclipse to grab our deployable contents and generate our package.xml file (which contains the list of stuff to deploy). Fire up Eclipse and create a new project. For the project contents, select whatever you want to deploy to your target orgs (this is where using a package is useful, because it simplifies this selection). Let the IDE download all the files for your project, then navigate to the project contents folder on your computer. Copy everything inside the src folder, including the package.xml file, and paste it into the deploy folder of my SF deploy tool. This is the payload that will be pushed to your orgs.
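If you have never looked inside one, a minimal package.xml looks something like this. The member names and API version here are just illustrative; Eclipse generates the real file for you based on your project selections:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <!-- each <types> block lists members of one metadata type -->
        <members>MyExampleClass</members>
        <members>MyExampleTrigger_Test</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>Some_Custom_Object__c</members>
        <name>CustomObject</name>
    </types>
    <version>30.0</version>
</Package>
```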

The last step in our setup is to tell the deploy tool which orgs to push this content into. Open the orgs.txt file in the SF Deployer folder and enter the required information, one org per line. Each org requires a username, password, token, url and name attribute, separated by semicolons, with an equals sign used to denote each key/value pair. EX
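Based on that description, a line in orgs.txt would look something like the following. Every value here is a made-up placeholder, and the exact key names should match whatever the tool's readme specifies:

```
username=admin@example.org;password=secret123;token=AbCdEfGh123;url=https://login.salesforce.com;name=ClientOne
```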

Now, with all your credentials saved, you can run the SalesforceMultiDeploy.exe utility. It will iterate over each org one by one, back up the org, then deploy your changes. The console window will keep you informed of its progress as it goes and let you know when it’s all done. Of course this process is still subject to all the normal deployment problems you can encounter, but if everything in the target orgs is prepared to accept your deployment package, this can make life much easier. You could, for example, write another small script that copies the content from your source org at the end of each week, slaps it into the deploy folder, then invokes the deployment script, giving you an automated process that keeps your orgs in sync.

Also, I just threw this tool together quickly and would love some feedback. So either fork it and change it, or just give me ideas and I’ll do my best to implement them (one thing I really want to do is make this multithreaded so that it can run deployments in parallel instead of serially, which would be a huge boost to deployment speeds). Anyway, as always, I hope this is useful, and I’ll catch ya next time.