On my current project, we needed a way to check a user-set preference before taking action on behalf of the user. To be specific, we wanted to check whether the user prefers for us to post an Open Graph action to Facebook when they favorite a meme on our site. The trick is that the user preferences are stored in our profile database, while all of our Open Graph work is purely client side.

In our open graph module, we really didn’t want to care how the user preferences were stored. We simply wanted to consume them in a clean way. An example looks like this:
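Reconstructed from memory, the handler looked something like the sketch below. The names here (onFavoriteClick, userPrefers, the share_favorites key) are stand-ins rather than the production code, and the preferences module is passed in as a parameter just to keep the sketch self-contained:

```javascript
// Hypothetical favorite-button click handler; all names are stand-ins.
function onFavoriteClick(memeUrl, preferences) {
  var options = {
    // data associated with the preference
    data: { url: memeUrl },
    // runs if the user has allowed the action
    allowed: function (data) {
      console.log('would post an open graph action for ' + data.url);
    },
    // runs if the preference has never been set, so we can prompt the user
    unknown: function () {
      console.log('would prompt the user to set a preference');
    }
  };
  preferences.userPrefers('share_favorites', options);
}
```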

This is the click handler for a favorite button. We pass in the URL of the meme the user favorited and construct an options object. The options object defines data associated with the preference, as well as a function to perform if the user allows the action. We also include a function to execute if the preference is not currently set, so we can prompt the user to make a choice. Finally, we call the preferences module with the preference in question and the options.

Deep in the bowels of our preferences module is the userPrefers method. It looks like this:
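A sketch of it, reconstructed from the description that follows; withCurrentPreferences is defined elsewhere in the module:

```javascript
// Sketch of userPrefers; the real listing is missing, so this is a
// reconstruction, not the exact production code.
function userPrefers(preference, options) {
  withCurrentPreferences(function (prefs) {
    if (prefs[preference]) {
      // the preference is enabled: perform the action with its data
      options.allowed(options.data);
    } else if (prefs[preference] === null) {
      // explicitly null means the user has never chosen: let the caller prompt
      options.unknown(options.data);
    }
  });
}
```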

This function calls withCurrentPreferences, passing in a function that describes what to do with the current set of preferences. We check whether the preference in question is enabled and, if it is, call the allowed method, passing along the data. Finally, it checks whether the preference is explicitly null and calls the unknown method if it is.

So far fairly clear and concise. But what magic is this withCurrentPreferences method?
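Here is a sketch of it. The real code used a cookie plugin for the local cache and $.ajax for the server call; both are replaced by placeholders below so the control flow stands out:

```javascript
// Sketch of the caching layer; preferenceCookie stands in for the real
// browser cookie, and fetchPreferences stands in for the $.ajax call.
var preferenceCookie = null;

function withCurrentPreferences(action) {
  if (preferenceCookie) {
    action(preferenceCookie);      // cache hit: no server round trip
  } else {
    getPreferences(action);        // cache miss: fetch, then run the action
  }
}

function getPreferences(action) {
  fetchPreferences(function (prefs) {
    preferenceCookie = prefs;      // cache locally so we don't hammer the server
    if (action) { action(prefs); }
  });
}
```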

The method takes an action to execute with preferences and attempts to read a locally stored preference cookie. We cache preferences locally so we don't bombard our app servers with unneeded calls. If the cookie-based preference exists, we simply call the action, passing along the preference. If not, we call getPreferences, passing along the action. Finally, the getPreferences function makes an AJAX call out to our app server to get the preferences. On success, it saves a preference cookie and, if an action was passed in, calls it.

And there you have it: a nice, clean, asynchronous way of taking actions based on a user's preferences, managed completely client side, with a local caching mechanism to make it zippy.

So I signed up, and my hopes were immediately dashed when I discovered they are metering access to their beta. You have to camp out on their activation site, waiting for them to allot a few more activations. This sounded boring, so I decided to automate it with node, of course. I fired up Sublime Text 2 and ripped this out:

Yes, this script hits the website every 10 seconds, checks whether the limit message is still on the page, and plays a message if it is not. I was sufficiently amused by this that I gisted it and posted it to Twitter. The funny thing is, within a minute I had been retweeted by Joshua Holbrook, the support lead for Nodejitsu, and got the following response from NodeKohai, the IRC bot for the Nodejitsu channel:

@NotMyself Very nice! Now come join ‪#nodejitsu‬ on freenode to claim your prize!

You see, sometimes being a smart ass is a bonus. It gets you free things! Also, here is a quick video showing the script in action.

Over the weekend I started building my first real node.js application. I had watched the Hello Node series from TekPub, read the LeanPub books and attended NodePDX this year. I was ready to get down in the weeds and start writing a real application.

I have also been wanting to connect with the local non-.NET community in Olympia. Not that I ever see my day job not involving .NET, but I am interested in learning different ecosystems, languages and frameworks; I think it makes me a more well-rounded developer in the long run. So I started a meetup group for Olympia, WA node users and beginners.

My idea for a node app was to create a site that consumes the meetup api and displays upcoming meetings. Fairly simple. You can see the result of my weekend's worth of work here. The site is a simple Twitter Bootstrap based single page with a carousel widget displaying the upcoming meetings; currently only one is scheduled.

You can see meetup-specific api data, including the number of members who have said they are attending, the location and a Google Maps link, as well as a date and time. I was pretty happy with myself and blasted the link out to the world via Twitter and Facebook. Little did I know I had missed something in the details, which Chris Bilson was kind enough to point out: the date displayed on the site said the meeting was being held at 1:30 AM.

The meetup api returns an event object containing two bits of information related to the event's date and time: time and utc_offset. The time is milliseconds since the UNIX Epoch, and the utc_offset is milliseconds as well. Because I was in full-on cowboy mode, coding up a storm, my initial implementation of prettifying the date looked like this, with no tests:

This node module uses the awesome Moment module to parse a UNIX Epoch number into a date and then format it using standard date formatting. This worked awesomely on my local machine, so I didn't think about it any more and moved on, until Chris chimed in.

Chris suggested that it might have something to do with UTC. I was also a little embarrassed that I didn't have such a simple thing under unit test, so I started fixing the bug by getting the code under test. I had a couple of well-known values for the currently scheduled meeting.

The interesting thing here is that the test passed without modifying the implementation code at all. You see, Moment automatically sets an offset based on the current environment, so if I were able to run this test on Heroku, it would fail. I was a bit stumped, came back around to my sad little cowboy ways, and modified the implementation like this:

I was grasping at straws, but this modification didn't affect the test running locally. I was curious what would happen when running the site on Heroku; I suspected I would have the same issue. I was very surprised to see that the code worked.

The downside was that I didn't understand why, and that bugs the crap out of me. I couldn't let it go; getting the code to work was not enough, I needed to understand why. So I started googling. I lucked out and found this blog post on Adevia Software's blog.

It clicked for me after that. The reason the test for the new code passed locally and the code worked on Heroku all had to do with the time zone settings of the environment running the code. My local environment is set to PST, so parsing a UNIX Epoch based date with Moment gives a PST date, which is then converted to UTC and reduced by the PST UTC offset, resulting in the original PST date created by Moment.

Heroku's default, apparently, is UTC. Apply the same logic and you end up with a UTC date that has been reduced by 8 hours but is still labeled a UTC date. It looks right on Heroku only because my pretty printer doesn't include the timezone; if it did, it would be wrong.

Once again, I understood how the code worked, and it was working, but it was wrong. The nag in the back of my head would not let it go. It's a bug, and bugs must die. Now that I understood what was going on, I went back and reverted my helper to this implementation:

At Cheezburger, we make use of require.js for most of our client-side javascript. Recently I had to implement some features that needed to pull in lots of third-party scripts that were not AMD compliant. The documentation, of course, told me to put script tags directly in the head of every page, which I have recently learned is a blocking operation (one of the problems that require.js solves cleanly).

So I took some time and came up with a simple asynchronous dependency loader for this situation:
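The loader looked roughly like this; the names are mine, not the original Cheezburger code. Each script tag is injected with async semantics so it doesn't block parsing, and the callback fires once every requested script has loaded:

```javascript
// Minimal async loader for non-AMD third-party scripts (a sketch).
function loadScripts(urls, done) {
  var remaining = urls.length;
  urls.forEach(function (url) {
    var script = document.createElement('script');
    script.src = url;
    script.async = true;               // don't block parsing like a head tag would
    script.onload = function () {
      remaining -= 1;
      if (remaining === 0) { done(); } // all dependencies are in
    };
    document.head.appendChild(script);
  });
}
```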

I have recently completed my first iOS application on behalf of Furnishly, the local furniture exchange. All the release bugs have been worked out, and the app is available on the App Store. Yesterday, I started looking into how to transfer all the assets over to the owner of Furnishly so he could continue development and review download data in iTunes Connect.

On GitHub, this was a snap: simply go into the administration section for the private repository, scroll down to the Danger Zone, and click the button under Transfer Ownership. Then type in the name of the repository and the new owner's username and hit transfer. Simple, easy.

In iTunes Connect, it is a completely different story. There is no obvious way to transfer the application in the UI. Searching around in the FAQ surfaced this gem:

I sold my app to another developer and can no longer distribute it on the App Store. Can I transfer the app to the new developer’s iTunes Connect account?
No, you can’t transfer the app to another developer account on iTunes Connect. To add the app to another account, remove the app from the current account and upload it to the new iTunes Connect account.

Note that uploading the app to a new iTunes Connect account will disable current customers from receiving automatic and free updates of your application. All customer reviews, rating, and ranking information will be reset. You will not be able to reuse the app name and SKU in the old account. If you have uploaded a binary or used the app with the iAd Network, your Bundle ID will not be reusable either.

So apparently the way to transfer ownership of this app to the non-technical owner is to:

1. ask him to create an Apple developer account
2. wait for the account to be accepted
3. generate new application keys
4. rebuild the application with the new keys
5. delete the old application build from my account
6. resubmit the new application via his account

Oh, and all the folks who have downloaded the app in the meantime from my account are pretty much never going to get an update, and all the ratings it might have received will disappear.

Seriously, this is a horrible way to handle what seems to me would be a common occurrence. Did Zynga have to follow this process when they bought Draw Something?

I ran into an issue this week where I was attempting to load data from a web service asynchronously using System.Threading.Tasks on MonoTouch. I was able to fire the task off but kept getting an error trying to update UI elements when the callback fired.

After beating my head against a wall for a bit, I took a walk, grew a neuron, and this is what I came up with to resolve the issue. Note the call to InvokeOnMainThread.
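The shape of the fix was roughly the following; the service and view names are invented, and the snippet assumes it lives inside a UIViewController (InvokeOnMainThread is inherited from NSObject):

```csharp
// Sketch of the fix; listingService and listingTable are stand-ins.
Task.Factory.StartNew(() => listingService.GetNearbyListings())
    .ContinueWith(task =>
    {
        // The continuation runs on a thread-pool thread, so marshal any
        // UI work back to the main thread before touching UIKit.
        InvokeOnMainThread(() =>
        {
            listingTable.Source = new ListingTableSource(task.Result);
            listingTable.ReloadData();
        });
    });
```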

This makes things nice for testing purposes, as these value structs are comparable for free and are clearly named for what the value represents. The downside is that if you need to set up a bunch of test data for a unit test, you can run into code that looks like this:

Name name = new Name("Name");
IEnumerable<Name> names = new[] { new Name("Tom"), new Name("Dick"), new Name("Harry") };

This will quickly give you carpal tunnel with all the ceremony required to create the instances. It would be nice if we could reduce some of the noise, and we can, via the implicit operator. All we need to do is add the following operator logic to our struct:
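Assuming a Name struct that simply wraps a string (a sketch, not the exact production struct), the operator looks like this:

```csharp
public struct Name
{
    private readonly string value;

    public Name(string value)
    {
        this.value = value;
    }

    // Allow a plain string to convert to a Name automatically
    public static implicit operator Name(string value)
    {
        return new Name(value);
    }
}
```

With that in place, the ceremony melts away: `Name name = "Name";` just works, and so does `new List<Name> { "Tom", "Dick", "Harry" }`, because the collection initializer's Add calls accept the implicitly converted strings. One caveat: `new[] { "Tom", "Dick", "Harry" }` still infers `string[]`, so use an explicit `Name[]` or a collection initializer.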

We are used to using implicit typing on the left-hand side of a statement, but were you aware that implicit conversion can be used on the right-hand side? I wasn't. Basically, this is how collection initializers work. Nice, eh?

Thanks to Robert Ream for showing me this. It was fun working with someone with such a deep understanding of the language and of functional development, even if it was for a brief time.

I recently updated my work virtual machine to the latest release of msysgit, 1.7.9, to resolve some issues I was having with global settings not being obeyed. After the installation, I noticed that I was no longer able to update repositories from PowerShell. The output I was getting looked something like this:

This was unexpected, and the first thing I thought of was the recent security issue with GitHub; maybe my work key needed to be validated. I checked GitHub and everything seemed to be set up correctly. I even went so far as to generate new keys, with no success.

So it looked like the problem was not with git but with establishing an ssh connection to GitHub. I wanted to see exactly what was happening when trying to connect via ssh, so I ran the following command, which enables verbose logging of the connection.

This output did not give me any immediate ideas about the problem, but I thought I might try the same command from git bash. I won't include the full output here, but I did notice something different right away. Check out the following lines from the output and compare them to lines 6-8 above.

So it looks like ssh running under PowerShell looks for my public/private key pair in a different directory than it does under bash. A quick Google search told me that an environment variable named HOME is used when determining the path to look for keys. I went back to PowerShell and checked for the environment variable like so.

Overview

If I told you that you can build node.js applications in Windows Azure, would you believe me? Come to this session and I'll show you how. You'll see how to take those existing node apps and easily deploy them to Windows Azure from any platform. You'll see how you can make your node apps more robust by leveraging Azure services like storage and service bus, all of which are available in our new "azure" npm module. You'll also see how to take advantage of cool tools like socket.io for WebSockets, node-inspector for debugging and Cloud9 for an awesome online development experience.

About Glenn

Glenn is a PM at Microsoft working on support for node.js in Windows and Azure. Glenn has a breadth of experience both inside and outside Microsoft developing software solutions for ISVs and the enterprise. Glenn has been a passionate supporter of open source and has been active in involving folks from the community in the development of software at Microsoft. This has included shipping products under open source licenses, as well as assisting other teams looking to do so. Glenn is also a lover of community and a frequent speaker at local and international events and user groups.

Glenn's blog can be found on CodeBetter, or you can follow him on twitter at your own risk.

The South Sound .NET Users group is proud to present Glenn Block on Thursday March tth at 7:00PM at the Olympia Center in the heart of downtown Olympia, WA.
