I’ve been getting busy with Final Cut Pro X and created more videos from talks given at our local Cocoaheads meetups. These videos are now up on the Melbourne Cocoaheads Vimeo group (and embedded below).

Well, macros are just that: macros. They generate code. They substitute the text of the macro arguments into the body and emit the result into your source. Unlike methods or functions, they do not evaluate their arguments first; the argument text is pasted in wherever the parameter appears. So the macro above, when called the second way, will produce the following code:
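The original listing hasn’t survived in this copy, but a naive retain-then-assign setter macro with this bug, called with an expression that has side effects, would expand along these lines (the macro name, propName_, and count are illustrative, not from the original):

```objc
// Illustrative only: a naive setter macro that mentions its
// parameter twice, so the argument text gets pasted in twice.
#define PROP_ASSIGN(ivar, newValue) \
    do {                            \
        [(newValue) retain];        \
        [ivar release];             \
        ivar = (newValue);          \
    } while (0)

// Calling it "the second way", with an expression instead of a variable:
PROP_ASSIGN(propName_, [NSNumber numberWithInt:count]);

// ...expands to:
do {
    [[NSNumber numberWithInt:count] retain];    // instance 1: retained, never assigned, leaked
    [propName_ release];
    propName_ = [NSNumber numberWithInt:count]; // instance 2: stored without a retain
} while (0);
```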

Not quite what we wanted, right? We’re leaking an NSNumber instance, we’re creating two that are autoreleased, and we’ll probably have a zombie object stored in propName_ after the autorelease pool is drained. A nice confusing bug for us to stumble upon.

The line __typeof__(newValue) __A = (newValue); forces the passed parameter to be evaluated and a result stored in the temporary __A. We can then use the temporary variable multiple times within the macro body without fear.

Apple’s UISearchDisplayController is a handy wrapper that adds searching to UITableViewController-based interfaces. UISearchDisplayController will send its delegate the following message whenever the user types something into the search bar:
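The message in question is the UISearchDisplayDelegate method:

```objc
- (BOOL)searchDisplayController:(UISearchDisplayController *)controller
shouldReloadTableForSearchString:(NSString *)searchString;
```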

It is in this method that the delegate should search its data and, based on whether the search results have changed (or would change), return YES or NO.

This all works fine and dandy when the objects to search are in memory and the delegate method executes promptly. If this is not the case then the UI will block while the search logic is performed. This can create significant lag in the user interface as the user is typing into the search field.

In the iOS API documentation for searchDisplayController:shouldReloadTableForSearchString: Apple provides a hint about performing search work in the background but doesn’t really go into any detail. The docs say:

You might implement this method if you want to perform an asynchronous search. You would initiate the search in this method, then return NO. You would reload the table when you have results.

An approach to doing an asynchronous search might be something like this:
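The original listing isn’t reproduced here; a minimal sketch of the idea, assuming an NSOperationQueue stored in self.searchQueue and hypothetical filterWithString: and filteredResults helpers, might look like this:

```objc
- (BOOL)searchDisplayController:(UISearchDisplayController *)controller
shouldReloadTableForSearchString:(NSString *)searchString
{
    [self.searchQueue addOperationWithBlock:^{
        // Do the slow filtering work off the main thread.
        NSArray *results = [self filterWithString:searchString];
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            // Back on the main thread: publish the results and reload.
            self.filteredResults = results;
            [controller.searchResultsTableView reloadData];
        }];
    }];
    return NO; // we'll reload the table ourselves when the results arrive
}
```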

This implementation gets you some of the way there. The problem with it, however, is that as the user interacts with the UISearchBar the searchString changes, and with each character typed another invocation of searchDisplayController:shouldReloadTableForSearchString: is performed. This enqueues lots of work into the NSOperationQueue, work that must all be executed in sequence.

This is not ideal. The user is only interested in the most recent search string that they have entered. If they enter the string “fubar” into the UISearchBar they don’t care about the searches for f, fu, fub, and fuba.

OK. So how do you properly implement this? Well, it is surprisingly easy. Just cancel all the existing operations in the NSOperationQueue before adding your latest search operation to the queue.

[self.searchQueue cancelAllOperations];

Any yet-to-execute operations will be cancelled and never run. Any presently executing operation will run to completion. In a more complex situation, perhaps with a multi-step operation (where you were using a custom subclass of NSOperation or something), you could check isCancelled on the NSOperation instance and bail out of your operation early. I’ll leave that up to you to figure out.
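For illustration, the early bail-out in a hypothetical NSOperation subclass might look like this (SearchOperation and its ivars are assumptions, not from the original post):

```objc
@implementation SearchOperation // an assumed subclass of NSOperation

- (void)main
{
    NSMutableArray *matches = [NSMutableArray array];
    for (NSString *item in itemsToSearch_) {
        // If cancelAllOperations has been called, stop wasting cycles.
        if ([self isCancelled]) return;
        if ([item rangeOfString:searchString_].location != NSNotFound)
            [matches addObject:item];
    }
    // Deliver matches back to the main thread here.
}

@end
```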

On Mac OS X Git uses the opendiff(1) command line utility as its merge tool by default.

opendiff(1) will launch the FileMerge app that comes with Xcode when performing merges. Unfortunately, FileMerge is not a very good merge tool.

To improve this, there are a number of options: good commercial, open source, and freeware diff tools for Mac OS X such as DiffMerge, SureMerge, Araxis Merge, K3Diff, and p4merge. Too many to list them all here.

For my system I chose p4merge. p4merge is part of the Perforce source control management system. I’ve used Perforce quite a lot at previous jobs and it has great tools. Perforce is commercial software, and I’m not about to use Perforce as my source control system instead of Git, but fortunately you can download and use the Perforce GUI tools for free. The Perforce merge tool is not tied in any way to the Perforce SCM system, so you can use it as a standalone tool.

Now you may be wondering why I didn’t just set my Git merge.tool config setting to p4merge, as it is supported “by default” by Git. Well, Git expects a p4merge command to be on the $PATH, and I’d rather not install shell scripts across the different Mac systems I use just so that it works “out of the box”. I also found p4merge wasn’t dealing well with the relative paths that Git was trying to pass to it, hence the custom mergetool.custom.cmd setting that uses $PWD.
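A ~/.gitconfig matching that description would look something like the following; the p4merge path is an assumption based on where the Perforce tools typically install the app:

```
[merge]
    tool = custom
[mergetool "custom"]
    cmd = /Applications/p4merge.app/Contents/MacOS/p4merge "$PWD/$BASE" "$PWD/$LOCAL" "$PWD/$REMOTE" "$PWD/$MERGED"
```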

Lately I had become somewhat dissatisfied with the speed of my QNAP 509 Pro NAS, in particular with the speed of transfers to and from it. I was thinking about getting another one, or replacing the drives, in an effort to speed it up.

But then I gave it some thought and figured I should probably first do some more testing and research.

QNAP advertise the NAS I have as being able to get a sustained 60 megabytes per second in file transfers over a single gigabit ethernet link. This is pretty respectable. However, I was only getting at most 12MB/s. At first I thought that my iMac not supporting “jumbo frames” could be the problem. But with a little more digging I discovered the QNAP doesn’t support jumbo frames either so that wasn’t going to help me anyway.

In an effort to rule out the iMac as the problem I hooked up my MacBook Pro, which is normally only connected via Wi-Fi, to the Apple Time Capsule that is serving as my gigabit ethernet switch and did some more testing. The MacBook Pro had similarly bad performance transferring to and from the QNAP. To discover whether data transfer performance was generally bad, or whether it was just the QNAP that was suffering, I tried transferring files between my MacBook Pro and my iMac. This is where things started to get interesting. My MacBook was happily transferring data to and from my iMac at around 60MB/s.

It was at this point that a silly thought popped into my mind. Perhaps the cable connecting the Time Capsule to the QNAP just wasn’t up to snuff when it came to gigabit ethernet. So I changed it, and things got better. Fast.

Afterwards I realised that the cable I had been using all this time to connect my QNAP to the rest of the world, a cable I had just randomly selected from the large collection I have, was one I made myself a long time ago when CAT5 and 100Mbps networks were the norm. The cable I replaced it with was a professionally made, modern, CAT5E cable, one designed to work with gigabit ethernet.