Despite the frustrations experienced by many Mac developers, I decided to release on the Mac App Store. I think it’s still the best way for people to find and buy new apps, plus I don’t have any payment infrastructure set up for myself and the App Store makes this very easy.

Functionality-wise, the Mac version is pretty similar to the iOS version. It has a few additional moods, and the ability to play back an entire progression. It requires El Capitan[1] and supports full-screen and side-by-side modes.

While Autochords on iOS is ad-supported (with an in-app purchase to remove ads), I decided to try selling the Mac version for actual money. I think it’s quite low-priced for a Mac app at US$4.99, but I’m considering increasing the price as I add more advanced features.

In the two days since launch, Autochords has been the first app under All Music Apps, and it seems to have picked up a few purchases from there. With the low volume of new apps on the Mac App Store, maybe it will stick around near the top of that list for a while.

I don’t think I’m using any APIs that would prevent targeting an older OS, so I might fire up a VM and do some compatibility testing. ↩

I’ve been on a more rapid release schedule with Autochords for iOS lately, which means writing more release notes…

I don’t nag for reviews in the app or even have a feedback/review link anywhere[1], so I thought I’d try asking for reviews in the release notes. I added the phrase, “If Autochords has helped you write a song or you had a good jam with it, why not leave a nice little review? :)” to the release notes to see if I could get some new reviews. It worked quite well: about 6 people wrote reviews, all of them positive. Previously I’d mostly only get reviews if people ran into bugs, so that’s certainly been an improvement.

In a later update I also said: “Email hey@autochords.com if you have any suggestions for new progressions!”. This has only brought in one email so far, but it was actually a really good suggestion, so I’m counting that as another win. 😁

Ironically, I got an email requesting this recently so I probably will add some feedback and App Store review links to the app soon. I still won’t be a dick about it though. ↩

I recently released an update to Autochords which removed a bunch of dependencies and frameworks, including Crashlytics.

Crashlytics is really great, but it’s just one of those things that I’d rather not have to rely on. Turns out Xcode includes a pretty neat way to view crash logs for your beta and released apps. If you haven’t looked at Apple’s crash reporting in a while, it’s definitely worth another look.

To check it out, from Xcode go to Window → Organizer, then the Crashes tab. Your apps will be listed on the left, and you’ll be able to select a particular build to view its crashes. I’m fairly certain this only works well when you’ve elected to include a dSYM as part of the app upload process.

Sure, it hasn’t got the delightful animations of Crashlytics, and marking an issue as resolved doesn’t give you that satisfying rubber stamp effect, but it’s really working well for me.

A good/bad thing is that it’s controlled by a user’s system-wide preference for sending diagnostic information to developers. I don’t know what the opt-in rate is, but it does make me feel good to not have to bug my users with an “Oops we crashed” message and I know that I’m respecting my users’ privacy preferences.

I’ve been converting the main posts view in Pinpoint from a table view to a collection view. Unfortunately, I found that it was stuttering even on my brand-new, ridiculously fast iPhone 6s Plus.

The cells have background images, buttons, and gradients, so I thought I’d need to optimize those. But when I spun up Instruments and had a look at the CPU usage I found that a lot of main thread CPU time was spent on layout.

Problem

When slowly scrolling through the collection view, I could see spikes like this when each row displayed. Something was taking too much time to prepare those collection view cells. Probably my collectionView:cellForItemAtIndexPath:, right?

Focusing in on one of those spikes, I could see that the cell re-use looked okay at only 4.1% of the CPU time. But what about that deep stack of Auto Layout calls? Something called -[UICollectionView _checkForPreferredAttributesInView:originalAttributes:] was taking up a lot of CPU time.
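That private method is part of the self-sizing cell machinery, which runs an Auto Layout pass for every cell whenever the flow layout has a non-zero estimatedItemSize. As a sketch of the kind of fix I’d try (assuming the layout is a UICollectionViewFlowLayout; the fixed height is made up for illustration), you can opt out of self-sizing and return explicit sizes from the delegate instead:

```objc
#import <UIKit/UIKit.h>

// In the view controller that owns the collection view.

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Zeroing estimatedItemSize disables self-sizing cells, so the
    // collection view stops asking Auto Layout for preferred attributes.
    UICollectionViewFlowLayout *layout =
        (UICollectionViewFlowLayout *)self.collectionView.collectionViewLayout;
    layout.estimatedItemSize = CGSizeZero;
}

// Provide sizes explicitly instead; these are cheap to compute (or cacheable).
- (CGSize)collectionView:(UICollectionView *)collectionView
                  layout:(UICollectionViewLayout *)collectionViewLayout
  sizeForItemAtIndexPath:(NSIndexPath *)indexPath
{
    // Full-width rows with a fixed height (illustrative value).
    return CGSizeMake(CGRectGetWidth(collectionView.bounds), 120.0);
}
```

The trade-off is that you lose automatic sizing for variable-height content, but the per-cell Auto Layout pass disappears from the scroll path.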

When I’m planning out the user experience of an app, I make sure that the most basic features are immediately obvious. Often there will be more advanced features that I want to include without detracting from the core experience. These are the kinds of things you might hide in a contextual menu or keyboard shortcut on a desktop/web app.

Advanced functionality in a touch interface might be behind a long-press, swipe, double-tap, zoom-flip-rotate-double-finger-triple-tap… whatever it is, users need to realise that the functionality actually exists in the first place before they think to look for it.

Subtle indications about how people should expect an interaction to work are difficult to get right. I think Apple’s best example of this is the camera on the lock screen. Tap it and the lock screen bounces as an indication that you can slide it up to reveal the camera app. Unfortunately even with this example I’ve seen people who don’t take that bounce hint to mean ‘you can slide me up’.

There are plenty of other elements that people have learned though, like scroll bars and horizontal page indicators.

So what if our subtle indications aren’t obvious enough? In-app help sounds like a cop-out, but I think it’s actually the best solution for exposing advanced functionality to users. I hope that people will seek it out once they understand the basics of my app. Of course, I could be wrong and totally overestimating people’s willingness to seek help before leaving a 1-star review.

To give a concrete example, with Autochords I weighed up the importance of the main functionality that this (admittedly simple) app provides:

1. View chord progressions.

2. Select a progression style and musical key.

3. View alternative progressions.

4. See how to play a chord.

5. Hear how a chord sounds.

6. Play back an entire chord progression.

And that’s not even everything. I tucked some more advanced, less critical options away in in-app settings, like the dark mode option. It’s tricky to know what is most important to most of your users – the ranking I went with is definitely not the same for everyone.

Here’s the interaction required for each.

1. View progressions.

Simply launch the app.

2. Select a progression style and musical key.

Tap the buttons in the bottom-left and bottom-right corners.

OR just tap the shuffle button in the top-left.

3. View alternative progressions.

Swipe on the progression in the middle of the screen.

I’d consider this somewhat advanced because it requires people to realise that the three-dot page indicator means they can swipe. Sure, they’re probably familiar with it from their home screen, but you never know.

4. See how to play a chord.

Tap on a chord.

5. Hear how a chord sounds.

Tap on the chord diagram after tapping on a chord.

OR long-press on a chord.

6. Play back an entire chord progression.

Long-press on a chord and move your finger across to the other chords.

This is definitely the least intuitive of the features listed here.

The issue that sparked this post is #6. It’s not intuitive enough. I’m thinking that some in-app help is going to be the best solution. I’m also really excited about app preview videos in iOS 8. With a few captions and hints about what functionality is possible, I hope that people will be able to discover these somewhat hidden features in apps.

I had some trouble understanding exactly how to get a split view controller to have a toggling master view controller. The problem was that I assumed I needed to manage my own UIBarButtonItem that calls a method on the split view controller.

There are a few pieces I had to put together to get the functionality I wanted.

First, I had to set my split view controller’s preferredDisplayMode to UISplitViewControllerDisplayModeAllVisible. This displays the master on the left and the detail on the right.
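If you’re configuring this in code rather than the storyboard, it’s a one-liner — a sketch, assuming it runs somewhere in the split view controller’s setup:

```objc
// Show master and detail side by side whenever there's room.
self.splitViewController.preferredDisplayMode =
    UISplitViewControllerDisplayModeAllVisible;
```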

Next I needed a button to toggle the master view controller.

After re-watching WWDC session 204 (text version here), I realised that the UISplitViewController actually provides you with the button through its displayModeButtonItem.

So basically if you put the button in a toolbar or navigation bar, its state, target, and action are going to be managed for you.

I set the toolbar button when the detail view controller gets set. In my storyboard-based app it looks like this:
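As a sketch of that wiring (assuming a “showDetail” segue whose destination is a UINavigationController wrapping the detail view controller — names are illustrative, not necessarily what my storyboard uses):

```objc
// Hand the split view controller's own button to the detail's navigation item.
// Its state, target, and action are then managed by UISplitViewController.
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    if ([segue.identifier isEqualToString:@"showDetail"]) {
        UINavigationController *nav = segue.destinationViewController;
        UIViewController *detail = nav.topViewController;
        detail.navigationItem.leftBarButtonItem =
            self.splitViewController.displayModeButtonItem;
    }
}
```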

The last piece of the puzzle was getting the toggle action set up properly. Since the appearance of the button that the split view controller provides depends on what will happen when you press it, you need to specify what will happen. You do this in the UISplitViewControllerDelegate’s targetDisplayModeForActionInSplitViewController:. For me, that looks like this:
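A sketch of that delegate method — toggle between hiding and showing the primary (master) column based on the current display mode:

```objc
// The split view controller calls this to work out what pressing the
// displayModeButtonItem should do, and styles the button to match.
- (UISplitViewControllerDisplayMode)targetDisplayModeForActionInSplitViewController:(UISplitViewController *)svc
{
    if (svc.displayMode == UISplitViewControllerDisplayModePrimaryHidden) {
        // Master is hidden, so the button should reveal it.
        return UISplitViewControllerDisplayModeAllVisible;
    }
    // Otherwise the button should hide it.
    return UISplitViewControllerDisplayModePrimaryHidden;
}
```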

Popovers are finally(!) possible on iPhone on iOS 8. The catch is that by default, they get presented modally over the full screen — not exactly what I’d call a ‘popover’.

Thankfully it’s easy to force the presentation to be an actual popover. UIPopoverController is pretty much a thing of the past: with iOS 8 you just present a view controller with a popover presentation style, which is also available as a new adaptive segue in Xcode 6.
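As a rough sketch of the in-code version (ContentViewController and the triggering button are made-up names): set the popover presentation style, then return UIModalPresentationNone from the adaptive-presentation delegate method so iOS doesn’t adapt it to full screen on iPhone.

```objc
#import <UIKit/UIKit.h>

// In a view controller that conforms to UIPopoverPresentationControllerDelegate.

- (IBAction)showPopover:(UIButton *)sender
{
    // ContentViewController is a stand-in for whatever you want to present.
    ContentViewController *content = [[ContentViewController alloc] init];
    content.modalPresentationStyle = UIModalPresentationPopover;
    content.preferredContentSize = CGSizeMake(280.0, 180.0);

    // Anchor the popover's arrow to the button that triggered it.
    UIPopoverPresentationController *popover = content.popoverPresentationController;
    popover.sourceView = sender;
    popover.sourceRect = sender.bounds;
    popover.delegate = self;

    [self presentViewController:content animated:YES completion:nil];
}

#pragma mark - UIAdaptivePresentationControllerDelegate

- (UIModalPresentationStyle)adaptivePresentationStyleForPresentationController:(UIPresentationController *)controller
{
    // Returning None tells iOS not to adapt — keep the popover, even on iPhone.
    return UIModalPresentationNone;
}
```

Without that delegate method, the system’s default adaptation kicks in and you’re back to a full-screen modal on compact-width devices.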