Like I often do, I hit the mouse too quickly with this one. I had an idea, sketched a few thumbnails and then got straight into designing in Flash. I guess I’m just used to working this way & need to stop it. As I went further down the line, spawned better ideas & discovered the complexities of the interface, I needed to go back and redesign graphic elements – I could have avoided this if I had planned properly on paper first.

I was actually quite surprised at how quick and easy Keynote is as a prototyping tool. It will also run natively on iPad using the mobile Keynote version, which was good to test on and got the point across well. With the simple transitions between screens it felt like a real app – and yet confusing when something didn’t work, because it looked like it should work. My feeling is that prototypes should possibly just look like prototypes and not a finished product – it did confuse users. Or maybe I didn’t put enough into the prototype. Where do you draw the line on a prototype?

I took my 3 scenarios & assembled storyboards in Sketch. I then did a .png export of all screens into a folder and simply dragged these images into Keynote. I drew on hotspots where I needed them & this seems to have worked quite well. It’s very basic but enough to get a prototype idea across. It would have been nice to build in some more complexity here – at present, the user needs to follow the task scenarios with the prototype (that is, follow the steps in the scenario), as building all possible clicks and interactions would have been extremely time-consuming and required a lot of screens. This exercise did, however, highlight problems that were not apparent on the paper prototypes – the size of buttons, aesthetics and where menu items were located, for example.

I scoured through some tracks I had written today to try and find something that would be suitable for the interface prototype. Using Ableton Live, I selected a few tracks and rendered them off as .aiff loops. I had to make sure my volumes and effect levels were all correct here, as Keynote plays all audio files at the same (master) volume.

Another quick post but an essential one. Apple’s “status bar” is a tricky one. It is the black band that runs along the top of the screen with carrier / signal info, time and battery life. When designing applications, do we allow for this? After looking into Apple’s Human Interface Guidelines, I found that you can get rid of it, but Apple recommends not doing so unless it is a fullscreen application. Is Cyclo a fullscreen application? My gut says yes. So, for the moment, I’m going with it not being there. A better scenario could be that the status bar is visible when viewing the interface but disappears when pressing “play”; pressing “stop” would then return it. I can see why users may need to check battery life, time etc., but when in “play” mode they need fullscreen focus. It would be possible, and easy to test I guess.
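If this idea ever moved from the Keynote prototype into a native build, the play/stop behaviour could be sketched with UIKit’s standard status-bar hooks. This is purely illustrative – Cyclo only exists as a Keynote prototype, and `PlayerViewController` / `isPlaying` are hypothetical names, not anything from the actual project:

```swift
import UIKit

class PlayerViewController: UIViewController {

    // Hypothetical flag tracking whether the app is in “play” mode
    private var isPlaying = false

    // UIKit consults this property to decide status-bar visibility:
    // hidden only while playing, visible again on “stop”
    override var prefersStatusBarHidden: Bool {
        return isPlaying
    }

    func play() {
        isPlaying = true
        setNeedsStatusBarAppearanceUpdate() // animates the bar away
    }

    func stop() {
        isPlaying = false
        setNeedsStatusBarAppearanceUpdate() // brings the bar back
    }
}
```

The nice thing about this approach is that it matches the scenario above exactly: users keep the clock and battery info while browsing, and only lose them during fullscreen “play” focus.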

I was pondering today where the best place to put this is. My interface has quite a few global preferences that affect the display of the interface – primarily as they assist with learnability and give a degree of customisation. At first I put them in the main menu but it felt a bit “far away” there, so I have moved them to the audio object bar / menu where they seem more dedicated. Could be either I guess, and looking at this again I probably have too many choices here & should probably constrain and simplify somewhat.

Quick post but an interesting one. When creating iOS apps, remember that there is no “QUIT” or “EXIT” button. It’s weird to think like this, but they do not exist in Apple’s iOS world. Apple’s interface guidelines suggest that users will simply press the HOME button on the phone or iPad when they are ready to quit.

I redesigned the entire interface today – well, not exactly a complete redesign, but all menu elements needed resizing. Apple’s guidelines suggest 42px hit areas & mine were sitting at 32px (on the main menu). That seemed quite large to me & I thought it would be fine. When viewed on an iPad, however, the different pixel density (see previous post) shrinks the entire interface somewhat. Everything looks really sharp, but a “test-clicking” session with a friend revealed that these hit areas would indeed be too small.
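For what it’s worth, in a native build there is a way to guarantee a minimum hit area without redrawing every asset: expand the tappable region in code while the artwork stays small. A sketch only, nothing from the actual project – `ExpandedHitButton` is a hypothetical class, and 44pt is the minimum target size Apple’s current HIG commonly cites:

```swift
import UIKit

// Expand a small control’s tappable area without changing its drawn size.
class ExpandedHitButton: UIButton {

    // Assumed minimum tap target (points), per Apple’s HIG
    private let minimumHitSize: CGFloat = 44

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // Grow the hit rect symmetrically until it reaches the minimum size
        let dx = max(0, minimumHitSize - bounds.width) / 2
        let dy = max(0, minimumHitSize - bounds.height) / 2
        return bounds.insetBy(dx: -dx, dy: -dy).contains(point)
    }
}
```

A 32px visual button would still draw at 32px, but taps landing a few points outside it would register – which is roughly what the “test-clicking” session was asking for.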

I’ve been fighting with the design of this interface over the last few days. The process has become incredibly complex: designing interface elements in Flash, spitting out PNGs and then re-assembling in Sketch. Why do this? Well, Sketch has “artboards”, or canvases as I call them – most applications only allow one artboard or canvas at a time, but with Sketch you can have as many as you want. You can mock up all of your different design states on different artboards. This is huge and allows you to see the flow of interactions purely by looking at your rows of artboards. It has a very similar feel to storyboarding.