
I’m pleased to announce the release of Poem Flow, an app I worked on in the fall for TextTelevision. Poem Flow is an iPhone poetry e-reader, but it’s no ordinary reader: lines of poetry are rendered with animated movement and transitions.

The image to the left doesn’t really capture the dynamic experience of reading the flow. At first I was a little skeptical of the concept, but I soon found myself more immersed in what I was reading, and thus had a more meaningful experience.

I was responsible for architecting the flow rendering engine. The dancing words and lines of each poem are meticulously choreographed beforehand with scripts, which in turn are translated into motion. I expect this platform to continue to evolve, and as you know from my work with Iyeoka, poetry has a special place in my heart.

Sir Tim Berners-Lee is a living legend of the digital age. He’s credited with inventing the World Wide Web: on Dec. 25, 1990, he and his staff at CERN executed the first successful communication between an HTTP client and server over the Internet, and the rest, so to speak, is history.

As the director of the World Wide Web Consortium (W3C), Berners-Lee is overseeing the evolution of the web, particularly in how information and services are defined. This falls under the auspices of what is known as the Semantic Web, or more specifically Linked Data.

Why is this important? Presently, web pages are designed to be read and navigated by humans. We take for granted how we surf the net and gather information depending on our objectives. Computers on their own cannot easily accomplish this without human direction.

As we live in the information age, it makes sense that the web evolve in such a way that computers have context and an understanding of what’s on a web page, and in turn an informational perspective of that page in relation to information across the web. With this ability, software can be developed to automate the gathering and categorizing of information on the web in a meaningful way.
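To make the idea concrete, here’s a minimal, purely illustrative sketch (in Python, with made-up data) of the Linked Data model: facts are expressed as subject-predicate-object triples that software can match and traverse mechanically, without any human interpretation.

```python
# Illustrative only: Linked Data expresses facts as (subject, predicate, object)
# triples. The identifiers below are in the style of DBpedia/FOAF vocabularies.
triples = [
    ("dbpedia:Tim_Berners-Lee", "foaf:made", "dbpedia:World_Wide_Web"),
    ("dbpedia:Tim_Berners-Lee", "foaf:workplaceHomepage", "http://www.w3.org/"),
    ("dbpedia:World_Wide_Web", "dbo:releaseDate", "1990-12-25"),
]

def query(subject=None, predicate=None, obj=None):
    """Return every triple matching the non-None fields (a tiny triple-pattern match)."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Everything we know about Tim Berners-Lee, gathered mechanically:
facts = query(subject="dbpedia:Tim_Berners-Lee")
```

Because the data carries its own structure, a program can answer “what did this subject make?” by pattern matching alone; that is the ability the Semantic Web aims to give software at web scale.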

In an effort to help define what applications can be developed leveraging the Semantic Web, MIT conducted a week-long Linked Data Product Development Lab, culminating on Jan. 19, 2010, when competing teams demoed their Linked Data applications. The presentations were held at the Sloan School, and the judging panel included Berners-Lee along with representatives from Spark Capital and Charles River Ventures.

My team comprised MIT alum Gladis Filchtiner and Zach Richardson from the University of Texas. Our project, named LocalFocus, is a development platform in which developers can create sophisticated Linked Data queries, which can then be deployed as modules to a mobile client such as the iPhone. As part of our demo, we showed several queries running on the iPhone with real-time results rendered onto Google Maps.
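To give a flavor of what a LocalFocus-style “query module” might look like, here’s a hypothetical Python sketch: the developer authors a parameterized, SPARQL-like Linked Data query, and the mobile client fills in the user’s location before sending it off. All names and the template here are illustrative, not the actual LocalFocus code.

```python
# Hypothetical sketch: a location-parameterized Linked Data query template.
# The client substitutes a bounding box around the user's position, then the
# results (places with coordinates) can be dropped onto a map as pins.
QUERY_TEMPLATE = """
SELECT ?place ?name ?lat ?long WHERE {{
  ?place rdfs:label ?name ;
         geo:lat ?lat ;
         geo:long ?long .
  FILTER (?lat  > {min_lat} && ?lat  < {max_lat} &&
          ?long > {min_lon} && ?long < {max_lon})
}}
"""

def build_query(lat, lon, box=0.05):
    """Instantiate the module's query for a small box around the user's position."""
    return QUERY_TEMPLATE.format(
        min_lat=lat - box, max_lat=lat + box,
        min_lon=lon - box, max_lon=lon + box)

q = build_query(42.3601, -71.0942)  # roughly MIT's coordinates
```

The point of the module approach is that the query logic lives in data, so new queries can be deployed to the phone without shipping a new build.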

The coding experience for me was grueling, like Music Hack Day except four times longer! Zach and I didn’t get our client and server communicating properly until an hour before the demo. It was literally down to the wire. It’s quite an honor for Zach and me to come in as guests to MIT, come under Gladis’ wing, and win this competition, especially considering the strength of the seven other competing teams. This has given us the confidence to take LocalFocus to the next step and develop a business plan. Since Zach is from Austin, we’re looking to pitch our ideas to investors at SxSW Interactive in March!

Special thanks are in order to the organizers of the event: K. Krasnow Waterman, Reed Sturtevant and Bill Aulet.

Music Hack Day is a recent phenomenon in which music software developers and enthusiasts from around the world converge to exchange ideas and compete against each other in “hacking” together a software project in 24 hours. In 2009 these conferences were held in London, Berlin, Amsterdam and, most recently, Boston. The event was hosted by Microsoft’s New England Research and Development Center and organized by the tireless Jon Pierce, Paul Lamere and Dave Haynes.

Company participants at the conference read like a who’s who of the music software industry, many of them based in Boston: Harmonix, Noteflight, Tapulous, Sonos, Echo Nest, Last.fm, SoundCloud, NPR, TourFilter, Conduit Labs and Berklee Music Online, among others. There were a couple dozen 25-minute API and platform workshops that participants could choose to attend.

Two companies generating a lot of buzz at the conference were Echo Nest and Noteflight. Echo Nest has developed a platform that can analyze a song’s audio and generate multitudes of musical characteristics relating to rhythm, pitch and timbre. This has led to an emerging field of music informatics with broad applications, ranging from statistical analysis of what makes Coldplay popular to cross-genre remixes of songs that happen to share similar features. Noteflight offers community-oriented online music notation. Founded by my former Allurent colleague and Flex mentor Joe Berkovitz, Noteflight is paving the way in how users contribute and share music scores online. It comes as no surprise to me that Noteflight is lauded for its incredibly intuitive user interface.

SoundCloud, the social networking music service that I use to showcase my music, presented their API for account access and audio streaming, which of course got me very interested. They are based in Berlin, and they were just as excited as I was about the prospect of an iPhone app showcasing their audio API. So, just like that, at 2pm last Saturday I decided to take this on as my project. I would have to submit it by the 3:45pm deadline the following day.

Things got off to a pretty rough start: a faulty sync cable crashed my phone, rendering it inoperable. Fortunately, one of the SoundCloud developers loaned me his iPhone while another tried to restore mine. As I was studying their API, it was really convenient to be able to consult them directly! I coded straight through the next 24 hours, except for a brief sleep from 11pm to 2:30am. Time was so tight that I was coding on the subway on my way home.

The app I developed, the Phanai SoundCloud iPhone app, showcases some of the music I’ve produced over the years, which can be viewed at SoundCloud here. To build the iPhone version, I used SoundCloud’s API to log into my account, iterate through all my tracks, and retrieve information such as the artist name, song name and album artwork. I also leveraged the iPhone’s touch screen to let the user scrub (seek) to any position in a song by swiping along an audio waveform that represents the recorded track. See above for an actual screenshot of the app.
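The waveform scrubbing comes down to a simple linear mapping from the touch’s horizontal position to a playback time. Here’s a sketch in Python with hypothetical names (the actual app was written against the iPhone SDK in Objective-C):

```python
def touch_to_seek_time(touch_x, waveform_width, track_duration):
    """Map a touch's x-position on the waveform view to a playback time.

    The mapping is linear: halfway across the waveform means halfway
    through the track. The fraction is clamped so a swipe that drifts
    past the view's edge still yields a valid seek position.
    """
    fraction = touch_x / waveform_width
    fraction = max(0.0, min(1.0, fraction))  # clamp to [0, 1]
    return fraction * track_duration

# A touch 160 pts into a 320-pt-wide waveform on a 240-second track:
seek = touch_to_seek_time(160, 320, 240)  # 120.0 seconds
```

The clamping matters in practice, since touch tracking keeps reporting positions even after the finger leaves the view’s bounds.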

I’m happy to report that I won the iPhone category of the competition and was also a finalist in the overall “Winning Hacks”. I walked away with a new iPod touch, provided by Tapulous, and a free online course at Berklee Music. Special thanks to Hannes, Johannes and Dave from SoundCloud, and to the organizers of this event.

There’s no doubt that radio apps such as Pandora and Last.fm are hot, but radio station aggregator apps have their place too, especially when you want to tune into specific programming in particular parts of the world. I’m happy to announce my partnership with Global International Radio Technologies, where I’m doing iPhone development for a streaming radio application called Grab Radio.

The first edition of the app will focus on radio stations in Ireland that support digital streaming. It will include features such as “Grabbing”, where you can purchase a song you’re listening to on Grab Radio. You can also “Tag” (bookmark) a song for purchase at a later point.

I’m particularly excited about working on a MapKit implementation in which you can select radio stations that appear on a map, in this case of Ireland. Future editions could focus on other countries or geographical regions around the world.
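The core of the map feature is deciding which stations fall inside the visible map region, which MapKit describes as a center coordinate plus a latitude/longitude span. The real implementation is Objective-C with MapKit annotations; this Python sketch (with made-up station data) just illustrates the underlying geometry:

```python
# Illustrative sketch, names hypothetical: given the visible map region
# (center + span, MapKit-style), keep only the stations inside it so each
# can be shown as a selectable pin.

def stations_in_region(stations, center_lat, center_lon, lat_span, lon_span):
    """Return the names of stations whose coordinates fall inside the region."""
    visible = []
    for name, lat, lon in stations:
        if (abs(lat - center_lat) <= lat_span / 2 and
                abs(lon - center_lon) <= lon_span / 2):
            visible.append(name)
    return visible

# Made-up Irish station coordinates for illustration:
stations = [
    ("Dublin FM", 53.35, -6.26),
    ("Cork FM",   51.90, -8.47),
    ("Galway FM", 53.27, -9.05),
]
# A region centered over Dublin, spanning about one degree each way:
visible = stations_in_region(stations, 53.35, -6.26, 1.0, 1.0)
```

Zooming or panning the map just changes the center and span, and the visible set of pins updates accordingly.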

It feels like it’s been a long time coming, but I’m happy to report that the Andrew Swaine iPhone app is now live on iTunes here! As you may recall, this app was originally slated for release in May, but a longer-than-expected approval time (over three weeks) and a pending OS 3 release made me decide to pull it. I’ve since made the OS updates, and the approval time was a touch over two weeks this time around. The Andrew Swaine app is actually the first in a series of photography apps I’m now working on.

The time has come: developers from all corners of the globe are converging on San Francisco in advance of Apple’s Worldwide Developers Conference 2009, running June 8–12. For one week, the Moscone Center will be the mecca of all things Apple as it relates to developing software on Apple platforms, which of course includes iPhone Cocoa Touch development.

I’ve had the benefit of being here for almost a week already, hanging with friends and family in the Bay Area and following through on a few connections here. There’s been a lot of hype leading up to the conference, and I’m doing my darndest not to get too caught up in things, but already I’m reading tweets of attendees planning to get in line at 4am for the 10am keynote, and Steve Jobs isn’t even going to be in attendance! Is it me, or is this nuts? Or am I running the risk of not getting into the standing-room-only hall if I don’t come early enough?

There are plenty of technical issues I hope to get answers to in the next week, as well as, hopefully, insights on more abstract ones. If you’re so inclined, you can catch me on Twitter throughout the week; I’m sure there will be no shortage of play-by-play updates from WWDC land.

This is totally awesome. I sent an ad-hoc beta version of the Young Twinn iPhone app to Young Twinn’s management, and Twinn is already raving about it on Twitter, which, fittingly, can be viewed in the Young Twinn iPhone app! Given how long application approval can take, can you say an-ti-ci-pa-tion?

I first met Young Twinn four years ago through some mix work I did for him and his management company, Undertaker Entertainment. This kid is a real talent with definite crossover appeal. He’s based in Houston and is definitely worth checking out. When Undertaker was looking to do an iPhone app, I of course seized the opportunity: I had other music-marketing apps in the works and a good ongoing relationship with the company. What made the project especially desirable was that Young Twinn’s management has done a great job with his marketing and branding, so there was plenty of content for me to pull from.

Some of the cool features in this app that don’t seem to be present in other music marketing apps are support for RSS feeds from YouTube and Twitter. The great thing about this is that I didn’t have to do any server-side development. Soon I hope to have Picasa and Flickr support as well. The only reason this app isn’t released yet is that I have to make sure it’s iPhone OS 3.0 beta compatible, which is now a requirement for submission to the iTunes store. The video above shows screenshots of the app; I couldn’t do a motion capture of the iPhone simulator because the simulator doesn’t support audio playback or YouTube embeds in WebKit.
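The reason no server-side work was needed is that RSS is a fixed XML format the client can fetch and parse on its own. Here’s a sketch of that parsing step in Python (the actual app does this with the iPhone SDK’s XML facilities; the feed contents below are made up):

```python
import xml.etree.ElementTree as ET

# Illustrative sketch: YouTube and Twitter both expose feeds, so the app
# can fetch and parse them entirely on the device. Sample feed is made up.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Young Twinn Updates</title>
    <item><title>New single drops Friday</title><link>http://example.com/1</link></item>
    <item><title>Show in Houston tonight</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def parse_rss_items(rss_xml):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

items = parse_rss_items(SAMPLE_RSS)
```

Since the feed providers keep the content fresh, the app stays up to date with zero infrastructure on my end.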

For years I have known fashion photographer Andrew Swaine; he has done photo shoots for recording artists I’ve worked with, namely Iyeoka Okoawo and Omega Red. When I was trying to choose a “first” iPhone application to submit to the Apple iTunes store, I wanted to rapidly develop one that would let me explore the inner workings of the iPhone SDK. As I’m focused on creating marketing-type iPhone apps, it seemed logical to go with Andy, since he had high-quality content on hand.

Although this application appears relatively simple, many of the implementation details were decidedly non-trivial. As an iPhone developer, you have to be mindful of the device’s limited system resources. Since Andy had over 150 photos to display, I had to create a memory buffering scheme to load them; otherwise the app could max out its memory and crash, which of course is something to avoid.
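The buffering idea can be sketched as a sliding window: keep only the photo on screen and a few neighbors in memory, and release everything else. This Python sketch is illustrative only (the real implementation is Objective-C, and the class and names here are hypothetical):

```python
# Illustrative sketch: instead of holding all ~150 full-size photos in
# memory, keep only a small window around the photo currently on screen.

class PhotoWindowBuffer:
    def __init__(self, photo_count, window=2, loader=None):
        self.photo_count = photo_count
        self.window = window                 # photos kept on each side
        self.loader = loader or (lambda i: "photo-%d-bytes" % i)
        self.loaded = {}                     # index -> decoded image data

    def show(self, index):
        """Load the current photo plus its neighbors; evict the rest."""
        lo = max(0, index - self.window)
        hi = min(self.photo_count - 1, index + self.window)
        keep = set(range(lo, hi + 1))
        for i in list(self.loaded):
            if i not in keep:                # release memory outside the window
                del self.loaded[i]
        for i in keep:
            if i not in self.loaded:
                self.loaded[i] = self.loader(i)
        return self.loaded[index]

buf = PhotoWindowBuffer(photo_count=150, window=2)
buf.show(0)      # loads photos 0-2
buf.show(75)     # loads 73-77 and releases 0-2
```

With a window of two on each side, at most five photos are ever resident, no matter how large the collection grows.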

As a C++ developer, I found learning Objective-C relatively straightforward, although admittedly there were some new ways of doing things I needed to get used to. It turns out that many of the biggest hurdles new iPhone developers face relate to the iPhone App Store itself. There’s an obtuse sequence of steps required to provision and digitally sign your application, and if you mess up any part of the process, you’re up the proverbial creek! It’s amazing how many times I came across the word “voodoo” describing this process when I was researching it on the web.

Also frustrating is the approval process for application submission to iTunes, for which there are no formal guideline requirements. Unfortunately, as of this post the application above is not yet available on iTunes, and I submitted it 12 days ago! The only feedback I’ve gotten so far is that things are “requiring unexpected additional time for review”. Fingers crossed that it will be approved the first time through, because I would hate to go through another round of this. In the meantime, I do have the video reel above: a screen motion capture of the application running in the iPhone simulator provided by the iPhone SDK. I promise, no pixels were photoshopped in this process!