Archive for the 'iPhone' Category

I was fortunate enough to be invited to speak at the TED@London salon event at the British Library back in April. This is a TED-curated event that is part of their global talent search. The best speakers will be invited to speak at TED2013 in California. It was a fun and thought-provoking evening and it was great to meet so many of the speakers, organisers and attendees.

I spoke about my mobile art apps. You can see the talk here – and now below. If you enjoy it please vote and leave some lovely comments 😉

In the coming years, once Augmented Reality glasses are commonplace, I hope to see apps that allow for creative user-generated content to be distributed in virtual space. These tools would give people the opportunity to virtually build sculptures, paint on walls and leave trails using gesture and voice as they wander through cities. Others plugged into the same AR channel would see these digital artefacts seamlessly attached to the places they were created. Apps such as TagDis have explored this area, but, as with all mobile AR, there is a sizable barrier in the act of navigating to and using an app on a handheld device. Ubiquitous AR experienced via a head-mounted display could transform whole cities into virtual art galleries.

Konstruct explores this vision by investigating generative art in an Augmented Reality environment. It is a sound-reactive AR experience for the iPhone that allows the user to create a virtual sculpture by speaking, whistling or blowing into the device’s microphone. A variety of 3D shapes, colour palettes and settings can be combined to build an endless collection of structures. Compositions can be saved to the device’s image gallery.
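For the curious, the core mechanic can be sketched in a few lines of plain C++. This is an illustration only, not Konstruct’s actual code: the `Shape` struct and `buildSculpture` function are hypothetical names, and the real app drives far more parameters than scale. The idea is simply that each microphone amplitude sample above a threshold spawns a shape whose size tracks the input level, so louder sounds build bigger structures.

```cpp
#include <cmath>
#include <vector>

// One virtual shape placed in AR space (hypothetical structure for illustration).
struct Shape {
    float x, y, z;   // position along the build path
    float scale;     // driven by microphone level
};

// Map a stream of mic amplitude samples (0.0 - 1.0) to a stack of shapes:
// quiet samples are skipped, louder ones spawn larger shapes.
std::vector<Shape> buildSculpture(const std::vector<float>& micLevels,
                                  float threshold = 0.1f) {
    std::vector<Shape> shapes;
    float height = 0.0f;
    for (float level : micLevels) {
        if (level < threshold) continue;      // ignore near-silence
        Shape s;
        s.x = std::sin(height) * 0.5f;        // gentle spiral placement
        s.y = height;
        s.z = std::cos(height) * 0.5f;
        s.scale = level;                      // louder input -> bigger shape
        shapes.push_back(s);
        height += 0.2f;                       // grow the structure upwards
    }
    return shapes;
}
```

In the app itself the equivalent loop runs every frame against the live microphone buffer, with the user’s chosen shape set and palette deciding what each spawn looks like.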

Konstruct is a free app available for the iPhone 3GS and 4 running iOS 4+. A version for the iPad 2 is planned for the coming months.

As for credits: I came up with the concept and developed the app. Juliet Lall designed the user interface and all branding, including the Konstruct website.

Users are encouraged to email virtual sculptures to konstruct[at]augmatic.co.uk to be featured in the Flickr gallery.


So on to the technology. I used the new iPhone tracking library String. I’ve been experimenting with the beta version of String for a few months now and it’s surprisingly powerful – I managed to fit over 100,000 polys into a scene without any noticeable dip in frame rate. The tracking is also impressively accurate and stable. I’m told that this has improved considerably with the latest iteration of the beta SDK, so I’ll be updating the app very soon.

There are currently two approaches to developing String apps: Unity, and straight-up OpenGL ES. With the Unity option, developers can be up and running within minutes. Being a sucker for punishment I decided to take the OpenGL ES and Objective-C route. This was my first proper experience using OpenGL. Coming from a Flash background, where most of the hard work is wrapped up for you by libraries such as Away3D and Papervision3D, it was a bit of a learning curve. Ultimately though, it was a great experience to learn about and implement buffer objects, matrices, normals and so on. I look forward to learning more about OpenGL and GLSL for future projects.
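For anyone making the same jump from Flash, the “normals” part that libraries normally hide from you boils down to a cross product per triangle. Here is a self-contained sketch in plain C++ (no OpenGL calls, and the `Vec3`/`faceNormal` names are mine, not from any SDK) of computing the unit face normal you would pack into a buffer object alongside each vertex:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 subtract(const Vec3& a, const Vec3& b) {
    return {a.x - b.x, a.y - b.y, a.z - b.z};
}

Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y,
            a.z * b.x - a.x * b.z,
            a.x * b.y - a.y * b.x};
}

// Unit-length face normal of triangle (p0, p1, p2), wound counter-clockwise.
Vec3 faceNormal(const Vec3& p0, const Vec3& p1, const Vec3& p2) {
    Vec3 n = cross(subtract(p1, p0), subtract(p2, p0));
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return {n.x / len, n.y / len, n.z / len};
}
```

Once you have these per-face (or averaged per-vertex) normals, they go into a vertex buffer next to the positions and the shader uses them for lighting – exactly the plumbing Away3D used to do behind the scenes.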

OK lots of things to mention so I thought I’d be lazy and lump them into one blog post.

I’ve spent the last month beta testing a new iPhone tracking technology called String. I’ll leave the details for a future blog post, but I will say that it is one of the fastest and most accurate mobile trackers I have seen to date, if not the fastest. Also, importantly, there is no license fee for artists and designers wanting to experiment with and publish apps using the tech. It’s due for release any day now. Definitely one to watch.

I’m currently experimenting with generative art in AR space using OpenGL ES. Should the app be accepted into the store, it will be included in the Square Art exhibition. The theme is Blank Canvas, which ties in freakishly well with what I have in mind.

I just had an article published on the Young Creative Council (AKA YCC London) blog. Every week they get a guest author to write about whatever they like. I used it as an opportunity to write about my work, my inspirations and my thoughts on AR in terms of the current state of play, the future and my approach to it as a platform for art. I don’t tend to use my blog as an outlet for those sorts of ramblings so it was nice to write down my thoughts on these matters for a new audience.

You can read the article here. Big thanks to Laura and Emma for sorting this out.

I’ve been asked to both speak at and sit on the advisory board for the AR Summit. This event is slated to be the biggest augmented reality conference in the UK and is taking place in London in June. If you don’t fancy the presentations, there will be an exhibition hall showcasing the latest and greatest in the world of AR. Definitely worth a visit.

It’s a huge honour to be given this opportunity. I’m looking forward to having a say in the organisation of the event. Hopefully I can inject a creative edge into the proceedings.

I was featured in this month’s Web Designer magazine. The article is titled “Whatever happened to Augmented Reality?”. I attempt to answer this question and give my thoughts on current trends, mobile AR and the future. I’ve been informed that I also had an interview published in the iPhone special of Computer Arts Projects about my iPhone app Fracture last month. I didn’t actually manage to get a copy so if anyone has the issue and a few spare minutes I’d really appreciate a scan.

I’m both excited and relieved to announce that my first iPhone app has been approved and is now available in the App Store.

Fracture is the latest in a series of cubism-based applications that include Self Portrait? and the WiiMote portrait generator. This time, however, you’re able to paint using your own photos whilst on the move. You can also save your paintings to your image gallery.

The basic premise behind cubism is to represent the subject from a range of viewpoints in a two-dimensional painting. The subject is fragmented and reassembled to suggest an abstract 3D form. Aesthetics employed by masters such as Pablo Picasso and Georges Braque have been studied and recreated in this painting tool. I’ve particularly attempted to recreate the style of Analytical Cubism, which was developed between 1908 and 1912.
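Fracture’s actual algorithm is more involved than I can fit here, but the fragment-and-reassemble idea at its heart can be sketched in a few lines of C++. Everything below is illustrative – the `Fragment` struct and the deterministic wobble are my own simplifications, not the app’s code: split the source photo into tiles, then redraw each tile slightly displaced, as if seen from its own viewpoint.

```cpp
#include <vector>

// A rectangular fragment of the source photo and where it is redrawn
// (hypothetical structure for illustration).
struct Fragment {
    int srcX, srcY;   // tile origin in the source image
    int dstX, dstY;   // displaced origin in the painting
    int size;         // tile edge length in pixels
};

// Split a width x height image into square tiles and displace each one,
// echoing cubism's fragment-and-reassemble approach. The offset here is
// a simple deterministic wobble; a real tool would also vary rotation,
// scale and shading per fragment.
std::vector<Fragment> fragmentImage(int width, int height, int tile) {
    std::vector<Fragment> out;
    for (int y = 0; y + tile <= height; y += tile) {
        for (int x = 0; x + tile <= width; x += tile) {
            int wobble = ((x / tile + y / tile) % 3) - 1;  // -1, 0 or 1
            out.push_back({x, y, x + wobble * 2, y + wobble * 2, tile});
        }
    }
    return out;
}
```

Layer enough of these displaced fragments with muted, near-monochrome colours and you start to get the faceted look of the Analytical period.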

Portrait of Ambroise Vollard – Picasso

Violin and Candlestick – Braque

Fracture was built using openFrameworks. Several members of the OF community, including Memo Akten and Zach Gage, collaborated to write the ofxiPhone addon and a wide range of wrapper libraries. This allows OF users to build iPhone and iPad apps without having to learn (much) Objective-C. It was great to be able to use my existing skillset and jump straight into iOS development.

I did, however, have to use Objective-C when it came to developing the interface. This was by far my biggest challenge. I found that I couldn’t take advantage of many of the tutorials and standard Apple interface solutions, such as the Navigation View, because OF projects are set up differently from standard Obj-C projects.

So install it, have a go and please email your best paintings to fracture [at] augmatic.co.uk to be featured in the Fracture Gallery. And if you like it, don’t forget to comment.


***** EDIT – 28/11/10 *****

I’ve been really surprised by the level of interest in Fracture. It has been featured by the following:

I’m currently knee-deep in iPhone development (via openFrameworks), and about to submit my first app any day now. To keep inspired, I’ve undertaken a great deal of research into arts-based mobile apps and have collated the best of them into a new Vimeo channel – Mobile Art. So have a look and feel free to sign up if you have a Vimeo account.