One of the most interesting things Apple announced at WWDC was Core ML, a new framework designed to allow developers to embed machine learning capabilities in their apps. Among other things, a Core ML model can be trained to identify and caption real-world objects in real time, which Apple demoed via a sample app using a model trained on 1,000 object categories …


Developer Paul Haddad posted an amusing tweet testing the capability. While the screen recording showed the iPhone recognising a screwdriver, keyboard and carton, it didn’t seem to recognise one familiar object: a first-generation Mac Pro.

The Core ML-powered app mostly identified it as a speaker, though from one particular angle it decided it was instead a space heater. You can see the recording below.

Amusing as it is, my takeaway was just what an impressive job it did on the other objects. It’s going to be fascinating seeing what developers do with this kind of capability. It would, of course, be possible to train a model specifically to recognise Mac models.
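For developers curious what this looks like in practice, here is a minimal sketch of how a Core ML image classifier can be driven through the Vision framework. It assumes a compiled model has already been added to an Xcode project (the `MobileNet` class name here is a placeholder — Xcode generates a class matching whatever model you drag in):

```swift
import UIKit
import Vision
import CoreML

// Classify a still image with a bundled Core ML model.
// `MobileNet` is a stand-in for any image-classification model
// added to the project; Xcode auto-generates its Swift class.
func classify(image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Observations arrive sorted by confidence; report the top label.
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

For live camera classification, as in Haddad’s recording, the same request would be run against each frame from an `AVCaptureSession` rather than a still image.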

Via TNW
