True, the word "metadata" is probably as overused as it is misused (both verbally and technically), but I've literally been dreaming about the word for the past six months as we prepare to unveil our foundation tools for new forms of metadata and target them specifically at new users.

One of our tools, LiVE PLAY, which a lot of people already know about, is slated for release on the Apple iTunes Store in January. But today's efforts were a bit too exciting not to share with the community.

"The major component of the system demonstrated in this video is the LiVE PLAY Server application. It's a very powerful application with the ability to catalog, organize, and serve the clips to multiple iPads (23 is our current inventory of iPads, so that's as many as we could test), as well as capture custom metadata and create exports in the form of databases, PDFs, thumbnail galleries, and even online streaming files like the PIX System.

The list of recommended hardware components will be released with the software package in January so users can build their own system with Li-tested components.

This additional link shows a different configuration:
Below is an example of a single :60 clip played simultaneously on all 23 iPads. This simulates a take that has just been captured, with every iPad user viewing the same clip at the same time. Only each player has its own "offset," dictated by the moment the user chooses to hit "play," "pause," or "scrub."
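The per-device "offset" idea described above can be sketched in a few lines: every iPad holds the same clip, but each keeps its own independent transport state. This is a minimal illustration only; the class and method names are hypothetical and not the actual LiVE PLAY API.

```python
import time

class ClipPlayer:
    """Hypothetical sketch of one iPad's transport state: same clip on
    every device, but play/pause/scrub position is purely local."""

    def __init__(self, duration_s=60.0):
        self.duration_s = duration_s   # the :60 clip
        self.offset_s = 0.0            # playhead position within the clip
        self._started_at = None        # wall-clock time of the last "play"

    def play(self):
        if self._started_at is None:
            self._started_at = time.monotonic()

    def pause(self):
        if self._started_at is not None:
            self.offset_s = self.position()
            self._started_at = None

    def scrub(self, to_s):
        # Jumping the playhead also pauses, like dragging a scrubber.
        self.offset_s = max(0.0, min(to_s, self.duration_s))
        self._started_at = None

    def position(self):
        if self._started_at is None:
            return self.offset_s
        return min(self.offset_s + time.monotonic() - self._started_at,
                   self.duration_s)
```

Two players given the same clip can then sit at completely different offsets, which is exactly what the simultaneous-playback demo shows.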

Furthermore, notice how I can go in, add some metadata tags to the clip (in the form of Performance notes in this case), and the LiVE PLAY Server saves and sends that information to all 22 other iPads in under 5 seconds. This allows the entire production to receive notes from a single person or department without any delay. And yes, a lot of the footage in this demo is Epic, which is so clean at the source that it improves the fidelity of a hinted GOP, making iPads even more valuable as an immediate viewing device."
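The tag propagation described above is a classic one-server, many-clients fan-out: the server is the single source of truth, and every registered device gets a push when a tag lands. Here is a hedged sketch of that shape; all names are illustrative, not the real LiVE PLAY Server interface.

```python
class MetadataServer:
    """Illustrative fan-out: one authoritative tag store that pushes
    each new tag to every registered device (hypothetical names)."""

    def __init__(self):
        self.tags = {}        # clip_id -> list of (category, note)
        self.devices = []     # connected iPad stand-ins

    def register(self, device):
        self.devices.append(device)

    def add_tag(self, clip_id, category, note):
        self.tags.setdefault(clip_id, []).append((category, note))
        for device in self.devices:   # push immediately; don't wait to be polled
            device.receive(clip_id, category, note)


class Device:
    """Stand-in for an iPad client that accumulates received tags."""

    def __init__(self):
        self.inbox = []

    def receive(self, clip_id, category, note):
        self.inbox.append((clip_id, category, note))
```

With 23 registered devices, one `add_tag` call on the server leaves the same Performance note in every iPad's inbox, which is the behavior the demo shows.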

Michael Cioni of LightIron Digital recently announced LiVE PLAY:
"When the iPad was originally announced a while back, my friends Steve Freebairn, Chris Peariso, and I got really excited about what we believed would become the heart of the best video playback system ever. Since then, more friends and colleagues like Deanan DaSilva, Dean Devlin, David Fincher, and Stevo Brock have been working together to increase the potential the iPad brings to the set.

The main reason we got excited was not the in-hand HD quality of the iPad, nor the mobility or size of the device, but rather the potential for metadata. The iPad is still in its infancy, and there are a lot of ideas we cannot yet incorporate until iOS 4 is released later this year. But until then, there is a product we have been working very hard on that is starting to get used on productions that already take advantage of our on-set data system, OUTPOST. We call this new system LiVE PLAY.

LiVE PLAY is not a monitor or a replacement for video village; rather, it is the replacement for cumbersome, slow, large, costly, and often inefficient video playback systems. LiVE PLAY can be fully automated, untethered, and inexpensive, and in some forms does not require an operator. But above all, the pillar of strength of LiVE PLAY is the logging and transferring of metadata from the set to post in near real-time. As more creatives in the production community become acclimated to progressive tools and technology (such as iPads and RED cameras), so should our ability to raise the bar, offering more control, more information, better response time, higher fidelity, more customization, and better communication in ways that are superior to previous iterations of the process.

One of the saddest things I see creatives do is switch from film or tape to the RED camera and literally change nothing else. While I'm confident they can still get a good picture, they are unfortunately throwing away so much potential that they would probably be better off sticking with film and tape in the first place. In other words, as advocates for a changing media ecosystem, we need to ensure that the tools and processes we are developing are not merely as good as film and tape, but unmistakably superior. While that's no easy task, and I don't pretend to have all the answers, I do think we are on the right track.

Here is a link to a video I made that demonstrates where we see the combination of meta-data and video playback coming together."

"Directors Corey Creasey and Ian Kibbey got a huge treat today on set as Roger from Light Iron showed up at call time with his lightning in a Pelican case. Called LIVEPLAY, this system is, simply put, AWESOME!

The whole thing fits in a small Pelican gun case and includes a laptop, a wireless router, a couple of Teradeks, a couple of iPads, and some BNC cable.

I am sure it has been described here before, but if you missed it: LIVEPLAY takes the record cue from the camera and starts ingesting and transcoding the take on the fly, finishing the process 5 seconds after the camera has cut. Then it sends the encoded file through the freakin' air to the waiting iPads. Once the take hits the iPads, it populates the LIVEPLAY interface and is ready for viewing.

The directors, client, agency, wardrobe, makeup, camera department, script supervisor, or whoever else can then play back that take to their heart's content. While watching it, they can take notes and click the UI's pre-populated edit-decision fields (good take, bad take...), all from the comfort of their director's chairs.

Ian and Corey simply love it, and their producer David Lambert from Terry Timely in San Francisco was blown away by the amount of time it saved us on set, as these guys had all the footage from the day literally at their fingertips.

I will include a couple of clips below of Ian at the controls. I think when this is made available for purchase or license in the near future, it will be a must-have for busy DITs."


More from Michael Cioni of LightIron Digital:
"LiVEPLAY Server is a sister-application that drives the iPad integration.
Using the UI of LiVE PLAY Server, permissions can be set to ensure no one can overwrite anyone else's comments. You can also control who can view what, and whether individual users can edit at all. There are control permissions for which clips are enabled for which user, watermarking tags, deletion permission, notation control, etc.
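The per-user, per-clip controls described above amount to an access-control table: a user, an action (view, edit, delete), and the clips that action is granted for. Here is a minimal sketch of that shape; the field and method names are assumptions, not LiVE PLAY Server's actual model.

```python
class Permissions:
    """Hypothetical per-user, per-clip access table mirroring the
    view/edit/delete controls described above."""

    def __init__(self):
        # user -> action ("view" | "edit" | "delete") -> set of clip ids
        self.rules = {}

    def grant(self, user, action, clip_id):
        self.rules.setdefault(user, {}).setdefault(action, set()).add(clip_id)

    def allowed(self, user, action, clip_id):
        # Deny by default: a missing user, action, or clip means "no".
        return clip_id in self.rules.get(user, {}).get(action, set())
```

The deny-by-default check is what keeps one department from overwriting another's comments: an edit simply isn't applied unless that user was granted "edit" on that clip.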

LiVEPLAY server is the control that also enables XML, CSV, and PDF exporting of the metadata. So you can setup your users names, passwords, and security options for complete control of numerous iPads from one source, organize all the metadata tags and then export them for use in pre-existing databases or even NLEs.

LiVEPLAY Speed & Quality
An average-cost WiFi base station is capable of over 100 Mb/s (megabits per second). This is ample for stable, high-fidelity pictures to stream to multiple iPads (you may have seen me stream to over 20 iPads), with enough headroom to capture more clips and send/receive the LiVE PLAY metadata tags.

The pictures LiVE PLAY delivers, in conjunction with any number of capture devices (such as the Teradek CUBE), are good enough to judge not only critical focus but also basic color choices, sync, and detail in the set dressing. The files we capture stream with stable reliability at around 2-3 Mb/s (2-3 megabits per second). Attached are frame grabs from the actual iPad from Dino's shoot that demonstrate the quality of the playback at 2.5 Mb/s. The JPGs add compression as well, but you get the idea.
If it were a concern, you could always increase the bit rate to 4 or 5 Mb/s and see further image improvement, but we have had no complaints from users regarding image quality at 2.5 Mb/s, and we can easily stream to a dozen devices at this bit rate if we need to.
"
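The headroom claim in the quoted numbers is simple arithmetic worth making explicit: per-device stream rate times device count, subtracted from the base station's stated throughput. The helper below just encodes that back-of-the-envelope check (real-world WiFi contention would eat into the nominal figure).

```python
def wifi_headroom_mbps(num_devices, stream_mbps, link_mbps=100.0):
    """Remaining capacity after streaming to num_devices iPads, using
    the nominal figures quoted above (no contention modeled)."""
    used = num_devices * stream_mbps
    return link_mbps - used

# A dozen iPads at 2.5 Mb/s uses 30 Mb/s, leaving roughly 70 Mb/s
# for ingesting new clips and metadata-tag traffic.
```

Even the 23-iPad demo at 2.5 Mb/s would nominally use only 57.5 Mb/s of a 100 Mb/s link, which is consistent with the quoted claim that tag traffic and new clip captures still fit.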


