In 2006, researchers at the NHK demonstrated in Las Vegas a transmission of Ultra High Definition Television with 22.2 surround sound. The broadcast ran from Tokyo to Osaka over an IP network at 1 Gbps. Uncompressed, the sound signal alone ran at 20 Mbps, while the video signal ran at 24 Gbps.

Developed by the NHK in Japan, Ultra HD has 4,000 scanning lines compared to just 1,080 for the current broadcast system.

In 2007, SMPTE approved Ultra HDTV as a standard format.

The BBC will broadcast the London Summer Olympics in Ultra HD.

Each frame is about 33 megapixels.
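As a quick sanity check on those numbers (assuming the 7680×4320 Super Hi-Vision resolution, 24-bit color, and 30 frames per second — my assumptions, not figures from the demo itself), the math does line up:

```java
public class UhdBitrate {
    public static void main(String[] args) {
        long width = 7680, height = 4320;   // assumed Super Hi-Vision resolution
        long pixels = width * height;       // pixels per frame

        // Roughly 33 megapixels per frame
        System.out.println("Megapixels per frame: " + pixels / 1_000_000.0);

        long bitsPerPixel = 24;             // 8 bits per RGB channel, uncompressed
        long fps = 30;                      // assumed frame rate
        double gbps = pixels * bitsPerPixel * fps / 1e9;

        // Comes out to roughly 24 Gbps, matching the reported video bitrate
        System.out.println("Uncompressed video: " + gbps + " Gbps");
    }
}
```

At 60 fps the same math would double to nearly 48 Gbps, so the 24 Gbps figure suggests a 30 fps stream.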

I can see this as the digital IMAX, but more. The 100-degree viewing angle allows for an image that can simulate human perception. It’s quite hard to describe: the image is huge, and the experience is almost realistic.

This type of imaging is a step toward building that holodeck. The amount of detail the resolution provides will show more information for computers to see, though the current limitation is having enough processing power to handle all that data.

I have no name for this yet, but over the past few weeks I’ve been trying to recreate what I built five years ago. Sadly, through the updates Apple has made, it is no longer possible with their technology. Streaming video from one place to another seems simple enough, and is actually simpler today than it was when I was building it. But getting it to play nice with Java is another thing completely.

I was using both VLC and QuickTime Broadcaster to send video from one place to another, without the aid of a server, to be imported into a Processing applet, and that didn’t turn out too well. Apple has discontinued further development of QuickTime for Java in favor of AVFoundation, so the QuickTime that we all grew up with no longer exists. The core of AVFoundation is used to run movies in iOS, iMovie, and FCPX, which is why iMovie runs faster and somewhat better than FCP 7. Oh, the pains of 64-bit processing.

The QT Broadcaster stream, though encoded in Motion JPEG, which Java recognizes, still starts with a QuickTime header. So when the stream is imported into the sketch, Java does not see it as Motion JPEG data but as a QuickTime streaming file. It would take a decoder to transcode the stream into Motion JPEG that Java accepts.
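For anyone fighting the same problem, one workaround idea is to ignore the container entirely and scan the raw byte stream for the JPEG markers that bracket each frame (every JPEG starts with 0xFFD8 and ends with 0xFFD9). This is only a minimal sketch of that idea, not the decoder I actually needed, and it skips real-world complications like restart markers and embedded thumbnails:

```java
import java.util.ArrayList;
import java.util.List;

public class MjpegScanner {
    // Extract individual JPEG frames from a raw byte stream by locating
    // the SOI (0xFFD8) and EOI (0xFFD9) markers that bracket each frame.
    // Anything outside a SOI..EOI pair (like a container header) is skipped.
    public static List<byte[]> extractFrames(byte[] stream) {
        List<byte[]> frames = new ArrayList<>();
        int start = -1;
        for (int i = 0; i + 1 < stream.length; i++) {
            if ((stream[i] & 0xFF) == 0xFF && (stream[i + 1] & 0xFF) == 0xD8) {
                start = i;                      // found the start of a JPEG frame
            } else if (start >= 0 && (stream[i] & 0xFF) == 0xFF
                       && (stream[i + 1] & 0xFF) == 0xD9) {
                // copy bytes from SOI through EOI inclusive
                byte[] frame = new byte[i + 2 - start];
                System.arraycopy(stream, start, frame, 0, frame.length);
                frames.add(frame);
                start = -1;
            }
        }
        return frames;
    }
}
```

Each extracted frame could then be decoded as an ordinary JPEG, which is exactly the format Java (and Processing) already understands.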

Installing a Lion server will not help. I learned that the hard way.

So currently I’m using an application called EvoCam. It’s standalone software that recognizes practically any camera I plug in and offers a ton of options, from motion-capture recording and streaming video to grabbing stills and sending them through an AppleScript or Automator workflow. I’ve been having FTP connectivity issues with the ITP server, so I’m uploading this to my personal site. So far it’s been working great, save for the times I get disconnected on the remote end for one reason or another.

Refresh the page to get the latest image.

Once the image was finally free on the web, it was easy enough to access it from Processing. I’ve always wanted a holodeck, no matter how crude it may be. I think the fact that you are walking through a given space that changes would change how we interact with man-made images or environments. So this is a very crude version of how I would imagine things. More updates to come.
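If you want to pull the same still into your own sketch, the one gotcha is caching: fetching a fixed URL can keep returning a stale picture. A common trick is to append a throwaway timestamp parameter so every request looks new to any cache in between. A minimal helper (the URL below is a placeholder, not my actual feed):

```java
public class FreshImageUrl {
    // Append a cache-busting timestamp so each request bypasses
    // intermediate HTTP caches and returns the newest still frame.
    public static String freshUrl(String baseUrl, long timestampMillis) {
        char sep = baseUrl.contains("?") ? '&' : '?';
        return baseUrl + sep + "t=" + timestampMillis;
    }

    public static void main(String[] args) {
        // In a Processing sketch, a URL built like this could be passed
        // to loadImage() inside draw() to refresh the still on each redraw.
        String url = freshUrl("http://example.com/webcam.jpg",
                              System.currentTimeMillis());
        System.out.println(url);
    }
}
```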

There are views that more cameras out there could mean two things. One is that we can finally get a grasp on what’s going on in the world. No longer can dictators and criminals hide from us. For once we can form our own opinions on subjects that before took an army of journalists to capture and analyze. Then of course there’s the downside: who is behind the cameras is the scary part. We already live partly in that world. Our every movement is captured and stored on servers for who knows how long.

Cameras enabled their creators to preserve their time and space, and they continue to do so today. The 2011 Japan earthquake was so devastating that we were getting live images as the tsunami swept through the northern region. Users shared videos of the quake as it was happening, and for the first time, the world could see a terrible disaster live.

I, for one, would like to be optimistic about where technology is leading us in terms of cameras. I long for images of the old cities back home. I wish I could re-create the city the way it was before World War II, or even better, re-create Old Manila during the Spanish era. We would be able to take a walk into history, so to speak, and understand and experience the place and time where my grandparents and great-grandparents lived. Like a living holodeck based on information from the past.

Cameras are something we fear today. But they are something our descendants will look for in the future.