MojoCon Ireland is the first international conference focusing on mobile journalism, mobile filmmaking and mobile photography all in one event. The future is mobile, the future is now.

This is a two-day event with a list of guest speakers that reads like a Who’s Who of mobile journalism and filmmaking. I’m proud to be included among their ranks. If you’re interested in mobile media making, this is the event you’ve been waiting for.

My session “Smartphone Filmmakers” is scheduled for March 27th, at 2pm. I’ll be speaking alongside Conrad Mess (award-winning filmmaker), Michael Koerbel (award-winning filmmaker), Neill Barham (creator of FiLMiC Pro), and Ricky Fosheim (producer/director/editor of one of the first feature films shot entirely on an iPhone). This will be a fantastic session, and I’m hoping to meet some of my international readers.

Visit the MojoCon website to learn more about the event, and get your tickets early to avoid a sellout. See you in Ireland!

Write Screenplays on Your iThings with Fountain

This week, we’re going to look at how Fountain is making it easier to write screenplays on your iPhone, iPad, and iPod touch (as well as your Android device, Blackberry, Windows phone, Palm Pilot, Mac, PC, Apple II, and TRS-80 — yes, really).

When the iPhone was introduced seven years ago, the only real writing tool was the built-in Notes app, a bare-bones, nearly featureless text editor. I tried writing a few script pages, complete with all the necessary complex screenplay formatting, in that app. I quickly concluded that writing a screenplay in any text editor was not only a horrible idea, it was also a very horrible idea.

Today the App Store offers several mobile screenwriting apps that provide feature-rich authoring environments designed specifically for pounding out screenplay pages (e.g., Celtx Script at $4.99, Storyist at $9.99, and Final Draft Writer at $39.99). All of these apps are extremely powerful, and yet I’m using them less and less. Why? Because I’ve fallen in love with writing screenplays in… wait for it… text editors.

Fountain is a comprehensive, easy-to-learn, simple-to-implement, free-to-use screenwriting syntax based on screenplay standards that already exist (most of which you already know or can learn very quickly). With this syntax, you can write your script as plain text in any text-editing app (not just screenwriting-specific programs), on any platform (yes, even a TRS-80), without having to worry about indents, margins, and other formatting particulars. And, because plain text is platform agnostic, you can bounce between apps and platforms at any time (e.g., begin writing in TextEdit on your Mac, continue in Editorial on your iPad, and finish up in JotterPad on your Android tablet). When you’re finished writing, simply feed your text through a Fountain-compatible converter, and presto! You’ve got a properly formatted script! It’s like magic, only more magical!

That was a lot to digest, so let’s break it down:

First you need to learn the Fountain syntax. Here are a few examples: Start all scene headings (slug lines) with INT, EXT, or something similar. Write action as you normally would. Keep character names in UPPERCASE. Dialogue always follows character names. Easy peasy! If you already know how to write a screenplay, you’ll pick up the Fountain syntax in about 60 seconds! A complete description of the syntax can be found on the Fountain website.
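To make those rules concrete, here is a short, made-up Fountain snippet (the scene, characters, and dialogue are my own invented example, not from the official Fountain documentation). Notice that it is nothing but plain text:

```fountain
INT. COFFEE SHOP - DAY

Maya hunches over a tablet, typing furiously with one thumb.

MAYA
You're telling me this entire script
is just a plain text file?

BARISTA
(sliding over a latte)
Welcome to the future.
```

Run that through any Fountain-compatible converter and it comes out as a properly formatted page: indented scene heading, action, character cues, a parenthetical, and dialogue.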

Next, pick the app (and platform) you want to write in. Which text editing apps do I use on my iPad? Too many to list. I try new ones all the time. A quick search in the App Store revealed well over 100 text-editing apps, each offering its own unique feature set. That’s a lot of options! Tech guru Brett Terpstra has compiled a comprehensive, interactive list of the best iOS text editors that should help make your choice a little easier.

Lastly, you’ll need a Fountain converter to transform your text into proper screenplay format. Fountain is still in its infancy, but a number of conversion tools are already available. Highland for Mac is my current favorite. It converts Fountain text into PDF files and Final Draft documents. It will even convert existing Final Draft docs and PDF files back into Fountain text – a very neat trick! Once again, I suggest you poke around the Fountain website to view their up-to-date list of conversion utilities and related applications.

Writing a script is hard enough without having to remember specific margin settings and confusing shortcuts within dedicated screenwriting applications. By using Fountain, that complexity begins to melt away. Write on!

I Screen, You Screen, We All Screen for Green Screen: Part II
(Published July 1, 2014)

Welcome back, green screeners! As you may recall, last week I explained how green screens help filmmakers composite (combine) multiple layers of imagery. Don’t remember that? Here’s a 13-word reminder (not including “presto”): Shoot actor against green screen, remove green in post, add a new background — Presto! While the green screen process sounds simple enough, there are a few technical hurdles you’ll need to jump, and thanks to some brilliant app developers, your iThings will help you jump higher.

Last week I proclaimed my love for an awesome app called Green Screener, a tool designed to measure and clearly display luminance values in order to help you light your green screen evenly (and achieve beautiful composites). Now let’s look at another problem that needs solving: shot alignment. To understand it, let’s revisit last week’s example (with a twist):

On day one, you decide to use an elevated camera to shoot the side of a cliff with a distant army marching in the valley below. On day two, you position a camera low to the ground to film a nervous-looking actor in front of a green screen. Then, in post-production, you composite the two together. You had HOPED to create a shot in which the nervous actor is now seen standing on the cliff, observing the approaching army. However, because your camera positions don’t align (one was elevated, and the other was near the ground), the composite doesn’t work at all! It’s awful! Thank goodness this was only an example… otherwise I’d have to call your mom and explain this horrible composite.

In order for this shot to have worked, you would have needed to capture both shots from a similar angle. This is shot alignment. The more aligned your shots, the better the composite will look. So, how can you ensure good shot alignment while shooting your actors against a green screen? After all, you can’t see the intended background — it’s just green. Lots of green. Greeny green green.

iDevices to the rescue! I’m happy to report that there are several tools in the App Store that will perform LIVE green screen composites right on your mobile device. How does that help? Imagine being able to load your background imagery into your iPhone or iPad, pointing your device’s camera at your actor and green screen, and then seeing your actor composited into the background — LIVE. Now, simply by moving your iDevice around your set, you can find the best angle for shooting your actor, one that perfectly matches the background. Once you’ve found the right angle, move your video camera into place, and start shooting! Awesome-sauce!

Here are two apps that can composite live green screen video (there are more in the App Store, but these are my faves):

Green Screen Movie FX ($1.99) is simple to use, with only a few controls. After launching the app, just point your device at your actor and tap the green screen background. The app instantly removes the green, replacing it with a default background. Naturally, you can load your own photo or video backgrounds from your iPhone’s photo library (which is the whole point). If you have color or luminance variations in your green screen (there shouldn’t be — didn’t you read last week’s article?), you can continue to tap areas of your green screen for the app to remove (up to four colors). A slider controls the strength of the green screen removal. Higher settings work better on poorly lit green screens, but can eat into non-green portions of your image, such as your actor’s face. The app also offers settings to change the video quality, frame rate, focus mode, and more. Green Screen Movie FX is a solid app I rely on quite frequently, but there are more fish in the sea…

Veescope Live Full ($2.99) provides far more control for fine-tuning your composite, including the ability to reposition the background (a feature that has come in handy more than once). The app also lets you lock exposure, focus, and white balance. While writing this article, I discovered an exciting feature within Veescope Live Full that I had never seen before. It’s called Light Guide, and it’s a light meter that displays luminance values as bands of gray and green. Sound familiar? It should. That’s the same function Green Screener (the app I covered last week) performs. I haven’t used Light Guide on set yet, so I can’t vouch for its accuracy, but I’m excited to give it a try. If you’re feeling frugal, look for a free version of this app called Veescope Live (no “Full”). It performs the same functions as the paid version, but watermarks your recorded videos. Wait… RECORDED VIDEOS? What chu talkin bout, Taz?

I’m recommending these apps as shot-alignment tools; however, both apps can also RECORD video. That means you can shoot live green screen composites directly to your device, then share or edit them right away. I didn’t mention this at first because the quality of the composites isn’t anywhere close to professional. That said, if you’re just looking to have a quick bit of fun, and you don’t need top-quality results, recording with these apps may be all you need! No harm in trying, right?!

And remember, shoot carefully because “it’s not easy being green screen.” Yes, I waited a week to type that. You’re welcome, universe.

I Screen, You Screen, We All Screen for Green Screen
(Published June 24, 2014)

What did you do last week? I mercilessly tore my living room apart and temporarily transformed it into a green screen (a.k.a. “chroma key”) studio in which I directed a short comedy called 2084. For as long as I can remember, my friends and I have joked about shooting 2084 on a massive soundstage with high-end gear and a huge crew — total overkill for this very silly, 3-minute short. Well, after ten years of that not happening, we finally decided to shoot 2084 with a tiny crew (four of us) and a budget just large enough to cover lunch.

While I didn’t shoot the film with my iPhone (I used a Blackmagic Cinema Camera), I did rely heavily on my iPad to improve the quality of my green screen. I’ll explain how in just a moment, but first I want to give you a quick green screen overview. Skip ahead if you’re already a green screen pro… or hate zombie jokes.

To get started, actors (or other foreground elements) are shot in front of a large green background (a “green screen”). Then, in post-production, the green areas are digitally removed, allowing the actors to be combined with other layers of imagery. This process is called compositing. For example, you might shoot your friend standing in front of a green screen looking profoundly concerned. Then, in your compositing application (e.g., Adobe After Effects, Nuke, Apple Motion, iMovie for Mac), you would remove the green, add a new background, and make it appear that your friend is standing on a cliff, observing an approaching army of mutant-spider-riding zombie pirates! (Mutant Spider-Riding Zombie Pirates copyright 2014 Taz Goldstein… maybe.) Green screen photography is a crucial component of nearly every big-budget summer blockbuster, but it’s based on the same technology that places cheesy meteorologists in front of cheesier weather maps!

Before you ask, yes, other colors can be used. “Blue screen” is quite popular, and I’ve even shot on an orange screen. That said, green is the most common. As you might expect, professional (and expensive) green screen materials, paints and surfaces typically yield the best results. However, with lots of patience and practice, you can achieve amazing results with much less expensive alternatives.

Now that you understand the basic concepts, you need to understand the biggest problem filmmakers face when shooting green screen — uneven lighting.

To perform properly, green screens must be evenly lit. Why? Because it’s much easier for editing and compositing applications to remove a narrow range of green shades than a wide range. To oversimplify: if your green screen is evenly lit across its entire surface, the computer sees it as a single shade of green, which requires less effort to remove. If your green screen is unevenly lit, with light and dark areas, the computer sees multiple shades of green, which are harder to remove. The harder your green background is to remove, the rougher and more obvious the edges of your composited subjects will appear, making everything look green screened and completely fake, as in, “Dude, that astronaut looks green screened and completely fake!” So, how can you quickly ensure your green screen is evenly lit? With your iThing, of course!
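To see why a narrow range of green is so much easier to remove, here is a minimal, hypothetical keying sketch in Python (the function name, colors, and tolerance are purely illustrative; real compositing apps are vastly more sophisticated):

```python
import numpy as np

def remove_green(frame, screen_color, tolerance):
    """Return an alpha mask: 0 where a pixel matches the green screen,
    1 where the pixel should be kept. frame is an H x W x 3 RGB array (0-1)."""
    distance = np.linalg.norm(frame - screen_color, axis=-1)
    return (distance > tolerance).astype(float)

# An evenly lit screen clusters around one shade of green, so a small
# tolerance removes all of it without eating into the subject.
screen = np.tile([0.1, 0.8, 0.2], (2, 2, 1))  # uniform green patch
actor = np.tile([0.8, 0.6, 0.5], (2, 2, 1))   # skin-tone patch
frame = np.concatenate([screen, actor], axis=1)

alpha = remove_green(frame, np.array([0.1, 0.8, 0.2]), tolerance=0.3)
print(alpha)  # the green half drops to 0; the actor half stays 1
```

An unevenly lit screen forces you to widen the tolerance, and that wider net starts catching pixels that belong to your actor, producing exactly the rough, "completely fake" edges described above.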

The simplest, fastest and coolest way to identify uneven green screen lighting is with an awesome app called Green Screener ($9.99). To get started, you simply position your iPhone or iPad in front of your set, pointing its camera towards the lit green screen (tripod mounts from Studio Neat, Square Jellyfish and Padcaster are quite helpful for this sort of thing).

The app displays your screen’s luminance values as bands of gray. If you tap one of the bands, it turns green — this is now your target luminance. As you reposition your lights, bringing each area of your green screen into the target luminance range, that area turns green on your device’s display. The idea is to turn as much of your device’s screen green as possible. Think of it as a carnival game: Move the lights, turn everything green, win a crappy stuffed toy (and an evenly lit green screen)! Yes, I’m simplifying the process a bit, but not by much. It really is a matter of setting a target, and then moving lights to match it. With an evenly lit green screen, your final composites stand a much better chance of looking silky smooth and artifact-free.

Green Screener saved my bacon during last week’s shoot — or should I say, it helped keep my bacon from needing to be saved. I truly love this app. Still not convinced of its supercalafragalisticness? Consider this: The app was invented by Per Holmes, the same guy who gave the world Shot Designer, “Hollywood Camera Work” and “Visual Effects for Directors” (a brilliant training series that I consider required viewing for all filmmakers who are serious about visual effects). Bottom line: If you own an iThing, and you light green screens, get Green Screener.

More green screen fun coming next week when we take a look at an app that will help you line up your green screen shots, and even composite green screen video right on your iDevice! Just think of all the moon landing videos you can fake right from the comfort of your own underground government compound!

Are iOS Camera Apps Helping or Hurting Filmmakers?
(Published June 3, 2014)

While that overtly sensational headline may reek of clickbait, it’s also a fair and reasonable question – one I was asked while speaking on a Photography/Videography panel at this year’s Macworld Expo in San Francisco. Jeff Carlson, the panel’s moderator (and all-around photography guru), was asking me specifically about the new wave of filmmaking apps that attempt to guide the filmmaking process and automatically correct their users’ mistakes.

It’s a tough question to answer. On the surface, it appears that these applications are replacing creative control with pre-defined scripts and algorithms… but maybe there’s more to it than that. Let’s take a closer look at these “assistive apps.”

Horizon is one such app. It’s a video camera app that ensures a properly leveled image. Using the iPhone’s internal sensors, Horizon detects when the camera is being held off balance and automatically compensates (in real time) by digitally re-leveling the image, so the horizon line always appears parallel with the top and bottom edges of your shot.

So, is this helping or hurting filmmakers? Well, on one hand, this might be a godsend for new shooters who don’t yet understand the importance of level shooting. It also prevents them from shooting in the unforgivable portrait orientation. On the other hand, Horizon’s auto-corrected rotation could deter filmmakers from using Dutch angles (intentionally un-leveled angles). Experienced filmmakers use Dutch angles to alter the audience’s perception or add style to a shot. They can also be used to emphasize (or de-emphasize) the subject. And, most importantly, they can be used to enhance cheesy action sequences (see “Batman“). Fortunately, Horizon does provide a “lock” button that temporarily turns off the auto-rotation (you just have to remember that it’s there).

Several new camera apps, like Vine, Spark, and Cinamatic, limit users to short takes. Lengthy shots are simply not allowed. Helping or hurting? Well, I could argue that these apps are teaching new filmmakers the importance of pacing. Nothing screams “newbie” more than long, uninteresting shots (except maybe poor audio). Conversely, removing the ability to shoot longer takes greatly reduces the types of stories that can be captured.

Apps like Directr and iMovie take a more hands-on approach, telling their users what subject matter to shoot (based on a variety of pre-defined scripts or storyboards). The apps then automatically edit everything together, complete with titles and music. Is this helping or hurting? These types of apps can help inexperienced filmmakers understand the basics of pacing and structure. But, as with the previous examples, they can also limit creativity. Users are exposed to structure, but not encouraged to manipulate it in any meaningful way. The end results are almost always entertaining, but somewhat homogeneous.

Here’s another example. A couple of weeks ago, I covered an app called Emulsio that stabilizes shaky, handheld footage and even removes rolling shutter artifacts. ZeroShake is another app that performs a similar function. Helping or hurting? On one hand, having the power to remove shakes may lure filmmakers into a false sense of security and dissuade them from stabilizing their cameras while shooting. On the other hand, after observing how stabilized footage increases a video’s production value, filmmakers may be inspired to properly stabilize their cameras on set, which will always yield higher-quality results (and less cropping) than stabilizing footage after it’s been recorded.
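As a rough illustration of that cropping cost, here is a back-of-the-envelope sketch (my own geometry, not code from Emulsio or ZeroShake) of how far software must punch in so a frame rotated by a few degrees still fills the output with no black corners:

```python
import math

def stabilization_zoom(width, height, max_tilt_deg):
    """Zoom factor needed so a frame rotated by up to max_tilt_deg
    degrees still fully covers the output frame (no black corners)."""
    t = math.radians(max_tilt_deg)
    return math.cos(t) + (width / height) * math.sin(t)

# Leveling a 16:9 frame that wobbles up to 3 degrees:
zoom = stabilization_zoom(1920, 1080, 3)
print(f"{zoom:.3f}")  # roughly 1.092, i.e., about a 9% punch-in
```

The punch-in grows with both the tilt and the aspect ratio, which is why stabilizing the camera on set, rather than in software, preserves more of your resolution.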

So, are these “assistive apps” helping filmmakers by allowing them to create the best possible results, or are they hurting filmmakers by removing elements of creative control?

When I started writing this article, I was convinced it would end with me ranting like a 90-year-old curmudgeon (e.g., “You spoiled kids today, with your crazy hair, and your rock and roll, and your incredibly powerful digital assistants!”). I assumed my conclusion would include phrases like “These apps are crushing creativity!” and “Get off of my lawn!”, but instead, I find myself optimistic and excited. Thanks to the current wave of assistive-filmmaking apps, and their potential to teach by example, today’s up-and-coming auteurs are being given an invaluable introduction to the basics of visual storytelling. All they need to do is pay attention! Once a filmmaker understands the basics, he or she can migrate away from these apps and move towards tools that offer greater creative control. And, of course, these apps offer non-filmmakers (who have no interest in becoming filmmakers) a means of producing terrific results. It’s all good!

That said, you should still get off my lawn.

How I Use My iPhone to Charge All My Other Batteries. Wait. What?
(Published April 29, 2014)

Want to hear a funny story? Not too long ago, I drove four hours to a last-minute, geographically isolated commercial shoot with an SUV full of professional camera and audio equipment, only to discover that most of my gear’s rechargeable batteries were nearly dead! Oh wait, I meant “a horrible story,” not “a funny story.” Thanks to a very resourceful gaffer, I survived the day (and the shoot), but vowed never to be surprised by another dead battery.

I set out to find a solution that would allow ALL of my batteries to be fully charged on short notice, but would not require them to be charging at all times. Turns out, the solution was deceptively simple, and required my iPhone — NERD BONUS!

First, I made sure each of my batteries was in its own charger (I picked up a few low-cost chargers from eBay), then plugged them all into three power strips (Accell’s Powramid Surge Protectors do a nice job of accommodating multiple oversized plugs). I plugged those power strips into a single extension cord that was, in turn, plugged into a WeMo — that’s where the magic happens.

What’s a WeMo? Belkin’s WeMo Switch is a Wi-Fi enabled, internet-connected power outlet that can be controlled from anywhere using its partner app on a smartphone or tablet. It’s like having a portable on/off switch in your pocket. Now, when I’m away at a meeting and find out that I’ll be shooting the next day, I reach for my iPhone and tap the virtual power button in the WeMo app, remotely turning on my WeMo Switch and instantly charging my batteries at home!

If you want to take your WeMo setup to the next level, check out If This Then That, a remarkable automation tool that interconnects your existing online accounts and compatible hardware. With some clever scripting, I’m guessing you could have your batteries automatically begin charging the moment you get an email from a specific client, or when someone tweets a particular hashtag, etc. Nerdrific!

On a side note, I once used a WeMo switch to remotely turn on a light inside a building while shooting a nighttime exterior. It was a tiny crew, and we didn’t have a spare person to control the light, so I used the WeMo app on my iPhone, and powered up the light from outside. It wasn’t a perfect solution, and the timing took a little practice, but it worked!

While the WeMo Switch was originally designed for basic home automation (e.g., turning on the toaster oven from bed), it’s become a powerful addition to my filmmaking bag of tricks… While also turning on my toaster from bed. What? Who doesn’t like toast?

Thanks to its stunning Retina display, Apple’s iPhone makes a perfect portable screening room. I use mine to view dailies while on location, to reference rough cuts while working with sound effects designers, and so on. I’ll even use it to show off relevant videos while giving impromptu elevator pitches. Unfortunately, as awesome as the iPhone’s screen may be, its audio playback sorta blows… specifically, its low volume. However, one company has created a clever iPhone case that can help turn things up to 11.

Allow me to preface this completely unbiased iPhone case review by saying, I really hate iPhone cases. Like many Apple products, the iPhone is a true work of art. Every curve has been designed and refined to accentuate its subtle beauty. I’d frame it if it wasn’t busy helping me make movies. Sure, I understand the need to protect your investment, but covering an iPhone with some flimsy piece of leopard-print plastic is a bit like wrapping the Mona Lisa in tinfoil… you know… so it’s safe.

With versions for the 5/5S and the 5C, the Bandshell iPhone case addresses a major shortcoming of the iPhone’s design — its unfortunate speaker placement. For some reason, Apple positioned the iPhone’s speaker on the device’s bottom edge. So, when you’re watching videos (dailies, rough cuts, etc.) on your iPhone’s screen, the speaker is facing away from your ears! In quiet locations, this isn’t a problem. But, if you’ve ever tried to watch and listen to your iPhone in a noisy environment (like most film sets), you know it’s an exercise in frustration.

At first glance, this case looks like most — a very snug plastic and rubber shell that protects the back and sides of your iPhone. The Bandshell works its volume-boosting magic by providing a small plastic door that slides out from the back of the case, and redirects your iPhone’s audio towards your ears. When not in use, the door disappears back into the case. Simple as that! In fact, it’s such a simple idea that I wondered why other case manufacturers hadn’t already ripped off the design… and then I noticed the design patent notice on the Bandshell website. Looks like this protective case is well protected!

Curious how much volume you gain with a Bandshell case? I was wondering the same thing, so I performed an entirely unscientific experiment that should be completely ignored. But, as long as you’re still reading, here’s what I did. I created an MP3 audio file containing a steady tone and played it back on my iPhone 5S. I measured the playback volume using my Zoom audio recorder (positioned about 6 inches away). Then I repeated the same steps with the Bandshell’s sound door extended. According to my readings, at that distance, the volume rose a full 4dB. In truth, it’s not a huge change, but it’s enough to make an effective difference. That said, if you’re trying to hear your iPhone in the middle of Grand Central Station, those 4dB aren’t going to help much.
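For context, decibels are logarithmic, so a small-sounding number hides a real gain. A quick sketch of what that 4dB works out to:

```python
db_gain = 4.0
power_ratio = 10 ** (db_gain / 10)     # about 2.51x sound power
pressure_ratio = 10 ** (db_gain / 20)  # about 1.58x sound pressure
print(round(power_ratio, 2), round(pressure_ratio, 2))  # prints 2.51 1.58
```

In other words, 4dB is roughly two and a half times the sound power, though our ears perceive loudness logarithmically, which is why it registers as a modest but noticeable bump rather than a doubling.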

In addition to boosting your iPhone’s audio, the Bandshell case also hides a terrific, pop-out kickstand! Sweet! With this case’s two-feature combo, you can set your iPhone down on a flat surface at a very nice viewing angle while actually being able to hear it!

The Bandshell does have a couple of minor drawbacks. While inside the case, your iPhone’s volume and sleep buttons lose their ‘click’ and become a bit spongy. If your iPhone isn’t seated just right inside the case, the sleep button can easily be pressed accidentally. Lastly, the cutout revealing the silencing slider is a bit small for a big-mitted dude such as myself. While the Bandshell is certainly a nice-looking case, it will never match the beauty of the bare iPhone. That said, the same observation could be made of just about ALL iPhone cases… so I suppose the point is moot. These really are minor issues, especially if you’re already using (and used to) other iPhone cases.

In conclusion, if I’m going to hide my treasured iThing in a case, that case had better add more than it takes away. It had better improve my iPhoning experience. In other words, it had better be exactly like the Bandshell. If you need to turn things up to 11, this is the case for you.

The Bandshell is available for the iPhone 5/5S as well as the 5C. It comes in a variety of two-color combinations (i.e. blue & white, red & black, etc.), and retails for $29.99 ($25 on Amazon).

BANDSHELL CASE GIVEAWAY

iWorld 2014 – iPad and Macro lenses from olloclip
(Published April 4, 2014)

No stranger to iPhone videography, olloclip has been producing clip-on lens accessories for years! Their signature product is the 4-in-1 clip-on lens system for the iPhone 5/5S and 4/4S. To use it, you simply slide the clip over the edge of your iPhone, positioning one of the olloclip’s four lenses (Fisheye, Wide-Angle, 10x Macro, 15x Macro) over the iPhone’s built-in lens. Easy peazy and fasty wasty!

This year, the company was giving sneak peeks of their soon-to-be-released clip-on lens system for the iPad! Same variety of lenses, new iPaddy goodness!

Olloclip was also showing off their clip-on macro lenses. Most macro lenses can be tough to keep in focus, but the new olloclip 3-in-1 Macro lens system ($69.99) features a frosted plastic barrel that ensures your camera lands at exactly the right distance for perfect focus. The frosted plastic also captures soft, diffuse light for lovely, evenly lit shots.

Olloclip has yet to release a product that I didn’t want. And, lucky for me (and you), their lenses are super easy to track down! You’ll find them in the Apple Store, Target, Samy’s Camera, the Sprint store, Best Buy and even automated airport kiosks. Yes, really. Go get some!

iWorld 2014 – Hoodini iPad Shade

On location, my iPad is indispensable. I’ll use it to review dailies (footage from earlier in the shoot), view rough cuts sent over from my editor, and even use it as a live field monitor when shooting with the Teradek Cube, an amazing transmitter that beams a video camera’s signal straight to your iDevice. In short, the iPad is an ideal way to view video while on location. That is, until the sun comes out, at which point it transforms from a gorgeous video display into a horribly reflective, mostly useless makeup mirror.

Thankfully, Macworld/iWorld has brought us another treasure in the form of the Hoodini — a collapsible iPad shade from Hoodivision. The shade uses a magnetic band and nano-suction tape (anything with ‘nano’ in its name is sexier – scientific fact) to quickly connect (and stay connected) to your iPad, giving you a completely shaded view of your screen. When you’re done, it just pops right off, and folds back up. Not only does it help avoid reflections, it also keeps your iPad cooler!

The Hoodini is available in a variety of colors, with separate models designed for full size iPads and Minis. They don’t have one for the iPhone, but I’m pretty sure you can just use your hand for that.

iWorld 2014 – My Cloud from Western Digital
(Published April 3, 2014)

It’s not often I’ll cover a hard drive on HHH. In fact, I think this is the first time. So what makes My Cloud unique? It’s a stand-alone personal “cloud” that can be used as remote storage for the videos you shoot in the field. What? Ok… let’s back up.

If you’re shooting a ton of video on your iPhone or iPad, you’re eventually going to run out of storage space. What then? Well, if you’re close to home or have a laptop, you can sync your iThing to your computer and suck those videos right into your application of choice (I’m still using iPhoto). With your videos safely transferred, you can erase them from your iDevice, free up all that space, and continue shooting. But, what if you’re nowhere near home, and don’t have a laptop with you? How can you offload your videos and free up space? With My Cloud!

The My Cloud drive lives on your home or office network (no computer needed). Using a dedicated iOS app, you can access that drive from anywhere, and upload your videos to it remotely. After the videos have uploaded, you can delete them from your device and continue shooting!

Downsides? Sure! Since we’re talking about gigabytes of data, the upload could take hours, even on WiFi. Speaking of WiFi, make sure you’re using it! Uploading that much data over a cellular network would cost a fortune (and take a week).

There are other terrific cloud solutions out there (Dropbox, Transporter, etc.), but I was attracted to My Cloud because of its ease of use and slick mobile app. Long upload times make it a less than perfect solution, but if you’re away from home, and in a bind, the My Cloud might just save your bacon.