Topic: Why I bought an F3

To autofocus, you're going to have to learn a few settings and try them. Different options work for tracking; I could explain them, but try them all. For manual focus, just use the touch screen on what you want, and the face detection is tremendous for tracking, especially if you register a person's face in the menu.

You can also manually focus the Pana and Oly lenses, as they are smooth.

Keep ALL other settings on manual, and adjust the viewfinder brightness to match your computer.

ISO 800 is good, and you can get by with 1000, but over that you'll need some denoise software, though I've shot at 1600 without breakup.

Shoot at the highest bit rate.

If you rent it, try the Pana-Leica 25mm f/1.4 and the Oly 45 and 75. The Oly 45 feels cheap but is nice; the 75 feels like a billion bucks.

If time permits, use the fast Leica and Oly primes.

Shoot a little underexposed, and use a less saturated custom setting with less red.

The OIS on the Pana lenses is OK, but it is not Steadicam smooth.

Watch your shutter speed. On my REDs I can shoot 30 fps at 1/125th of a second without that strobing look, but the GH3 really is an intra-frame file and reacts like film frames.
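The shutter-angle arithmetic behind that advice can be sketched in a few lines (plain math; the frame rates and shutter speeds here are just illustrative):

```python
# Shutter angle from frame rate and shutter speed:
# angle = 360 * fps * exposure_time = 360 * fps / shutter_denominator
def shutter_angle(fps, shutter_denominator):
    """Shutter angle in degrees for a given fps and a 1/x shutter speed."""
    return 360.0 * fps / shutter_denominator

# Classic film-like motion blur is a 180-degree shutter (1/60 at 30 fps).
print(shutter_angle(30, 60))    # 180.0
# The 1/125 at 30 fps mentioned above is a tighter shutter of about 86 degrees,
# which is why it starts to look strobed on cameras that react like film frames.
print(shutter_angle(30, 125))
```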

Keep the camera small. You'll need NDs, so either go with the Lee Seven5 system or screw-in Tiffen filters (use Tiffen, as they are very good).

Good luck.

BC

Thanks for your detailed reply, BC, but how about DR? How does it compare with your RED? I have the BMCC, which seems equal to the non-Dragon sensor in DR.

Cooter, have you tried the newer (perhaps still beta) versions of REDCINE-X that use GPUs for debayer? It's way the hell faster than past versions if you have to live without a Rocket. My MacBook Pro Retina is playing back 5K Epic footage in real time at 1/2 debayer off the mags.

CB

Thanks CB.

Funny you mentioned it. Two RED Rockets have c__ped out and are on their way to RED with a support case number, so I'm running naked on a new 27" iMac. It runs about half as fast as with the Rocket.

Oh yeah, I have the newest CineX; went back to older, went up to newer, just did the newest of the newest, reflashed the Rocket about a dozen times... spent more time on the damn Rocket than I would have if I'd just run it on the iMac, so tonight the iMac is crunching away, hopefully all night.

I like RED; they responded to me in minutes, but still no resolution, and the only suggestion was to wait for the Rocket-X. After 12 grand in Rocket cards, I think it'll be a very cold day in Miami before I buy another one, X or not.

Thing is, for the price of one Rocket you can almost buy two iMacs and run. Something is wrong with this business model.

Damn, I wish these cameras shot a usable ProRes.

In fact, when you go deep into discussion about the REDs, 90% of everyone converts the raws to a usable format and never goes back to the raws.

That tells you something about the raw world, and yeah, I thought raw was better; today I'm a little unsure.

Thing is, today there are not a lot of places to go. RED was working on a module that produced ProRes, but that fell quiet, and if they make it I think it's about 15 grand.

We're going to have cameras soon that shoot a 4:2:2 4K file and will probably sell under 5 grand, so 15K to convert to ProRes may be too late. Regardless, I do want RED to succeed; I just want to get away from build #20, firmware #17, and, well, you get it.

What is strange is that on the RED ONE the semi-RED ProRes files are beautiful and sharper than the raws. But batching them out is brutal, unless somebody knows a method I don't.

I'm not that big on the raw world anymore, because if you match up cameras, hit your settings, and white balance, you've got to grade anyway, so why go through the extra step? I'll bet 90% of everything shot with an Arri is ProRes and not raw.

Really, you should be taking the F3 and F5 for a drive. S-Log basically turns off the Sony look, and then you get edit-ready codecs, sound, all sorts of stuff. I have a grudging respect for their cameras and see why 80% of broadcast is done with their stuff.

The Sony business model isn't set up for me; to them I'm just a small producer. They're looking at millions of consumers or 245 international networks, but not me.

The FS100 to FS700 was the perfect example. They could have gone 4K without such a huge financial hit to the buyer, made a higher-bit-depth file, expanded the E-mount line (especially in light of the Z series), and gotten away from that awkward AVCHD wrapper, but they didn't; they brought out newer cameras and left the E-mount guys hanging.

Sony also changes media like I change socks.

RED is not perfect, but they got back to me last night in 5 minutes, on the phone twice and in multiple emails. Five minutes.

No camera company has done that.

RED also has a basic coloring suite; not perfect, and it changes a lot, but they have one, along with monitors, media, viewers, breakout boxes, cages: everything it takes to go to work without having to dig around on the web for a week.

Their cameras aren't perfect, but my R1s have already outlasted my FS100, which sits in a case; the R1s and the Scarlet are working this week.

The R1s have made us serious money, and they have not only impressed on set but impressed in final delivery.

Yes, I hate this raw-file render process, but at least I can line up 200 clips, base out the color, and go to bed knowing I'll have ProRes in the morning, ready to roll, even if the Rockets aren't working.

Now, if I were moving to something else, it probably would have been Canon, because they sort of get it. The 1D C shoots 4K and I like a DSLR form factor, but it shoots a screwy codec and is limited on recording time at 4K, which wouldn't have worked. At least Canon serves me on both ends, stills and motion, and Sony doesn't.

Panasonic can be just as screwy, though they seem to be on to something with the GH series, if only for cost alone. When it comes to run-and-gun, the GH3s with their autofocus fit a niche I need; the GH4s will expand on that without throwing out all the lens mounts.

We're at a tipping point, and somebody needs to fill the niche with not only motion cameras but a complete motion-to-still solution that covers the on-set and back-end workflow.

Canon could do it; they're close, but they tend to mess around. Arri won't do it; they're a movie-cam company, but right now, for what I do, they probably are the best solution except for cost: the Arris make RED look cheap.

Also, even though Arri's approach is logical (taking a 2.5K file and making it 2K with a lot of depth, unlike RED, which takes 3.8K and makes it 4K), it's still going to be a 4K world whether I or anyone else likes it or not.
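The contrast between those two approaches is just the scale factors; a quick sanity check on the figures quoted above (these are the post's numbers, not official sensor specs):

```python
# Quoted source vs. delivery resolutions, in K of horizontal pixels.
arri_source, arri_deliver = 2.5, 2.0   # captures more pixels than it delivers
red_source, red_deliver = 3.8, 4.0     # scales up slightly to hit the 4K label

arri_ratio = arri_source / arri_deliver   # 1.25x oversample (downscale, adds depth)
red_ratio = red_source / red_deliver      # 0.95x, i.e. a slight upscale

print(f"Arri: {arri_ratio:.2f}x, RED: {red_ratio:.2f}x")
```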

My bottom line is no more cameras over 10 grand. What I have is on movie sets around the world, and RED takes the back-end response very seriously, because they know that in the land of motion there are no redos.

RED is a love/hate company; I fall in the former camp, but my love is tempered by reality.

Output 10-bit to an external ProRes recorder (all with the same professional SDI cable): EX1, EX3, F3, F5, F55

Batteries, BP-U60 or 12V input: EX1, EX3, F3, F5, F55

Sound, stereo XLR: EX1, EX3, F3, F5, F55

Stereo headphone jack: EX1, EX3, F3, F5, F55

Seems like you don't change your socks enough.

BTW, they all have built-in NDs too.

As for the NEX line - well that is a bit of a mess indeed.

I'm quite with you, no more $10K cameras; that's why I bought the F3!

S

I can put a ProRes recorder on the GH3s, but adding weight somewhat defeats the purpose. I can also do the same with the REDs, but the output is limited.

Sony is just a funny company, and I know you dig their equipment, but they seem to be just everywhere in their lineup.

During the last Zacuto shootout with the F5, Sony was the only company that required grading their own footage, through Technicolor, and the results weren't stunning.

That stuff freaks me out.

Anyway, it's kind of a moot point, because I've spent the money for 4K and will live with it. Now, if Canon's C100 with autofocus shot 4K or 3.8K in a ProRes format, I'd be seriously interested, but they don't, they won't, and life goes on. But Canon is goofy with their C lineup having dedicated mounts. Makes no sense.

Sony, I just got too burned with the FS100, and I didn't lightly glance over that camera; I busted ass to make it work, and it had way too many issues with the file.

I respect what you bought; it made sense for you, and maybe for me if I hadn't already made the investment in RED, but I have, and I'll continue.

Yep, you have money in the R1 and Scarlet, great cameras. Sometimes I think you should swap them out for a couple of F5s, sometimes not.

I think it is certainly worth you trying a grade on a proper 10-bit file; when I have one I might send it to you.

I feel that a proper 10-bit codec (not 8-bit in a 10-bit wrapper, as I guess the GH2 does) is not only quick in the workflow but very, very deep in how you can grade (4x more colour info per channel). I have been describing it as "as good as raw, but without the exposure slider."
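The "4x more colour info" figure checks out per channel; a minimal sketch of the arithmetic (not tied to any specific codec):

```python
# Tonal levels per channel at each bit depth.
levels_8bit = 2 ** 8    # 256 levels per channel
levels_10bit = 2 ** 10  # 1024 levels per channel

# 10-bit gives 4x the levels per channel...
per_channel_gain = levels_10bit // levels_8bit

# ...and 4^3 = 64x the total RGB combinations, which is where
# the extra grading headroom comes from.
total_gain = per_channel_gain ** 3

print(per_channel_gain, total_gain)
```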

Testing the F5, I shot a bunch of daylight on tungsten (on purpose) and then graded it back to daylight with absolutely no problems.

This 10-bit stuff is really a codec worth checking, as it presents a very good balance between muzzovision and raw.

I do a very rough one-light in REDCINE, then import to FCP X, which keeps pointers to the RED metadata files, so if I make a change in REDCINE and then hop back to FCP X, the change is instantly implemented.

If I think the edit is going to be heavy duty, or I need to do it on a laptop, I render proxies when I import to FCP X, which doesn't take too long, although admittedly I have a screaming system (12-core 2.93 GHz Mac Pro, dual GPU, striped SSD for work drives, RAID-1 internal media drive with 2x4 TB). Playback is just fine with 4K material. Editing via proxy with an older Mac Pro or MacBook is fine, and I just connect to the main Mac's media drive come export time; I often have both Macs exporting at once. The load on the drive is fine, since the whole thing is CPU-bound, not I/O-bound.

Since I need to turn around two short films per week, I generally grade by doing one-lights in REDCINE, fine-tuning clip to clip once the edit is done. Any additional mucking about with secondaries I do with FCP X colour boards, plus a few plugins (Magic Bullet and FilmConvert, primarily). Then I just export as ProRes at the end of the sequence, so I only end up rendering the frames I'm actually using. I render final output (web MP4, etc.) from the ProRes master using Compressor, which can chug away overnight if necessary. Then I archive the FCP X events, projects, and final ProRes export.
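Compressor handles the master-to-web step in this setup; for what it's worth, the same step can be sketched with ffmpeg, which is handy for scripted overnight batches (the filenames and quality settings here are made up for illustration):

```python
# Hypothetical sketch: building an ffmpeg command that transcodes a
# ProRes master to an H.264 web MP4. Only constructs the command;
# you would pass it to subprocess.run() to actually render.
def mp4_from_master(master, out, crf=18):
    """Return an ffmpeg argument list for a ProRes-master-to-MP4 transcode."""
    return [
        "ffmpeg", "-i", master,
        "-c:v", "libx264", "-crf", str(crf),  # quality-targeted H.264
        "-c:a", "aac",                         # AAC audio for web delivery
        out,
    ]

cmd = mp4_from_master("film_master.mov", "film_web.mp4")
print(" ".join(cmd))
```

Since the render is driven from the ProRes master rather than the raws, it can run on any spare machine while the main system keeps editing.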

When I have the leisure for a longer grade, I export the project to DaVinci Resolve and finish there; round-tripping is pretty slick.

No more screwing around with ProRes intermediates; always grading from the original is a great way to work.

I obviously don't like X, and I don't know Premiere either, but liking and using are different. I've cut one small effects edit in X, and it was OK, but it was an easy edit.

I have arranged for an FCP X tutor to come to our studio when I return to LA, if I get a week of downtime, and walk me through it, because I'm never going to learn watching online videos.

The real problem I have with Premiere and X is that every outside source I work with (and this covers a lot of territory) works in either 7 or Avid. I've yet to meet an editor working in Premiere or X unless they are in-house, working closed loop.

Not saying X or Premiere isn't good; it's just that today they are not the standards 7 and Avid are.

One crazy thing I did today: going through some stills in Lightroom, I hit a RED folder and could see an R3D file in all its glory, so I imported it. It took about 1 second to import, and I could play it out in what looked like full debayer, in real time, with sound.

I tried to export it from Lightroom, but that didn't happen. Still, if you want to view easily, without messing with all the CineX or RED player stuff, try Lightroom 4.4 for viewing.

It's a pity that CineX doesn't have secondary capability, but there are ways to overcome the limitation.

I can't think of a more flexible and clean workflow, given the way RED structured the files and cleverly separated the metadata into the RMD files.

Because setting the source settings using RMDs means that we can affect the edit in real time, without touching the timeline or round-tripping to a color app, and it works even on underpowered machines.

And... (drum roll)... it's possible to do secondaries. The trick (and it's way faster than round-tripping) is to work on layers where necessary. The base edit has the primary correction assigned with one RMD, while you can add layers above the edit, like we'd do in Photoshop, using collapse (in FCP 7 "collapse" has another name I don't remember), and assign RMDs that correspond to the secondary color corrections, but with tracked masks, as many as you want; then it's merging parameters, etc.

Not touching the edit, and only using CineX as a full color app, you cover all territories by just applying RMDs. No round-tripping to anything; just your NLE and just CineX open.

Then... (drum roll 2)... the very big advantage of this approach, apart from consuming almost no memory, is that batch application is child's play. Many times, a particular color correction will have to be applied in different parts of the edit. If you have done good bin organization, it's really fun how you can just hit a button and bang: all the desired parts of the edit are corrected at once.

No round trip, no Resolve, just an NLE workflow. Clean.

What RED did best, IMO, is to have separated the metadata from the raw, and that's not available in DNG.

The workflow I described is doable in Avid, and probably doable in PP if it has tracking-mask capability without going to AE. Probably doable in FCP 7 if it has the set-source-settings option, but I don't know about that.

To recap the entire workflow from the beginning:

- I do my edit with proxies if it's long.
- When the edit is locked, I relink everything to the R3Ds (10 seconds).
- I assign the primary CC, assigning RMDs to the corresponding files using the bins (not touching the edit).
- When secondaries are needed, I collapse the segments, create layers with the duplicated footage, and assign the corresponding RMDs, but this time with tracked masks. As I can control the merging, the possibilities are infinite.

It differs from a color-app workflow in the sense that, instead of coloring, what you actually do is merge different versions of the same image. But it leads to the same goal: affecting the colors the way you want. The edit remains untouched, and if you open the project a year later, you immediately picture what's been done. Versioning is also very easy and flexible.

All you need is an NLE that can work with RMDs and has basic compositing tools.