joachimhedenworkblog
https://joachimhedenworkblog.wordpress.com
Random thoughts on filmmaking, and such…
“Framily” on Vimeo on demand (22 Dec 2015)
https://joachimhedenworkblog.wordpress.com/2015/12/22/framily-on-vimeo-on-demand/

Finally – after first being “locked in” by an unfortunate sales agent agreement, and then losing some urgency – my second feature “Framily” is available everywhere on Vimeo’s on-demand service.

How to stop a wedding (16 Dec 2015)
https://joachimhedenworkblog.wordpress.com/2015/12/16/how-to-stop-a-wedding/

Too much tech mumbo jumbo on the blog lately – time to get back to films and filmmaking.

My friend Drazen Kuljanin’s debut feature “How to stop a wedding” is available on Vimeo on demand.

He shot the whole film – minus a short intro and a brief coda – in the 5 hours and 17 minutes it takes to travel by train from Malmö to Stockholm (in Sweden). Yes, you read that right – a whole f**king feature made in 5 hours and 17 minutes.

I don’t blame you for thinking it cannot possibly be any good – but it really is. It’s a beautiful little film that made me cry a little from happiness at the end, both times I’ve seen it in the cinema.

Drazen rehearsed the film like a theatre play with the actors, and then they boarded the train and shot each scene twice, with two cameras rolling at all times. Boom – there it is, a full feature.

If for no other reason, watch it for inspiration – if you have a good idea, you can make it happen, and all it takes is 5 hours and 17 minutes.

Watch it below:

The film is in Swedish, but with English subtitles. (Oh, it’s also a little NSFW).

Can 4K video represent the same color information as 10bit HD? (15 Dec 2015)
https://joachimhedenworkblog.wordpress.com/2015/12/15/can-4k-video-represent-the-same-color-information-as-10bit-hd/

There has been some discussion on this subject, and I thought I would chime in in case someone is interested. And since I have a background in printing, I might be able to explain this in a slightly different way than video-only people… perhaps?

So, here goes:

Yes, you can absolutely substitute bit depth for resolution – open any glossy print magazine and flip to a beautiful color ad, and you are looking at 1bit color – yes ONE-bit color. But with an extreme resolution. More on this below.

And yes, in theory 4K 8bit video can absolutely represent the same color information contained in HD 10bit video. But, the codec has to be geared towards this purpose.

Consider a 10bit HD pixel with a value of 1022 – this is just shy of the 1023 maximum value on the 0-1023 scale – a small, but important difference.

That same pixel in an 8bit system would be represented by a value of 255 – the max value on the 0-255 scale. And if this pixel were put into a 10bit container, the value would be 1023 – one off from the “true” 10bit value.

However, suppose we upped the resolution of the 8bit system from HD to UHD. Now that single HD pixel is represented (spatially) by four 8bit pixels. So, in its simplest form: if the signal processing and codec are working at a higher bit depth right up until the actual “number” encoding, they could encode the four pixels as 255, 255, 255, 254. Then, when the four UHD pixels are downsampled (averaged) into a 10bit HD pixel, its value would be the correct one: 1022.

However, if the signal processing/codec is not geared towards encoding the UHD this way, it’s all for nothing. If the codec simply rounded each UHD pixel to the closest 8bit value, you would end up with 255, 255, 255, 255 – and when this is downscaled into a 10bit container you end up with 1023, not the “true” 10bit value.

So, in summary: in theory, FOUR UHD 8bit pixels (4 containers of 256 values) can hold the EXACT same amount of color information as ONE HD 10bit pixel (1 container of 1024 values) – but everything in the signal processing/codec has to work towards this goal.
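The arithmetic above can be sketched in a few lines of Python. This is a toy model, not any real codec: I’m assuming 8bit and 10bit code values relate by v10 = v8 × 1023/255, and that the UHD-to-HD downsample is a plain average of each 2×2 block.

```python
def encode_dithered(v10):
    """Spread one 10bit value across four 8bit pixels (codec 'geared' for this)."""
    v8 = v10 * 255 / 1023            # exact 8bit value, e.g. 1022 -> 254.75...
    base = int(v8)
    extra = round((v8 - base) * 4)   # how many of the 4 pixels get base + 1
    return [base + 1] * extra + [base] * (4 - extra)

def encode_naive(v10):
    """Round each of the four 8bit pixels independently (codec NOT geared for this)."""
    return [round(v10 * 255 / 1023)] * 4

def downsample_to_10bit(pixels):
    """Average four 8bit UHD pixels into one 10bit HD pixel."""
    return round(sum(pixels) / 4 * 1023 / 255)

print(encode_dithered(1022), downsample_to_10bit(encode_dithered(1022)))
print(encode_naive(1022), downsample_to_10bit(encode_naive(1022)))
```

Running it shows the geared encoding surviving the round trip (1022 in, 1022 out, via 255, 255, 255, 254), while the naive encoding collapses to 1023.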

If this is true of real world camera/codec implementations or not, I’m not sure – this is another discussion, and I’m sure you can dig deep into this.

So, what about printing being a 1-bit color system…?!? Those glossy ads seem to have infinite color…

Have you ever been next to an offset printer?

It has 4 vats – one for black, one for yellow, one for cyan, and one for magenta. That’s IT – no shades or gradations. And the colors are NOT “mixed” per se to provide different shades (like you would see at the Home Depot paint department, where a machine mixes an exact amount of this, that and the other to provide any shade you want).

No, offset printing (the most common method for high-quality printing) is a 1bit process working at an extreme resolution. To represent different shades of color, a RIP (raster image processor) transforms a color image of any bit depth into a 1bit (per color channel) “raster” that gets printed on the paper at a very high resolution. The raster can be “AM-modulated”, where dots of varying sizes are printed on a fixed grid, or “FM-modulated”, where fixed-size dots get printed at varying intervals, or a mix of the two methods (AM/FM modulation).

But the point is: it’s a system where the RIP (which could be compared to the CODEC in video terms) substitutes Bit depth for resolution.

A color image that goes to print usually has a resolution of 300 dpi, which gets “RIPped” into a 1bit raster at typically around 4,000 dpi.
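The same trade can be shown in a toy halftone – pure Python, and nothing like a real RIP (no screening geometry, no dot gain): a 16×16 cell of 1bit dots has 256 possible ink coverages, exactly enough to represent one 8bit gray value.

```python
def halftone_cell(gray, n=16):
    """Represent one 8bit gray value (0-255) as an n x n grid of 1bit dots.
    The shade is encoded purely as ink coverage: bit depth traded for resolution."""
    ink = round(gray / 255 * n * n)       # number of dots to 'print'
    cell = [[0] * n for _ in range(n)]
    k = 0
    for y in range(n):                    # naive fill order; a real RIP uses
        for x in range(n):                # AM or FM screening patterns instead
            if k < ink:
                cell[y][x] = 1
                k += 1
    return cell

def recovered_gray(cell):
    """Average the 1bit dots back into an 8bit value (the eye's 'downsampling')."""
    n = len(cell)
    return round(sum(map(sum, cell)) / (n * n) * 255)

# Every 8bit shade survives the round trip through the 1bit raster:
print(all(recovered_gray(halftone_cell(g)) == g for g in range(256)))  # True
```

The 256-dot cell is the smallest that makes the round trip exact; a coarser cell would quantize the shades, which is exactly the bit-depth-for-resolution trade again.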

So, to get back to the original question – can 4K 8bit video represent the same color information as HD 10bit? The answer is yes, if the codec and signal processing are geared towards this purpose.

The curse of the “rough cut” (29 Oct 2015)
https://joachimhedenworkblog.wordpress.com/2015/10/29/the-curse-of-the-rough-cut/

Recently I’ve been somewhat involved with a few “problematic” projects where I think part of the problem lies in the idea that a “rough cut” or a “first assembly” is the best first step in the editorial process.

What?!?!?

That’s how things are done, always! Are you mad?!?!?

Ok, let me explain.

The problem is that “first decisions” tend to stick. Or, to put it another way: the DNA of that first rough cut will most likely be present in the final product. Rarely is the first rough cut thrown out completely after being used to assess the story, with a new cut started from absolute scratch – and even if it were, the impressions and feelings you get from watching that first assembly will inform your decisions down the road. A much more common scenario is that the rough cut becomes the starting point for an incremental process.

So – don’t be flip about putting together that first assembly. In many ways it’s more important than the final tweaks.

There were some arguments over the dynamic range of the new C300mk2, so I shot a test series just to satisfy my curiosity. Pictures are below. You can draw your own conclusions if you’re so inclined.

The first image in each series has the rightmost patch on the cusp of white clip.

The second is with 10 stops of ND – 1.8 ND internally and 1.2 ND in front of the lens – AND half a stop down on the lens, for a total of 10.5 stops down from the first picture.

The third is another full 2 stops down on the lens for a total of 12.5 stops down from the first picture.

The fourth is the same as No.3 but “graded” so that we can “see into the dark”.
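As a quick sanity check of the stop arithmetic above (assuming the standard 0.3 of ND optical density per stop):

```python
# Each 0.3 of ND optical density halves the light, i.e. one stop.
def nd_stops(density):
    return density / 0.3

shot2 = nd_stops(1.8) + nd_stops(1.2) + 0.5  # internal ND + external ND + lens
shot3 = shot2 + 2.0                          # another two full stops on the lens
print(round(shot2, 2), round(shot3, 2))      # 10.5 and 12.5 stops down
```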

The F5 was shot Slog3, the C100 Clog, and the C300 mk2 both Clog and Clog2. No noise reduction applied anywhere. Footage was imported into Premiere Pro, titles done there, exported as ProRes HQ, and then “graded” (only the 4th pic in each series is touched) in DaVinci.

The Canon EOS 1DC is on the horizon, and the first native footage has appeared online, courtesy of Nigel Akam: http://vimeo.com/44552135 (a download link to the native file is on the Vimeo page).

My first concern when I heard about the 1DC was that it was going to be an 8-bit format. I got a sinking feeling, but then Shane Hurlbut stood up and claimed that the material graded like nobody’s business. So, I decided to bring the footage into Resolve and see how it holds up.

Now, since I feel 2K/1080 is all we’ll need for the foreseeable future, I decided to scale it to 1080 in AE (32-bit float) before bringing it into Resolve.

First I just did a straight scale and transcode to ProRes HQ422, and brought that into Resolve.

The first thing that was apparent was the flatness of the image, and that it was positioned quite a bit “to the left”. See pic 1 below from UltraScope. From what I understand, this footage was shot in “Canon log”, and while that certainly lifts the blacks and protects the highs, in an 8-bit format it’s not necessarily a good thing to shoot flat (the underlying math we’ll save for another day). A better “spread”, shooting more “to the right”, usually works better with an 8-bit format.

And when spreading the image in Resolve (and pushing the mids a bit) to get some kind of base grade, the UltraScope histogram shows quite a bit of banding. Pic 2 below. Now, my subjective impression was still that the image was holding up pretty nicely on the grading monitor.

So, to look a little further into this, I decided to do the “spread” in AE using “levels” (yes, it keeps the 32-bit float) prior to the transcode to ProRes HQ422. Pic 3 below.

Now the footage has a better starting point in Resolve. Pic 4 below. And when graded to match the previous grade, the banding situation is a lot better. Pic 5 below.

But, then to see if I could improve things even further, I decided to try transcoding to ProRes 4444 in AE with the same levels “pre spread”. Bringing that into Resolve and matching the previous grade results in a very smooth histogram. Pic 6 below.
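A toy model of why the order of operations matters – pure Python with made-up numbers, not the actual 1DC pipeline: a smooth gradient sitting in the lower half of the range is stretched 2x either after or before the 8-bit quantization, and we count how many distinct code values survive.

```python
# A smooth "flat" gradient occupying only the lower half of the range,
# like log footage sitting "to the left" on the scope.
ramp = [i / 2047 * 0.5 for i in range(2048)]

def quantize8(x):
    return round(x * 255) / 255          # snap to the nearest 8-bit code

# Path A: quantize first (the transcode), then stretch 2x in the grade.
levels_a = {min(quantize8(x) * 2, 1.0) for x in ramp}

# Path B: stretch 2x at high precision (the 32-bit float AE pass), then quantize.
levels_b = {quantize8(min(x * 2, 1.0)) for x in ramp}

print(len(levels_a), len(levels_b))      # path A keeps only ~half the levels
```

Path A ends up with roughly half the distinct levels of path B – those missing codes are exactly the gaps (banding) the histogram shows when the spread is done after the 8-bit encode.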

The point of all of this is that I think we’ll see a lot of opinions about the gradability of 1DC footage, but the question will be whether people have done their homework on their image pipeline…

The obvious last test was of course to bring the native 4K image into Resolve and see how that held up. Pic 7 shows the histogram of the native file with a matched grade. Interestingly enough, the histogram shows that this approach lands us somewhere in between the 422 and 444 “AE pre spread” images. It seems some good magic is going on in the AE to ProRes 4444 scale/transcode.

The downside of putting the native file into Resolve is of course that it won’t play back in real time.

So, my subjective impression…? The footage (especially in the 4444 transcoded version) seems to hold up really, really well. This may not be the best test image – it would be interesting to see some skin tones and some darker flat areas – but I’d say things are looking rather good. And I have a feeling it may hold up even better if shot a little less flat and more “to the right”.

Pics below.

Please feel free to re-post, but please name the source as “Way Creative Films, Sweden”

About Kickstarter (1 Jun 2012)
https://joachimhedenworkblog.wordpress.com/2012/06/01/about-kickstarter/

Some people I know a little professionally are (or were) running a Kickstarter campaign. I pitched in my $25, more because I’m curious about the process than intrigued by the film. It seems they’ve been running a very nicely managed campaign with some well-timed press coverage. But I have to say, there seems to be quite a lot of work involved – both at the front end and at the back end of this. There may indeed be some intangible benefits, and I don’t know the ins and outs of it all, but I’m left with a feeling that there must be an easier way to lay your fingers on $50,000. Or not?

Oh, and at 5% of the take, Kickstarter must surely be a pretty profitable website…

On 48fps (30 May 2012)
https://joachimhedenworkblog.wordpress.com/2012/05/30/on-48fps/

…and got quite worked up about it. And got into a somewhat heated argument with “CJ” about it:

——

ME

I’ve been saying it for as long as I can remember. Increased frame rates – real or interpolated (as on your brand new 600Hz TV) – may be great for sports, but they are DISASTROUS for cinema. Please share the link, and then call your TV dealer for a refund on your 600Hz TV. Maybe there is still time to put a stop to this madness!!!

CJ

Sure, things like shallow DOF and color grading can be beautiful while taking us further from reality. But I’ve never understood how less choppiness can be a bad thing. I’m not saying you’re wrong, because clearly this is what you’re truly feeling. I’m just saying that the ONLY reason why some people think frame rates over 24 look “cheap” is because it makes them think of soap operas. It’s not the smoothness itself that’s bad, it’s the associations. If you were to show a 24 fps movie and a 48 fps movie to someone from the 17th century, I can’t think of one reason why they would prefer 24 fps.

ME

Ah, CJ,
But I DO think the person from the 17th century would “prefer” 24fps – But I’ll have to write something longer to expand on this, and I’m thumbing it on my phone right now. Give me a few hours…

CJ

Serious question: Do you think a live theater play is too smooth?

ME

No, but it doesn’t contradict my opinion on 48fps – but like I said, I need to be in front of a proper keyboard.

ME

First – does a 48fps image look “better” as in being a more true depiction of what is in front of the camera? YES absolutely. This is why it is great for sports, and this is exactly why it does not work for cinema – cinema as in dramatic storytelling. The fundamental principle of why cinema works, is the “willing suspension of disbelief”. This is the contract between the film and the audience wherein the film says “let me tell you a story” and the audience says “ok, that sounds good, for two hours I will ignore my intellectual knowledge that what you will show me is not real”.
This willing suspension of disbelief is in fact aided by any layer of abstraction that keeps the telling of the story just that – a story. The “imperfections” that a 24fps image provides are such a layer of abstraction. The more “real” the picture becomes, as with 48fps, the harder it becomes for the audience to ignore the real truth – that it is all a scam. The characters are only actors impersonating the characters of the story, in environments that are staged for the camera. (No disrespect to all the actors out there, whom I love all equally.)
As to your comparison to theatre, there are other mechanics at work to aid the willing suspension of disbelief, but I would argue that theatre, more than cinema, is an acquired taste. I do believe that theatre requires more work on the part of the audience to fulfill their end of the contract, to willingly suspend their disbelief. So, would theatre be more accessible with a slightly blurry stutter? Yes, possibly.

ME

To put it simpler – at 24fps you’re seeing the character, at 48fps you’re seeing the actor.

ME

And, just so there is no confusion: this has absolutely nothing to do with digital vs. film. 35mm motion picture film shot at 48fps and projected at 48fps has the exact same effect. Yes, I shot this as a test a few years ago.

CJ

I see your point! What I don’t see is what makes frame rate so different from any other arbitrary parameter that could make a movie less “real”. For example, if you watch an anamorphic image that’s not projected correctly, it’s less like reality because it’s stretched out. Does it make it more magical? No. But if that had been the only way we had watched movies for 100 years, most people would probably call it “part of the cinema experience”. Of course it’s a silly example, but still a valid one I think.

I remember watching action scenes as a kid, often confused because I couldn’t really tell what was going on. Partly because of VHS “quality”, partly because of pan and scan, and of course partly because some directors like to “hide” their boring action behind close-ups and camera shakes. But also because of the frame rate. When I see a fairly fast landscape pan on the big screen, I’m often too distracted by the jittering to actually enjoy the scene. I can’t speak for anyone else though.

CJ

I agree that watching a movie on a TV that makes it look smoother than the director intended is a bad thing.

I’d say I won the argument, at least a little bit;-)

The Steve Jobs Biography (30 May 2012)
https://joachimhedenworkblog.wordpress.com/2012/05/30/the-steve-jobs-biography/

So, I recently finished “Steve Jobs”, and I have to say, that’s not going to be an easy adaptation. I read somewhere recently that Aaron Sorkin (confirmed as screenwriter for the project) seems not to have found his way into the material yet. If he balks, it apparently wouldn’t be the first time he has come up short on a Steve Jobs-related writing assignment – Jobs asked him to write his commencement speech (Stanford?), but he didn’t do it.