Sometimes my box can play back high resolution clips smoothly in programs like Windows Media Player, but editing them in HitFilm becomes quite a challenge even without any effects or adjustments.

To answer your question: no, HitFilm doesn't use or take advantage of Intel's Quick Sync feature. It isn't all pros, either; it reduces output quality quite a bit in exchange for speed (faster renders).

I don't see them implementing this in HitFilm anytime soon, mainly because their stance on parity among consumer hardware is clear: they don't want AMD, NVIDIA, or Intel to have an advantage.

The advantages will still be there in the form of faster processors (Intel is way ahead of AMD), or NVIDIA's massive lead in the high-end GPU market, where AMD can't compete right now since it hasn't launched any high-end products.

Lastly, you can't compare dedicated video playback software to a full-blown NLE and VFX application.

You definitely want constant bitrate, and you definitely want an editing codec. You don't want to edit H.264; that's a delivery codec.

Your laptop isn't the best, but it isn't the worst by any stretch either. As far as I know, HitFilm doesn't do playback on the GPU; that comes down to CPU and disk speed. If I'm wrong someone will correct me, so please don't take that as fact.

Variable bitrate is not a problem for editing. It is variable frame rate that is the problem for editing.

4K/UHD is very difficult to edit. It will require a powerful computer to do so in real time for many tasks. For 4K you probably want 8 cores to keep things smooth across transitions and such.

AVC is probably the highest-overhead codec to decode there is. Your file looks like it might come from a GoPro (GOP: M=1, N=8). GoPro AVC files are more difficult than most; I don't know why, that's just my past experience.

You can transcode the source AVC file to Cineform with GoPro's free Studio software; it should be easier to edit. Or you can transcode to an easier-to-edit AVC file using Handbrake's "fast decode" option.
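That GOP structure is a big part of why long-GOP files scrub and edit poorly: to show an arbitrary frame, the decoder has to go back to the previous I-frame and decode everything in between, whereas an all-intra codec like Cineform decodes exactly one frame. A minimal sketch of that seek cost (the function name and the simplified P-frame-only model are mine for illustration):

```python
# Rough model: frames that must be decoded to display frame `target`
# in a long-GOP stream with an I-frame every `gop_n` frames, assuming
# each inter frame depends on the frame before it (simplified; real
# AVC reference structures can be more complex).

def decode_cost(target, gop_n):
    """Frames decoded to show frame `target` (0-indexed)."""
    last_i = (target // gop_n) * gop_n   # most recent I-frame at or before target
    return target - last_i + 1           # decode I-frame through target

# With N=8 (as in the GoPro file above), seeking to frame 13 means
# decoding frames 8 through 13; an all-intra stream (gop_n=1) always
# decodes just the one frame.
print(decode_cost(13, 8))  # 6
print(decode_cost(13, 1))  # 1
```

Multiply that by several streams in flight during a transition and it's easy to see why a long-GOP timeline stutters where an intermediate codec doesn't.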

@LoYukFai Video decoding is generally CPU bound without hardware acceleration like QuickSync, and you're not likely to see QuickSync-assisted decoding in most editing programs anytime soon. QuickSync decoding acceleration actually requires an active Intel GPU in order to work. That means if you have an Intel CPU but are using a discrete GPU from Nvidia or AMD, then QuickSync decoding is useless.

As for why you're seeing stuttering: you're working with 4K/UHD video. It's just harsh to deal with, and the minimum requirements for 4K work are pretty high. The lowest stated minimum you'll see for editing 4K is a quad-core CPU, with 8 or more cores as the recommended spec, and hyperthreaded logical cores don't count toward that number. You have a two-core processor, so you're really maxing out what it's capable of just decoding 4K AVC video. Transcoding to Cineform as NormanPCN mentioned will help, but realistically you might have to switch to a proxy, offline workflow.

Hitfilm doesn't use any hardware-specific acceleration: no CUDA, no QuickSync (QuickSync isn't very good, actually). Video decode is done by the CPU and rendering to screen by the GPU.

Unfortunately your machine has a low-to-mid level processor, and the integrated GPU is near the minimum required.

As Norman stated, editing 4K smoothly requires a high-end computer. Here's a couple of numbers: your video is encoded at about 60 Mb/s, while an uncompressed stream is 5690 Mb/s. Your computer has to unpack from 60 to 5690 on the fly, draw that to the screen, AND scale it to fit the viewer panel. That uses massive amounts of computer resources.
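The back-of-envelope arithmetic behind numbers like that is simple: pixels per frame times bits per pixel times frame rate. A quick sketch, assuming 8-bit-per-channel RGB (24 bpp) at 30 fps; the exact figure varies with frame rate, chroma subsampling, and bit depth, which is why the post's 5690 Mb/s and this estimate differ slightly:

```python
# Uncompressed UHD bitrate estimate (assumptions: 24 bpp RGB, 30 fps).
width, height = 3840, 2160
bits_per_pixel = 24
fps = 30

uncompressed_mbps = width * height * bits_per_pixel * fps / 1_000_000
print(round(uncompressed_mbps))        # ~5972 Mb/s uncompressed
print(round(uncompressed_mbps / 60))   # ~100:1 expansion from a 60 Mb/s file
```

So the decoder is inflating the data by roughly two orders of magnitude in real time before the GPU ever gets to draw a pixel.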

As Norman said your footage is packed in a format that is difficult to decode. You could improve editing speed by transcoding to an intermediate editing format like DNxHR or ProRes. However, this will greatly increase file size.

It is possible to have the Intel GPU active alongside an AMD/Nvidia GPU, even without a monitor attached to the Intel GPU.

All the GPUs (AMD, Nvidia, Intel) have hardware AVC decoders built in these days. However, these decoders are designed around the basic single-stream playback that video players do. Video editors have greater needs, such as simultaneous decode of multiple video streams (transitions, cuts, compositing).

@NormanPCN Actually that's not quite good enough for QuickSync decoding. The Intel GPU must be the primary GPU, driving an extended display, or switched to on the fly with Lucid Virtu. Win 8 and higher support a headless or unconnected iGPU, but that's impossible on Win 7. Also, if you use Lucid Virtu to switch from a discrete GPU to the iGPU, the discrete GPU is disabled for the duration of the decode.

@Aladdin4d In the distant past I had a headless setup on Windows 7, but I was doing that to use the QuickSync encoder via Sony AVC in Vegas. I had an AMD main GPU at the time. I never thought about or tried the QS decoder, since nothing I had used it.

There was a process to get "headless" to work on Win 7, at least for encoding. As far as I know, the Intel GPU driver either thought a display was attached or could be made not to care that one wasn't. Not strictly headless, but it can work.


That's a virtualization technique, and Lucid Virtu is the original Intel-approved method. With Win 8 and up you're still using virtualization, but you gain the ability to have a hybrid solution of QuickSync decode plus discrete GPU rendering. You're not necessarily gaining anything in practice, though, because it depends on what the virtualization layer actually implements. On Win 7 you're stuck with the iGPU doing everything, shutting the discrete GPU out during the process. These quirks only really turn up when trying to decode with QuickSync.

Encoding is different, with its own twists and turns. The difference is that Intel assumed content creators interested in using QuickSync to encode would be much more likely to have a discrete GPU, so they made it a little easier to deal with.

Intel's integrated GPUs aren't all that impressive, though they're reaching a level of adequacy. You're probably better off with either a better processor or one of those Razer machines that gives you the option of adding a powerful GPU via a Razer Core later.

Intel Quick Sync is now supported by many professional NLEs, including Vegas Pro, Edius 8, and DaVinci Resolve 12.5. They wouldn't use it if it were inferior. Maybe HitFilm should take a closer look.

"Movie Edit Pro 2017 Plus, the newly released product by Magix, allows filmmakers to deal with the challenges that 4K 10 bit HEVC video and 360 video format present. Hear how Magix succeeded by partnering up with Intel, which allowed them to enable their product for the latest video coding innovations embedded within the 7th Gen Intel Core processor family, and which empowered them with the most complete software development toolset."

That video is a bunch of hype. Hitfilm does use the GPU fully for graphics.

QuickSync is a special-case thing that doesn't do much beyond file encoding (AVC, HEVC), and that encoder isn't as good quality-wise as what we already have. Faster, yes. At high bitrates things are fine, but we only use those for intermediates, and HitFilm has better true-quality intermediate options these days (Cineform, ProRes).

QuickSync for decode is not a real thing for editors. It's fine for a media player, and maybe a transcoder, but not for an editor with multiple media file data streams in flight at the same time.

Norman - agreed, but many of us shoot and edit H.264, including 8-bit 4K at average bitrates, and we want H.264 out, which is compatible with most media players and YouTube and looks great on my SUHD TV.

I'm using the latest Express, but I haven't seen how it can transcode to the Cineform intermediate codec for editing. This could improve 4K timeline performance, as Cineform is GPU optimised and H.264 is not an edit-friendly codec.

Hitfilm Express doesn't have tools to transcode to Cineform (Pro 2017 can export Cineform), but it can import it.

Gopro Studio (free) transcodes to Cineform.

You've missed a key point in this discussion. QuickSync encodes much faster, but at much lower quality. Every independent test empirically demonstrates that the output quality of QuickSync-encoded MP4 is far inferior to basically every other encoder. You'll have worse color and more aliasing artifacts. Period.

Remember, press releases are commercials, and commercials shade things to sound as good as possible.

The first post in this thread links to articles discussing why mp4 is terrible for editing, and this thread has optimized settings if you insist on mp4.

@FishyAl "This could improve 4K timeline performance as Cineform is GPU optimised and H.264 is not an edit-friendly codec."

How so? From everything I've seen, Cineform doesn't use the GPU for decode/encode. Video decode is not a massively parallel task, so it doesn't fit well with GPU compute.

Yes, GPUs these days have fixed-function, single-stream AVC and HEVC decoders, but those are often slower than a good software decoder on a good CPU, like libavcodec. Sadly, many decoders are not so good performance-wise. Also, not everybody has a fast CPU.

Triem23 - thanks for the links and advice. I use Handbrake, but scripts are over my head. I also use GoPro Studio and the Cineform codec. GoPro Studio won't convert my Sony H.264 files, but I can re-wrap them to MP4. Can Express import the Cineform AVI files if Pro 2017 can't?

I fully understand that H.264 MP4 is a long-GOP, highly compressed format and not edit-friendly. It is, however, the most common camera format, and therefore a challenge for all editing software given the ever-increasing popularity of UHD/4K; H.265 is even worse. To avoid the need for third-party conversion software, it would be nice if HitFilm included options to transcode to a user-selected codec or lower-res proxy files before editing, like DaVinci Resolve 12.5 has done.

What codec format is used for the HitFilm optimized media option?

I accept your point about QuickSync. The Anandtech link was from 2013, so I'd like to research it further; my understanding is that the latest QuickSync is better and faster. I'm busy upgrading my PC to a Kaby Lake i7-7700K with an Asus Prime Z270-A motherboard, 32GB DDR4-3200, a 512GB Samsung 960 M.2 SSD, and 8TB RAID 0.

I'll also test Magix on my Kaby Lake to see whether the video is "a bunch of hype" as Norman suggests. Personally, I see significant benefits in the new Intel GPU encoding/decoding abilities for media software.

I love Hitfilm and will make Pro my main editor once I'm convinced it can handle my h.264 and 4k.

PS - If you get a chance, try a new intermediate codec called MagicYUV. It's 100% lossless and the fastest codec. Works on the timeline of most NLEs that use VFW or QT (except Resolve). I'm using it to edit my 4k on my old core i5 with realtime highest quality timeline preview. Even handles 10/12/14 bit color depth. Free version has a watermark. Full version is $14 or any donation you choose - including none.

Kaby Lake can handle H.264 a lot better than you'd expect, but it still pretty much sucks. I'm a bit spoiled though; I'm using a 16-bit camera with ludicrous dynamic range and stunning color rendition nowadays.

NormanPCN GPUs actually are quite good with video. It parallelizes well because it's well defined and structured, and it's SIMD-friendly, so it works well with explicitly parallel implementations. That's why Resolve, Mistika, Scratch, RedCine-X Pro, etc. are GPU-oriented.

With QuickSync I understand the hype, but never ever look for demonstrations in an Intel ad and take the information at face value.

Their QuickSync claims seem nice, but you can't run Intel integrated graphics and a dedicated GPU at the same time; HitFilm will always choose the most powerful graphics card in your system.

It may be a viable option for people who don't need the more accurate encoding option, but I can't wrap my head around why someone would use it. A dedicated graphics card is built from the ground up to render; wouldn't it run circles around integrated graphics when working on effects and when exporting?

@WhiteCranePhoto NormanPCN is right on this one. Video decompression/decoding is CPU bound even in the apps you mentioned. Debayering is now handled by the GPU in all of them, I think, as is image processing, but none of that happens until you have decompressed video. This is from the Resolve system configuration guide:

However for editing and grading, the compressed data needs to be decompressed to the full RGB per pixel bit depth that will use four times more processing power of a HD image for the same real time grading performance. The decompression process, like compression, uses the CPU so the heavily compressed codecs need more powerful and a greater number of CPU cores.

Once the files are decompressed, as DaVinci Resolve uses the GPU for all image processing, and always at the full color and bit depth, the number of GPU cores and the size of GPU RAM becomes a very important factor when dealing with UHD and 4K-DCI sources and timelines.

Yep, dedicated GPUs run screaming circles around Intel's GPUs; GPUs are something Intel has struggled with for years. There's even a possibility, which HardOCP believes is really in the works, that we'll see Intel processors incorporating AMD GPUs in multi-chip modules.

Some applications are able to use multiple GPUs, but they're using OpenCL or CUDA rather than OpenGL.