I have a concert recording with 2 cameras and 1 sound recorder. The sound is a single 30-minute file. However, both cameras took multiple shots from different angles, so there are around 20-30 video clips for each camera. (Obviously there are unrecorded seconds between clips, as the camera moved and stopped recording.)

I want to sync all the recordings on the timeline using the waveform. However, Resolve's sync does not produce a usable result: it adds parts of the audio under different clips, so I end up with a number of separate clips, each with independently synced audio, but not synced with each other in the timeline.

Previously, I was using PluralEyes and Premiere Pro. What PluralEyes does is lay the full audio along one timeline and place the video clips on it, synced to that audio, with a separate video track per camera. As a result it is very easy to end up with a single 30-minute timeline of the concert with the multicam feature available. (Unfortunately, I also couldn't manage to export the PluralEyes sync to Resolve.)

How can I achieve a similar result in Resolve (like PluralEyes), where I can sync the 2 cameras (on 2 video tracks) and the audio in the same timeline?

I think creating 2 video tracks and 1 audio track, then manually syncing your picture on top of the audio, is the only way to do this somewhat easily. Alternatively, use the method you said worked for you before, with PluralEyes. Either way, unless you have a hundred video clips, I can't see it taking more than 15 minutes if you've already got guide-track audio embedded in your video. It is called editing, after all.
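For what it's worth, the waveform matching that PluralEyes-style tools rely on boils down to cross-correlating each clip's scratch audio against the master recording and taking the peak as the offset. Here's a minimal NumPy sketch of the idea (not Resolve's or PluralEyes' actual algorithm; `find_offset`, the toy signals, and the shared sample rate are my own assumptions for illustration):

```python
import numpy as np

def find_offset(master, clip, sample_rate):
    """Return the offset (in seconds) of `clip` within `master`,
    found via cross-correlation. Both are mono float arrays
    sampled at the same rate."""
    # np.correlate in "valid" mode slides `clip` along `master`;
    # the index of the peak is the best-aligned start position.
    corr = np.correlate(master, clip, mode="valid")
    best = int(np.argmax(corr))
    return best / sample_rate

# Toy example: a 1 s "clip" cut out of a 10 s "master" at t = 3 s.
rate = 1000
rng = np.random.default_rng(0)
master = rng.standard_normal(10 * rate)
clip = master[3 * rate : 4 * rate]
print(find_offset(master, clip, rate))  # 3.0
```

Once you have per-clip offsets like this, placing each clip at its offset on its camera's track is exactly the "one timeline per concert" layout described above.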

Can Adil wrote: I have a concert recording with 2 cameras and 1 sound recorder, and around 20-30 video clips for each camera. [...] How can I achieve a similar result in Resolve (like PluralEyes), where I can sync the 2 cameras (on 2 video tracks) and the audio in the same timeline?

If you had synced timecode on the cameras and the audio, this would be pretty easy. In v12.5 we added support for this type of situation.

That would work assuming the timecode on all the cameras and the sound device is perfectly synced, which is hardly ever the case. Unless you have hard sync points or were able to slate-clap each picture file, you'd have no way of knowing how far apart each device's timecode has drifted. Slates are still used on most productions for this very reason.
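To put a rough number on that drift: free-running clocks are typically only accurate to a handful of parts per million, and well under a frame of error is already visible in sync-critical shots. A back-of-the-envelope sketch (the 10 ppm tolerance and 25 fps are assumed figures for illustration, not any particular camera's spec):

```python
# How far can free-running clocks drift apart over a 30-minute concert?
ppm = 10                  # assumed clock tolerance: 10 parts per million
duration_s = 30 * 60      # 30-minute show
fps = 25                  # assumed frame rate of the shoot

drift_s = duration_s * ppm / 1_000_000   # worst-case one-sided drift
drift_frames = drift_s * fps

print(f"{drift_s * 1000:.1f} ms ~ {drift_frames:.2f} frames")
# 18.0 ms ~ 0.45 frames -- and two devices drifting in opposite
# directions doubles that, so jam-syncing timecode once at the start
# of the day is not enough for a long show.
```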

Chris Blacklock wrote: That would work assuming the timecode on all the cameras and the sound device is perfectly synced, which is hardly ever the case. [...]

I have to say, though, I've worked on many multi-camera concert videos where they used no slates at all and we just had to sync it all by eye. Once you've done it a few hundred times, it becomes second nature and you can do it pretty quickly.

If it's a scripted situation under very controlled conditions, then a slate is an absolute must. I get that in a live concert, it may not be convenient for a person to run up on stage in the middle of a song and clap a slate. We have used tricks like running a giant timecode slate off to one side for cameras to shoot when they have to change mags or drives or whatever, but it's up to the camera operator to remember to shoot them.

I worked as an assistant in reality TV and spent hours syncing, for weeks. A year or two ago I found PluralEyes, and Resolve's syncing does not come close to it: complex multicam recordings without timecode are synced in seconds. You can then export to your NLE, but not yet to Resolve. If you regularly spend more than 15 minutes syncing, then PluralEyes is worth buying.

Chris Blacklock wrote: That would work assuming the timecode on all the cameras and the sound device is perfectly synced, which is hardly ever the case. [...]

I have to say, though, I've worked on many multi-camera concert videos where they used no slates at all and we just had to sync it all by eye. [...]

All very true, Marc! I recall getting a truck full of film and audio for a foreign feature's dailies, with no reports and no slates! I got pretty good at manual syncing too! It was not much fun, though, and I think there are many tried-and-true methods to achieve sync without resorting to this... unless you enjoy that pain!

I have the same situation. The search function did not turn up anything helpful, and nobody here was able to help Can Adil either. So, as Resolve claims to support multicam editing, this must work somehow, and I need to know how. Replies about how to do the recording differently don't help at all: the recording is already done, and this is a very common situation when not working on a film set.

Situation:
* 1 long audio recording (3 files with no gap, grouped in a compound clip to fix their relative position)
* 7 cameras (+ extras coming in later), matching time about +-1 minute
* several clips per camera (3 acts, 4 scenes per act; intermissions not recorded, except for audio and the backstage cam)
* no clapperboard (it's a stage with an audience)
* no synced cameras (no cables, and half the cameras don't support it anyway)

Problems:
* Resolve 12.5 either crashes when syncing the 1 compound audio clip with even just 1 clip from 1 camera,
* or puts everything into its own track, all starting at the same time, when syncing against the 3 sound files.
* In "Detect clips from same camera using: Metadata Camera #" it ignores "Camera ID".
* Even with a Blackmagic Production Camera 4K, it imported the camera name from RAW clips but not from ProRes clips from the same camera. From Atomos recorders and GH4 cameras it didn't import any metadata at all.

Questions:
* How do I define the audio track as the master to sync to?
* How can I at least roughly position added clips using the file date?
* How do I force a clip inside a multicam clip to be synced again (after manually placing it roughly where it belongs)?
* How do I add an additional camera to a multicam clip, syncing by sound?
* (How do I combine clips from different bins into a new multicam clip?)

Questions for doing LOTS of manual syncing (78-100 clips):
* Somehow, dragging a clip with audio and video into the audio tracks (because that's where you do the syncing) does not add the video on a video track at the same position.
* Editing a multicam clip displays just 1 camera angle instead of all of them, even though it's a multicam clip I'm editing.
* How do you get the synced multicam clip into several different projects? (1 project per scene, to be edited separately, in parallel, off-site using proxies, then combined for final color grading. Resolve 12 was supposed to allow that using smart bins, but I can't figure it out.)
* Is there a way to scale the displayed waveforms for manual syncing without changing the levels?
* Is there a way to display a line or similar between individual frames when zoomed in far enough, as an indicator of how small an amount you can drag and where it will snap into place?
* How do you drag a clip far outside the displayed part of the timeline? Dragging seems to only react to movement, and there is only a very small area at the left and right edges where such movement will scroll the timeline. Holding still to wait until it has scrolled far enough (as in other macOS software) doesn't seem to work.
* For some reason, double-clicking a compound clip or multicam clip doesn't open it in the Edit tab, or do anything at all.