No. All good. We want to create a short piano-like concerto that has to be synced to 2 minutes of music. Will the audio in Unity then be in sync with the Spine animation? I mean, if I take that same audio and play it in Unity via an AudioSource? Do I need to do some extra syncing in Unity, or update the SkeletonAnimation by the audio time?

If you update your animation using wall time (frame to frame time) it may not sync perfectly with the audio. This is mostly only noticeable for longer audio, which is how I'd classify 2 minutes of audio. In that case it's best to update the animation using the audio time, which will keep it perfectly in sync. If you only had short sound effects you probably wouldn't care to bother.
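To illustrate the idea of updating the animation from the audio clock, here's a minimal spine-unity sketch. It assumes a SkeletonAnimation and an AudioSource on the same GameObject; disabling the component and stepping it manually with the audio time delta is one way to do this, not the only one.

```csharp
using UnityEngine;
using Spine.Unity;

// Sketch: drive a SkeletonAnimation from the audio clock instead of wall time.
// Assumes the AudioSource plays the 2 minute music track.
public class AudioDrivenAnimation : MonoBehaviour {
	public AudioSource music;
	SkeletonAnimation skeletonAnimation;
	float lastAudioTime;

	void Start () {
		skeletonAnimation = GetComponent<SkeletonAnimation>();
		// Disable the component so Unity stops stepping it with Time.deltaTime;
		// we step it ourselves using the audio clock instead.
		skeletonAnimation.enabled = false;
		music.Play();
	}

	void Update () {
		float audioTime = music.time;
		float delta = audioTime - lastAudioTime;
		// Only advance when the audio has advanced, so the animation
		// can never drift ahead of or behind the audio.
		if (delta > 0) skeletonAnimation.Update(delta);
		lastAudioTime = audioTime;
	}
}
```

Note `AudioSource.time` only updates at the audio system's granularity, so for very smooth motion you may want to interpolate between audio time updates, but for keeping a 2 minute animation in sync this is sufficient.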

Sorry, we are behind where we'd like to be. The holidays robbed some time, but mostly other things have come up (urgent bugs, support, some very annoying website needs, minor 3.7 features). We've nearly cleared our plates of that junk, leaving a few important 3.6 bugs that need addressing and to finish off audio support. I have plans to see family (I live across the pond) for 2 weeks starting in 4 weeks, which unfortunately could mean another delay. I'd love to say 3.7 can be done before that time and that is what we are shooting for, but I can't say for sure. Worst case it should be done early March.

Do you have a dependency on 3.7? Is there something specific blocking you from using it as is? Remember the audio doesn't have a runtime component -- the audio events are just normal events that application code needs to use to play audio. Syncing is also done by application code updating the animations using the audio clock.
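Since the audio events are just normal Spine events, the application code that plays them could look something like the sketch below. The clip-lookup-by-event-name convention is an assumption of this example, not something the Spine runtimes provide.

```csharp
using UnityEngine;
using Spine;
using Spine.Unity;

// Sketch: Spine audio events are ordinary events; application code
// listens for them and plays the corresponding clips.
public class SpineAudioEvents : MonoBehaviour {
	public AudioSource audioSource;
	public AudioClip[] clips; // Assumed convention: clips named after the Spine events.

	void Start () {
		var skeletonAnimation = GetComponent<SkeletonAnimation>();
		skeletonAnimation.AnimationState.Event += HandleEvent;
	}

	void HandleEvent (TrackEntry trackEntry, Spine.Event e) {
		foreach (var clip in clips) {
			if (clip.name == e.Data.Name) {
				audioSource.PlayOneShot(clip);
				return;
			}
		}
	}
}
```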

Hi Nate, it has been quite a while. What is the state of 3.7? Back to your question: yes, we have animated episodes planned for production with Spine (like Peppa Pig or Ben and Holly). Without sound we would be animating a bit like in a void.

I have been researching a lot into Unity's audio latency (e.g. https://github.com/5argon/UnityiOSNativeAudio). No matter what you do, audio will be played a bit after you execute the line of code that plays it. This might cause the misunderstanding that Spine is not accurate. It is a lot worse on iOS compared to a native iOS app (but still faster than Android); on Android it is a bit later than in a native Android app.

A naive solution is to play the audio early to compensate. That can't be done in non-deterministic situations, like when player input at that instant decides whether to play the sound (pushing buttons, hitting drums), but in the case of Spine we know in advance that the audio is going to be played after we start the animation (that is, if we don't stop the animation). The problem is then solvable by treating the audio event as if it were a bit earlier on the timeline. That is to say, it would be great if I had some way to do this on the API side in Unity. It might make the API ugly and hard to understand why it is there (and it is not even Spine's fault)... but it would be better than manually moving the event back in every animation in Spine every time I want to build for Android.
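One way to sketch this "play the audio early" idea on the application side: since the event times within the animation are known up front, each clip can be scheduled on Unity's DSP clock slightly ahead of its event time. The `latencySeconds` value and the animation name here are illustrative assumptions, not Spine or Unity settings.

```csharp
using UnityEngine;
using Spine.Unity;

// Sketch: compensate for output latency by scheduling known event audio
// on the DSP clock, offset backwards by a tunable per-platform latency value.
public class ScheduledEventAudio : MonoBehaviour {
	public AudioSource audioSource;
	public AudioClip clip;
	public float eventTime = 1.5f;       // Time of the audio event within the animation, in seconds.
	public float latencySeconds = 0.08f; // Measured output latency to compensate for (assumption).

	void Start () {
		var skeletonAnimation = GetComponent<SkeletonAnimation>();
		skeletonAnimation.AnimationState.SetAnimation(0, "concert", false); // Hypothetical animation name.
		// PlayScheduled uses the audio hardware clock, so the clip start is sample accurate.
		audioSource.clip = clip;
		audioSource.PlayScheduled(AudioSettings.dspTime + eventTime - latencySeconds);
	}
}
```

This avoids editing the event times in the Spine project per platform; only the single latency value changes per build target.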

If the event is near the start and can't be moved back, we might move it to time 0, or delay the animation itself so the sound can start playing earlier.

This is not something specific to Unity, Android, or iOS. It's not easy for any software to know precisely when audio or video is rendered, especially when the hardware is unknown. Games where it matters allow the audio and video latency to be specified, eg Rock Band. Video latency is calibrated by having you press a button when you see something on screen; audio latency by a mic in the guitar hardware. The game measures the time from when the audio data is submitted for rendering to when the mic picks it up. Rock Band doesn't know what kind of audio hardware you will use (laptop speakers, a home theater receiver, etc), but it does know the exact mic's specifications.

Since you are unlikely to be selling your users a mic, you probably just need to make the synchronization configurable. You could allow a number to be specified in milliseconds and/or make a game of it, similar to the video calibration described above. Eg, play a metronome sound and have the user tap the screen when they hear it.
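A simple calibration screen along those lines could be sketched like this: play a metronome tick on the DSP clock and record how late the user's tap arrives relative to the scheduled tick, averaging a few taps. All names here are illustrative.

```csharp
using UnityEngine;

// Sketch: measure the user's perceived audio latency by scheduling ticks
// on the DSP clock and timing taps against them.
public class LatencyCalibration : MonoBehaviour {
	public AudioSource tick;    // A short metronome click.
	public float interval = 1f; // Seconds between ticks.
	double nextTickDspTime;
	float offsetSum;
	int tapCount;

	void Start () {
		nextTickDspTime = AudioSettings.dspTime + interval;
		tick.PlayScheduled(nextTickDspTime);
	}

	void Update () {
		if (AudioSettings.dspTime >= nextTickDspTime) {
			nextTickDspTime += interval;
			tick.PlayScheduled(nextTickDspTime);
		}
		if (Input.GetMouseButtonDown(0)) {
			// How far past the most recently scheduled tick the tap landed.
			float offset = (float)(AudioSettings.dspTime - (nextTickDspTime - interval));
			offsetSum += offset;
			tapCount++;
		}
	}

	// Feed this value back in as the latency compensation offset.
	public float AverageOffsetSeconds {
		get { return tapCount > 0 ? offsetSum / tapCount : 0f; }
	}
}
```

The average includes human reaction time as well as audio latency, so in practice you'd either subtract a typical reaction time or simply let users nudge the value until things feel right.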

Nate wrote: 3.7.15-beta has audio in video exports. We should release it in a few days.

This is great to hear, I just updated my Spine to 3.7.15 as this is pretty much the feature I've been waiting for. How is this done in the export? I don't see any settings and the resulting video doesn't play at all for me.

It's true, but we've unfortunately been caught up in a myriad of other things. I can assure you our days are full of work, and then some! The beta is available for use now. It is not necessary to wait for the official 3.7 release.
