I need to develop an application that receives camera metadata over SDI fed into a Blackmagic DeckLink card.

I downloaded the SDK. There's a VancOutput demo application. Is this putting out VANC over SDI or receiving it? I looked through the source code, and with my limited abilities I think it is only good for output TO the camera. I would need the exact opposite...

The application I'm trying to build is a plugin for Unreal Engine that receives camera metadata like focus, iris and zoom and can stream it to the engine's internal camera.

Don't forget that most cameras with detachable lenses do not even have all that information available from the lens.

With broadcast lenses this is normally not communicated back to the camera. With photo lenses some info is given to the camera, but photo lenses often do not support zoom control.

HANC (Horizontal Ancillary data) and VANC (Vertical Ancillary data) are data embedded into the SDI signal. This data can be timecode, record flags, embedded audio, closed captions, DVITC, Dolby metadata, payload IDs, etc. BMD cleverly uses a bit of ANC data to move control data from the ATEM to the BMD cameras (which is not done by any other manufacturer). https://en.wikipedia.org/wiki/Ancillary_data
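To make the ANC structure concrete, here is a minimal sketch of parsing a SMPTE 291-style ancillary packet from 10-bit words: ancillary data flag, DID, SDID, data count, user data words, checksum. This is an illustration of the packet layout only, not a DeckLink API example, and the DID/SDID/payload values in the demo are made up.

```python
# Minimal SMPTE 291-style ANC packet parser (illustrative sketch, not a
# DeckLink API example). Words are 10-bit values as carried in VANC/HANC.

def with_parity(b):
    """Add the b8 (even parity of b0-b7) and b9 (inverse of b8) bits."""
    b8 = bin(b & 0xFF).count("1") & 1
    return (b & 0xFF) | (b8 << 8) | ((1 - b8) << 9)

def parse_anc_packet(words):
    """Find and parse the first ANC packet in a list of 10-bit words."""
    for i in range(len(words) - 2):
        # Ancillary Data Flag: 0x000, 0x3FF, 0x3FF
        if words[i:i + 3] == [0x000, 0x3FF, 0x3FF]:
            did  = words[i + 3] & 0xFF
            sdid = words[i + 4] & 0xFF   # SDID (or DBN for type-1 packets)
            dc   = words[i + 5] & 0xFF   # data count
            payload = [w & 0xFF for w in words[i + 6:i + 6 + dc]]
            # Checksum: 9-bit sum over DID..last UDW, b9 = inverse of b8
            s = sum(w & 0x1FF for w in words[i + 3:i + 6 + dc]) & 0x1FF
            expected = s | ((1 - ((s >> 8) & 1)) << 9)
            return {"did": did, "sdid": sdid, "payload": payload,
                    "checksum_ok": words[i + 6 + dc] == expected}
    return None

# Build a demo packet with made-up DID/SDID and a two-byte payload
body = [with_parity(b) for b in (0x51, 0x53, 0x02, 0x01, 0x02)]
s = sum(w & 0x1FF for w in body) & 0x1FF
demo = [0x000, 0x3FF, 0x3FF] + body + [s | ((1 - ((s >> 8) & 1)) << 9)]
print(parse_anc_packet(demo))
# → {'did': 81, 'sdid': 83, 'payload': [1, 2], 'checksum_ok': True}
```

With a DeckLink input you would get these words out of each captured frame's ancillary data (how exactly depends on the SDK version), and then this kind of parsing is what turns them into usable fields.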

But there is no return data to the ATEM, so it is not usable for your case. The EVA1 also does not embed much information into its ANC data.

So while your idea is great, there is a reason those virtual studio setups with AR cost so much: they add sensors to the camera and use very expensive lenses with 12-bit readout of their focus and zoom. This is then all processed by a controller that converts it into a position in time and space.

It can be done cheap, but it can't be done cheap and well. Most cameras/lenses simply do not collect sufficient data to do AR, so you would have to use a mechanical encoder to record FIZ data. You also have to track the position of the camera, which requires more sensors and another data stream to process and feed into the engine in a usable format. Not to mention the fact that each lens is unique, and your software must calibrate to match it exactly. I've been to multiple demos from people who have done all that work and made an Unreal plugin, and the results are quite good.