CUC – Camera-Unity Connection


CUC is a collaborative project of students from Russia, Austria, Germany and France who are participating in the European Project Semester (EPS).

The Device

Our device merges a virtual camera with the incoming video stream of a studio camera: the zoom of the real, physical camera is translated into Unity and thus applied to the virtual camera.

Our product is designed to fit perfectly on the Sony HXC-FB75H camera with the Canon KJ20x8.2b IRS lens, the combination we have in the TV studio at St. Pölten UAS.

The Development

A vital part of this project was splitting the development into three sub-groups. This was important not only for building the device itself, but also for distributing the workload evenly among the team members. The three segments had to progress not only simultaneously but also harmoniously: the mechanical parts and 3D modelling, the programming and wiring of the microcontroller, and the correct integration in Unity with the incoming SDI signal.

The microcontroller is powered via PoE (Power over Ethernet) for simplified use.


The Holy Trinity

The mechanical part covers two mounts: the encoder is mounted onto the lens, where it picks up the movement of the camera's rings, and the microcontroller is mounted onto the camera handle for easy access, since the buttons on top of its casing are used to calibrate the zoom for the virtual camera.
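As a rough idea of what that calibration could look like in the firmware, the following minimal sketch stores the encoder count at both end stops of the zoom ring when the respective button is pressed; the pin assignments, button roles and names are assumptions for illustration, not the project's actual code.

```cpp
// Two-point calibration sketch (illustrative; pins and names are assumptions).
#include <Arduino.h>
#include <Encoder.h>            // PJRC Encoder library for quadrature encoders

Encoder zoomEncoder(2, 3);      // encoder on the zoom ring (assumed pins)
const int BTN_WIDE = 5;         // calibration buttons on top of the casing (assumed pins)
const int BTN_TELE = 6;

long wideCount = 0;             // encoder count at the wide end stop
long teleCount = 1;             // encoder count at the tele end stop (non-zero to avoid /0)

void setup() {
  pinMode(BTN_WIDE, INPUT_PULLUP);
  pinMode(BTN_TELE, INPUT_PULLUP);
}

void loop() {
  // Pressing a button stores the current count for that end of the zoom range.
  if (digitalRead(BTN_WIDE) == LOW) wideCount = zoomEncoder.read();
  if (digitalRead(BTN_TELE) == LOW) teleCount = zoomEncoder.read();
}

// Normalised zoom position (0 = wide, 1 = tele) between the calibrated end stops.
float zoomNormalised() {
  long raw = zoomEncoder.read();
  return constrain((float)(raw - wideCount) / (float)(teleCount - wideCount), 0.0f, 1.0f);
}
```

Once both end stops are stored, every later reading can be normalised to a 0–1 zoom value, no matter where the encoder happened to start counting.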

The programming part covers the firmware of the microcontroller: it has to read the encoder values correctly, send them as metadata to Unity via OSC (Open Sound Control), and give the user helpful feedback while the device is in use (through LEDs and a piezo buzzer).
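To give an impression of the OSC part, a minimal sketch using the CNMAT OSC library for Arduino over Ethernet could look like the following; the IP address, ports, update rate and OSC address pattern are assumptions, not the project's actual values.

```cpp
// Sending the normalised zoom value to Unity via OSC over Ethernet (illustrative sketch).
#include <Ethernet.h>
#include <EthernetUdp.h>
#include <OSCMessage.h>                    // CNMAT OSC library for Arduino

byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
EthernetUDP Udp;
IPAddress unityIp(192, 168, 0, 10);        // machine running Unity (assumed)
const unsigned int unityPort = 8000;       // port the Unity OSC receiver listens on (assumed)

void setup() {
  Ethernet.begin(mac);                     // DHCP on the (PoE-powered) studio network, assumed
  Udp.begin(9000);                         // local UDP port (assumed)
}

void sendZoom(float normalisedZoom) {
  OSCMessage msg("/cuc/zoom");             // assumed address pattern
  msg.add(normalisedZoom);
  Udp.beginPacket(unityIp, unityPort);
  msg.send(Udp);                           // serialise the OSC message into the UDP packet
  Udp.endPacket();
  msg.empty();                             // release the message payload for reuse
}

void loop() {
  // In the real firmware the value comes from the calibrated encoder reading;
  // a fixed placeholder keeps this sketch self-contained.
  sendZoom(0.5f);
  delay(40);                               // roughly 25 updates per second
}
```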

The virtual environment part is implemented solely in Unity, where the metadata sent by the microcontroller via OSC has to be mapped correctly. This is not as easy as it sounds: the zoom ring of the lens does not behave linearly but exponentially. The mapping nevertheless needs to be as precise as possible so that the virtual camera can flawlessly imitate the zoom of the physical camera.
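A common way to deal with such a non-linear relationship is a small lookup table measured at several known zoom positions, with interpolation in between. The sketch below illustrates that idea; it is written in C++ to match the firmware examples above, while the project's actual mapping lives in Unity. The end points follow from the lens designation (20x zoom starting at 8.2 mm, i.e. roughly 8.2-164 mm); the intermediate samples are placeholders that would have to be measured on the real lens.

```cpp
// Piecewise-linear lookup from the normalised zoom position to a focal length.
#include <algorithm>
#include <array>
#include <cstddef>
#include <utility>

// (normalised encoder position, focal length in mm) pairs.
// End points follow from the 20x / 8.2 mm lens designation; the middle values are placeholders.
const std::array<std::pair<float, float>, 5> zoomCurve = {{
    {0.00f,   8.2f},
    {0.25f,  14.0f},
    {0.50f,  28.0f},
    {0.75f,  70.0f},
    {1.00f, 164.0f},
}};

float focalLengthFor(float t) {
    t = std::clamp(t, 0.0f, 1.0f);
    for (std::size_t i = 1; i < zoomCurve.size(); ++i) {
        if (t <= zoomCurve[i].first) {
            // Linear interpolation between the two neighbouring samples.
            float span = zoomCurve[i].first - zoomCurve[i - 1].first;
            float a = (t - zoomCurve[i - 1].first) / span;
            return zoomCurve[i - 1].second + a * (zoomCurve[i].second - zoomCurve[i - 1].second);
        }
    }
    return zoomCurve.back().second;
}
```

The denser the table, the closer the interpolated curve gets to the real behaviour of the zoom ring, which is what makes the mapping precise enough for the virtual camera to follow the physical one.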