Studio Setup


Vision/Audio Path

SDI patches 91-94 are routed from the studio into cameras 1-4 on the ATEM Television Studio 4K in the control room.

The wire for camera 4 in the loom is dodgy and needs replacing; for now it's hard-patched. Marks.polakovs (talk) 11:21, 16 November 2022 (UTC)

The ATEM's program out is patched into SDI 0 on edit2's DeckLink for broadcast (usually through OBS).

Audio comes in through the XLR patches into the X32 Compact, and its program out is wired into the ATEM's XLR inputs.

VT Playout

VTs are played using CasparCG on edit2. Caspar is configured with two layers: layer 1 outputs onto SDI 1 on the DeckLink, which is patched into "camera" 8 on the ATEM. Layer 1 also outputs audio onto edit2's default audio device (usually the headphone interface, which is patched into the X32 Compact). Layer 2, meanwhile, outputs over NDI - this is added in OBS as an overlay on all sources, usually for pre-rendered lower thirds and such.
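For illustration, here is a minimal sketch of cueing VTs over CasparCG's AMCP protocol, the plain-text control protocol Caspar listens on (TCP port 5250 by default). The host, channel/layer numbers, and clip names are assumptions for the example, not the actual edit2 config, and it assumes the two "layers" above map to two separate Caspar channels.

import socket

CASPAR_HOST = "127.0.0.1"   # assumption: CasparCG running locally on edit2
CASPAR_PORT = 5250          # AMCP's default TCP port

def amcp(command: str) -> str:
    """Send one AMCP command and return the server's response."""
    with socket.create_connection((CASPAR_HOST, CASPAR_PORT), timeout=5) as sock:
        sock.sendall((command + "\r\n").encode("utf-8"))
        return sock.recv(4096).decode("utf-8")

# Roll a VT on the SDI output (patched into "camera" 8 on the ATEM)...
print(amcp('PLAY 1-10 "SOME_VT"'))

# ...and put a pre-rendered lower third on the NDI output that OBS overlays.
print(amcp('PLAY 2-10 "LOWER_THIRD"'))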

With this setup there's a bit of delay on VT audio - very small (<100ms) but noticeable. My theory is that this is simply software audio latency; it also runs the risk of picking up Windows system sounds unless we remember to mute them. This is, to quote Rhys, not ideal. Possible alternatives:
  • USB direct into the X32 - still has latency issues
  • Audio onto SDI direct into ATEM (set to audio-follows-video on Cam8) - means we can't mix VT audio on the X32, which is useful for levelling
  • Audio onto SDI, into a de-embedder, into the X32 - a bit spenny (the Decimator can do it, but it's overkill and a waste of a Decimator)
Further tinkering is required. Marks.polakovs (talk) 11:21, 16 November 2022 (UTC)

TV Graphics

Currently this is done using NDI: NDI Screen Capture runs on one of the PCs (usually edit3) and Studio Monitor runs on one of the laptops, which feeds the TV - and the same NDI source is also brought into OBS.

This is overkill, and requires three PCs. Beth suggested just using the ATEM's media players, aux'd into the TV. This is a good idea. Let's do that. Marks.polakovs (talk) 11:21, 16 November 2022 (UTC)

OnTime

OnTime is the system we use to keep time for live shows that run to specific timings. It usually runs on edit1 and is displayed on the iMac on the control room monitor wall and on the Mac Mini rigged in the studio lighting rig.