Re: Camera synchronization protocol
Posted: Mon Nov 12, 2018 4:58 am
Thanks for the link.

leith wrote (Fri Dec 21, 2007 8:29 pm): sync is a big deal and slightly complicated by the v100's framerate. I'll see if I can collect some resources to reference before I tackle it.
timecode is simpler.
http://en.wikipedia.org/wiki/Timecode
more specifically:
http://en.wikipedia.org/wiki/SMPTE_time_code
SMPTE is the most widely used and the de facto standard.
basically, what a mocap system likely needs to do with timecode is take a timecode feed and stamp it onto the frames coming out of the mocap system, so that those frames can later be synced to other media that also recorded the timecode.
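to make the stamping idea concrete, here's a minimal sketch. the `read_timecode` callback and `StampedFrame` record are hypothetical names, not any real mocap API; the point is just that every captured frame carries the house timecode it arrived under.

```python
from dataclasses import dataclass

@dataclass
class StampedFrame:
    frame_index: int   # local frame counter of the mocap system
    timecode: str      # house timecode at the instant of capture
    markers: list      # marker positions for this frame

def stamp(frame_index, markers, read_timecode):
    # The capture loop calls this once per incoming frame, so each
    # frame can later be aligned against any other timecoded media.
    return StampedFrame(frame_index, read_timecode(), markers)

frame = stamp(0, [], lambda: "01:00:00:00")
```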
usually, you have a house sync generator that generates a standard sync signal for all electronics. You also usually have a house timecode generator that generates timecode for all recording devices to stamp onto their media in sync.
such devices can include but are not limited to:
video recording decks (standard def and hi def)
digital film decks
dv video
window dubbed video (timecode is part of the picture)
DAT or other audio recording decks and devices
motion control device recorders
the end result is that when all this media needs to be reassembled and synchronized, it can be.
A very simple but very practical setup would be a timecode generator feeding both the mocap system and a window dub device that feeds into a tape deck. the window dub device will likely display a split screen of an on-set camcorder and the output from the mocap system. the window dub device will also burn-in the timecode at the bottom of the frame for reference. the TCGen is set to free run at the start of the day. Any time someone calls "roll", both the mocap system and the video deck are rolled.
The client takes the videotape back with them so they can see the mocap performance and the resulting capture. They can then request certain ranges be cleaned up and delivered, by referencing the timecode they see on the videotape. That timecode can live through the entire animation and editing process, and in a well-run post production the exact frames of mocap in use in any frame can be easily ascertained.
timecode has generally replaced or augmented the clapboard in most scenarios
http://en.wikipedia.org/wiki/Clapperboard
though it's probably worth noting that I do still own a clapboard, and I have in fact affixed mocap markers to it so I can record audio in sync with mocap. This is not an ideal solution, however.
what you'd typically want is a timecode input to the mocap system (usually an XLR or phone jack carrying LTC, linear timecode). You'd want that timecode to be stamped to every frame that comes in.
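for reference, the time fields inside one LTC frame can be decoded like this. the bit positions follow the published SMPTE 12M 80-bit LTC layout; this is a sketch of the field extraction only (function names are mine), not a production decoder, and it assumes you've already recovered and frame-aligned the bits from the audio signal.

```python
# Last 16 bits of every LTC frame: the fixed sync word.
SYNC_WORD = [0,0,1,1,1,1,1,1,1,1,1,1,1,1,0,1]

def field(bits, start, width):
    # LTC fields are transmitted least-significant bit first.
    return sum(bits[start + i] << i for i in range(width))

def decode_ltc_frame(bits):
    """bits: list of 80 ints (0/1). Returns (hh, mm, ss, ff, drop_frame)."""
    assert len(bits) == 80
    assert bits[64:80] == SYNC_WORD, "not aligned on an LTC frame"
    ff = field(bits, 0, 4)  + 10 * field(bits, 8, 2)   # frame units + tens
    ss = field(bits, 16, 4) + 10 * field(bits, 24, 3)  # seconds
    mm = field(bits, 32, 4) + 10 * field(bits, 40, 3)  # minutes
    hh = field(bits, 48, 4) + 10 * field(bits, 56, 2)  # hours
    drop = bool(bits[10])                              # drop-frame flag
    return hh, mm, ss, ff, drop
```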
beyond what OptiTrack should do, in an application you'd at the very least want to be able to see the stamped timecode in realtime, as well as during editing.
You'd probably want some features for organizing your takes via timecode. You'd probably also want to be able to export the frames from a specific range. Typical mocap and animation formats DO NOT have a concept of timecode. So typically, you'll export a take by its slate name and then provide a .txt file indicating that the first frame is at timecode (xx:xx:xx;xx). Or, the filename you'd export would use a naming convention to indicate the TC of the first frame (ShootDay1_xx_xx_xx-xx.c3d).
Note: it would be very important not to forget to indicate drop-frame vs. non-drop in such a convention.
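a sketch of one such naming convention. the slate name, field order, and file extension here are illustrative only, not a standard; the one real convention in play is that SMPTE marks drop-frame timecode with a semicolon before the frame field and non-drop with a colon.

```python
def tc_string(hh, mm, ss, ff, drop_frame):
    # SMPTE display convention: ";" before the frame field means
    # drop-frame, ":" means non-drop.
    sep = ";" if drop_frame else ":"
    return f"{hh:02d}:{mm:02d}:{ss:02d}{sep}{ff:02d}"

def export_filename(slate, hh, mm, ss, ff, drop_frame, ext="c3d"):
    # Filenames can't contain ":" on most systems, so encode the
    # separators, keeping a distinct marker for drop vs. non-drop.
    sep = "-" if drop_frame else "_"
    return f"{slate}_{hh:02d}_{mm:02d}_{ss:02d}{sep}{ff:02d}.{ext}"

print(tc_string(14, 3, 21, 17, drop_frame=True))        # 14:03:21;17
print(export_filename("ShootDay1", 14, 3, 21, 17, True)) # ShootDay1_14_03_21-17.c3d
```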
you'd probably also want the ability to trigger a recording (and/or other software functions) from incoming timecode values. a mocap system as part of a live performance would benefit from this capability. Many complex live performances run off of timecode.
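a minimal sketch of timecode-armed triggering, assuming incoming timecode arrives as (hh, mm, ss, ff) tuples once per frame (all names here are hypothetical). one detail worth getting right: compare with >= rather than ==, so a dropped or skipped frame can't cause the trigger to be missed entirely.

```python
def make_trigger(target, action):
    """Fire `action` once when the incoming timecode reaches `target`.

    `target` and incoming values are (hh, mm, ss, ff) tuples, which
    Python compares field by field, matching timecode ordering.
    """
    fired = False
    def on_timecode(tc):
        nonlocal fired
        if not fired and tc >= target:
            fired = True
            action()
    return on_timecode

# Usage: arm a recording to start at 01:00:00:00.
started = []
trigger = make_trigger((1, 0, 0, 0), lambda: started.append(True))
for tc in [(0, 59, 59, 28), (0, 59, 59, 29), (1, 0, 0, 1)]:
    trigger(tc)
# started == [True]  (fired once, even though frame (1,0,0,0) never arrived)
```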
anyhow, that's a simple primer. I can get more in depth if you have any specific questions.