
Camera synchronization protocol

Posted: Tue Dec 18, 2007 7:26 am
by hpcv
How does the synchronization mechanism work? Is it possible to "take over" the clock from the "master" camera and supply our own synchronization pulse? What does such a pulse look like?

I'm asking this because we're planning to use the OptiTrack system with (at least) six V100's in a CAVE, where the active shutter glasses are also controlled through infrared flashes. The cameras' flashes will probably interfere with this. We hope to work out a synchronization mechanism in which the cameras will flash while the shutter glasses have a "dead" moment, right after a pulse.

Re: Camera synchronization protocol

Posted: Tue Dec 18, 2007 5:36 pm
by Birch
This is a question that has been coming up lately, so it's a good opportunity to cover some of the details involved. We'd like to work on this to find a solution, so if you are able to share the results of your testing that would be great.

The camera sync signal should only be generated by the (master) cameras; feeding them a signal generated by another source will probably not work for C120s and will definitely not work for V100s. It is possible to tap the sync-out of the last camera in the chain if you want to try to utilize it.

We have heard from customers that the cameras do not interfere with shutter glasses if the cameras are not operating in strobed mode, but not running the V100s in strobed mode degrades their tracking performance.

V100 cameras
VSYNC is active low, the duration is about 100 microseconds. After the VSYNC period there are a couple of additional transitions and then the sync signal will remain high until the next VSYNC.

In strobe mode with the default exposure value of 55, the IR illumination starts at the end of VSYNC and remains on for 55 scanlines (about 1 millisecond). Reducing the exposure below 55 will reduce the strobe duration accordingly. Increasing the exposure above 55 will increase the strobe duration until around 70 lines, at which point the strobe will not have any power left to discharge until the next frame.
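A rough numerical sketch of the strobe timing described above (the linear scanline-to-duration relationship and the 70-line saturation point are our reading of this post, not an official spec):

```python
VSYNC_US = 100            # V100 VSYNC low pulse, ~100 microseconds
US_PER_LINE = 1000 / 55   # 55 scanlines ~= 1 ms, so ~18.2 us per line

def strobe_duration_us(exposure):
    """Approximate V100 strobe duration in microseconds.

    The strobe tracks the exposure setting up to ~70 scanlines; beyond
    that the LEDs have no charge left to discharge until the next
    frame, so the duration saturates.
    """
    lines = min(exposure, 70)
    return lines * US_PER_LINE
```

With the default exposure of 55 this gives roughly 1000 µs, matching the "about 1 millisecond" figure above.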

In constant illumination mode the IR illumination will only be off during the VSYNC period.

[Image: V100 — multiple frames showing multiple VSYNCs]

[Image: V100 — VSYNC detail with additional transitions]

C120 cameras
VSYNC is active low, the duration is about 65 microseconds.

The IR illumination will only be off during the VSYNC period.

[Image: C120 — multiple frames showing multiple VSYNCs]

[Image: C120 — VSYNC detail]

Re: Camera synchronization protocol

Posted: Tue Dec 18, 2007 6:36 pm
by leith
While we're on the subject, you'll eventually need a solution for timecode input as well. timecode isn't usually used for sync and doesn't solve that problem, but it probably needs to be as close to the hardware as possible to avoid delays and errors.

seems to me, you may be looking at an external box that acts as the sync master to the cameras, takes in various sync signals that it can lock to (standard NTSC black burst will need to be one), and also takes in timecode that it stamps with the frame IDs and passes to the drivers.

this is the general solution you'll find on vicon systems and others. I'd love to come up with a better solution. I'm just stumped as to what it would be.

Re: Camera synchronization protocol

Posted: Fri Dec 21, 2007 5:37 pm
by Jim

We probably don't know enough about what is expected of the product here, so feel free to come up with ideas and specifics and we will work on it for sure.

Re: Camera synchronization protocol

Posted: Fri Dec 21, 2007 8:29 pm
by leith
sync is a big deal and slightly complicated by the v100's framerate. I'll see if I can collect some resources to reference before I tackle it.

timecode is simpler.

more specifically:

SMPTE is the most widely used and most standard.

basically, what a mocap system likely needs to do with timecode is take a timecode feed and stamp it onto the frames coming out of the mocap system, so that the frames can later be synced to other media that also stamped the timecode.

usually, you have a house sync generator, that's generating a standard sync signal for all electronics. You also usually have a house timecode generator that is generating timecode for all recording devices to record in sync with their media.

such devices can include but are not limited to:

video recording decks (standard def and hi def)
digital film decks
dv video
window dubbed video (timecode is part of the picture)
DAT or other audio recording decks and devices
motion control device recorders

the end result, is that when all this media needs to be reassembled and synchronized, it can be.

A very simple but very practical setup would be a timecode generator that is feeding both the mocap system and a window dub device that is feeding into a tape deck. the window dub device will likely display a split screen of an on-set camcorder and the output from the mocap system. the window dub device will also burn in the timecode at the bottom of the frame for reference. the TCGen is set to free run at the start of the day. Any time someone calls "roll", both the mocap system and the video deck are rolled.

The client takes the videotape back with them so they can see the mocap performance, and resulting capture. They can then request certain ranges be cleaned up and delivered, by referencing the timecode they see on the videotape. That timecode can live through the entire animation and editing process and the exact frames of mocap in use in any frame of a well run post production process can be easily ascertained.

timecode has generally replaced or augmented the clapboard in most scenarios.

though it's probably worth noting that I do still own a clapboard. and I have in fact affixed mocap markers to it so I can record audio in sync with mocap. This is not an ideal solution however :)

what you'd typically want is a timecode input to the mocap system (usually in the form of an XLR or phone jack carrying LTC). You'd want that timecode to be stamped to every frame that comes in.

beyond what optitrack should do, in an application, you'd at the very least want to be able to see the stamped timecode in realtime, as well as during editing.

You'd probably want some features for organizing your takes via timecode. You'd probably also want to be able to export the frames from a specific range. Typical mocap and animation formats DO NOT have a concept of timecode. So typically, you'll export a take by its slate name and then provide a .txt file indicating that the first frame is at timecode (xx:xx:xx;xx). Or, the filename you'd export would use a naming convention to indicate the TC of the first frame (ShootDay1_xx_xx_xx-xx.c3d).

Note: it would be very important not to forget to indicate drop vs. non-drop in such a convention.
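As a sketch of such a convention (the `;` vs `:` separator for drop-frame is standard SMPTE practice; the filename encoding itself is a made-up example, not an OptiTrack convention):

```python
def format_timecode(h, m, s, f, drop_frame=False):
    """Format an SMPTE timecode string; drop-frame conventionally uses
    ';' before the frame field, non-drop uses ':'."""
    sep = ";" if drop_frame else ":"
    return f"{h:02d}:{m:02d}:{s:02d}{sep}{f:02d}"

def export_filename(slate, h, m, s, f, drop_frame=False, ext="c3d"):
    """Encode the start timecode of a take into its file name.

    ':' is not filename-safe, so underscores are used; a '-' before
    the frame field marks drop-frame here (a hypothetical convention,
    chosen only to show that the distinction must be recorded).
    """
    sep = "-" if drop_frame else "_"
    return f"{slate}_{h:02d}_{m:02d}_{s:02d}{sep}{f:02d}.{ext}"
```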

you'd probably also want the ability to trigger a recording (and/or other software functions) from incoming timecode values. a mocap system as part of a live performance would benefit from this capability. Many complex live performances run off of timecode.
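A minimal sketch of timecode-driven triggering (the `(h, m, s, f)` tuple representation and the fps value are assumptions for illustration; comparing flat frame counts tolerates missed frames in the feed):

```python
def to_frames(tc, fps=30):
    """Convert an (h, m, s, f) timecode tuple to a flat frame count."""
    h, m, s, f = tc
    return ((h * 60 + m) * 60 + s) * fps + f

def should_trigger(current_tc, trigger_tc, fps=30):
    """True once the incoming timecode reaches or passes the trigger
    value; '>=' rather than '==' means a dropped frame in the incoming
    feed can't cause the trigger to be missed."""
    return to_frames(current_tc, fps) >= to_frames(trigger_tc, fps)
```

A caller would poll this against the live LTC reader each frame and start recording (or fire any other software function) the first time it returns True.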

anyhow, that's a simple primer. I can get more in depth if you have any specific questions.

Re: Camera synchronization protocol

Posted: Sun Dec 23, 2007 6:04 pm
by Jim

Great reply, as usual. Lots of good info to track down for me. I think we can work in timecode in the manner you suggest. We will need some sort of hardware interface for it, maybe into a camera, that would be nice. It is on the official list now! :)

Re: Camera synchronization protocol

Posted: Wed Oct 01, 2008 1:14 am
by florian

I also have the same question: I want to combine the OptiTrack V100 system with Stereographics shutter glasses. I checked yesterday and the glasses don't work in strobed mode. In continuous mode they work with illumination values below 7.
That means I actually lose a lot of the tracking range if I can only work with illumination 7, am I right?

I was also thinking of trying to pulse the IR-LEDs in a "dead" moment right after a shutter-glasses pulse. Has somebody tried this? Are there any other solutions?


Re: Camera synchronization protocol

Posted: Fri Nov 07, 2008 10:08 am
by jerome.ardouin

I need to find a way to synchronize IR shutter glasses with the camera strobe too.
On the ART tracking system, this is done by synchronizing the camera strobes/shutters to an external source, i.e. the shutter-glasses signal, or the genlock signal if you are using an nVidia GSync board.
In the second post Birch states that we definitely can't sync V100 cams to an external signal. When we look at the pattern in real time on a scope, the "couple of additional transitions" seem to be a timecode, 8 bits, counting from 0 to 255; the rest of the period does not vary from frame to frame. Thus, if we were able to generate a signal like this, including the timecode, would we be able to sync the camera array to my external signal, or is there another reason that makes this impossible?

If the cams detect impedance on the sync pin to toggle from slave to master, I don't see anything that prevents syncing to such an external source transparently. If the master cam is toggled from slave to master by a software command, this "external" mode should be enabled by software in RBToolkit and ARENA.

If someone can give me more info on this topic, I can try to build a box that generates V100-compatible timecode for a start; then I can add something to lock that timecode to an external source like the shutter-glasses signal.

Thanks for your answers

Re: Camera synchronization protocol

Posted: Fri Nov 07, 2008 5:40 pm
by VincentG
We are actively looking into this and hope to have camera sync input/output improvements available for testing soon.

Re: Camera synchronization protocol

Posted: Sat Nov 08, 2008 6:13 pm
by Jim

A bit more info:

We are happy to disclose how the V100 camera sync mechanism works, and we are writing a PDF on it now.

The sync is a 3.3V signal, through a buffer chip, to the FPGA.

The signal you are looking at is a 100Hz pulse for camera Sync, followed by 8 bits that are the Frame ID# (which rolls over) and another couple of bits for the Camera ID#, so they know how to automatically label themselves.
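The post doesn't give the exact bit layout, so the packing below is only a guess at how a rolling 8-bit Frame ID and a 2-bit Camera ID might share the word that follows the sync pulse (a sketch, not the V100 spec):

```python
def pack_sync_word(frame_id, camera_id):
    """Pack an 8-bit rolling frame ID and a 2-bit camera ID.

    Layout (assumed): frame ID in the high 8 bits, camera ID in the
    low 2 bits of a 10-bit word.
    """
    assert 0 <= frame_id <= 255 and 0 <= camera_id <= 3
    return (frame_id << 2) | camera_id

def unpack_sync_word(word):
    """Inverse of pack_sync_word: returns (frame_id, camera_id)."""
    return (word >> 2) & 0xFF, word & 0x03
```

Whatever the real layout is, the round trip is what matters: each camera on the chain can read the word, recover the frame number for stamping, and recover its own position for automatic labeling.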

We even have a mechanism to trigger sync from a UART, if you have a level converter to logic level. But, you need the spec from us on what commands to send.

Finally, we are re-writing the sync code now to make it much more useful for shutter-glasses type applications. We can now sync to external sources, with just an edge (you can select polarity), so if you can feed us a 3.3V or 5V logic level signal, we can sync to it. The cameras will basically be operating in a Triggered mode at that point, the edge will be the trigger.

Our goal is to support this first in the new release of the Tracking Tools 2.0. That will be out before the end of November.

For today, yes, you can run your cameras in LED Mode 7, which is continuous illumination. To get more range, you will need to increase your Exposure time, but that lets in more background light.

After we get this new support out, we are very open to feedback to make sure it works as you expect. Please PM me if you use the Tracking Tools and would like to be in on the testing of the new sync code.