Thanks for the link.

leith wrote: ↑ Fri Dec 21, 2007 8:29 pm
sync is a big deal and slightly complicated by the v100's framerate. I'll see if I can collect some resources to reference before I tackle it.
timecode is simpler.
http://en.wikipedia.org/wiki/Timecode
more specifically:
http://en.wikipedia.org/wiki/SMPTE_time_code
SMPTE is the most widely used and the most standardized.
basically, what a mocap system likely needs to do with timecode is take a timecode feed and stamp it onto the frames coming out of the mocap system, so that those frames can later be synced to other media that also recorded the timecode.
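A rough sketch of that idea (the frame structure and the LTC reader here are hypothetical stand-ins, not any actual OptiTrack API):

[code]
# Sketch: stamp the incoming timecode onto each mocap frame as it arrives,
# so the take can later be lined up against other timecoded media.
# StampedFrame and read_ltc_timecode() are illustrative, not an OptiTrack API.
from dataclasses import dataclass

@dataclass
class StampedFrame:
    frame_number: int   # frame index within the take
    timecode: str       # SMPTE timecode read from the house feed, e.g. "01:23:45:12"
    markers: list       # 3D marker data for this frame

def stamp_frames(mocap_frames, read_ltc_timecode):
    """Pair every captured frame with the timecode current at capture time."""
    take = []
    for i, markers in enumerate(mocap_frames):
        tc = read_ltc_timecode()  # read the house LTC feed at frame time
        take.append(StampedFrame(frame_number=i, timecode=tc, markers=markers))
    return take
[/code]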
usually, you have a house sync generator, that's generating a standard sync signal for all electronics. You also usually have a house timecode generator that is generating timecode for all recording devices to record in sync with their media.
such devices can include but are not limited to:
video recording decks (standard def and hi def)
digital film decks
dv video
window dubbed video (timecode is part of the picture)
DAT or other audio recording decks and devices
motion control device recorders
the end result is that when all this media needs to be reassembled and synchronized, it can be.
A very simple but very practical setup would be a timecode generator feeding both the mocap system and a window dub device that feeds into a tape deck. the window dub device will likely display a split screen of an on-set camcorder and the output from the mocap system. the window dub device will also burn in the timecode at the bottom of the frame for reference. the TCGen is set to free run at the start of the day. Any time someone calls "roll", both the mocap system and the video deck are rolled.
The client takes the videotape back with them so they can see the mocap performance, and resulting capture. They can then request certain ranges be cleaned up and delivered, by referencing the timecode they see on the videotape. That timecode can live through the entire animation and editing process and the exact frames of mocap in use in any frame of a well run post production process can be easily ascertained.
timecode has generally replaced or augmented the clapboard in most scenarios
http://en.wikipedia.org/wiki/Clapperboard
though it's probably worth noting that I do still own a clapboard, and I have in fact affixed mocap markers to it so I can record audio in sync with mocap. This is not an ideal solution, however.
what you'd typically want is a timecode input to the mocap system (usually in the form of an XLR or phone jack carrying LTC). You'd want that timecode to be stamped to every frame that comes in.
beyond what optitrack should do, in an application, you'd at the very least want to be able to see the stamped timecode in realtime, as well as during editing.
You'd probably want some features for organizing your takes via timecode. You'd probably also want to be able to export the frames from a specific range. Typical mocap and animation formats DO NOT have a concept of timecode. So typically, you'll export a take by its slate name and then provide a .txt file indicating that the first frame is at timecode (xx:xx:xx;xx). Or, the filename you'd export would use a naming convention to indicate the TC of the first frame (ShootDay1_xx_xx_xx-xx.c3d).
Note: it would be very important not to forget to indicate drop vs. non-drop frame in such a convention.
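A small sketch of one such convention (the helper names and the DF/NDF suffix are my own assumptions, following the semicolon-for-drop-frame notation above):

[code]
# Sketch of an export naming convention that encodes the timecode of the
# take's first frame and whether it is drop-frame. Drop-frame timecode is
# written with a semicolon before the frames field (xx:xx:xx;xx),
# non-drop with a colon (xx:xx:xx:xx).

def take_filename(shoot_day: str, first_frame_tc: str, drop_frame: bool) -> str:
    tc_part = first_frame_tc.replace(":", "_").replace(";", "-")
    suffix = "DF" if drop_frame else "NDF"
    return f"{shoot_day}_{tc_part}_{suffix}.c3d"

def sidecar_line(take_name: str, first_frame_tc: str, drop_frame: bool) -> str:
    # Companion .txt entry noting the timecode of the take's first frame.
    kind = "drop-frame" if drop_frame else "non-drop-frame"
    return f"{take_name}: first frame at {first_frame_tc} ({kind})"

print(take_filename("ShootDay1", "01:23:45;12", drop_frame=True))
# -> ShootDay1_01_23_45-12_DF.c3d
[/code]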
you'd probably also want the ability to trigger a recording (and/or other software functions) from incoming timecode values. a mocap system as part of a live performance would benefit from this capability. Many complex live performances run off of timecode.
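A rough sketch of that trigger idea (the timecode reader and the record function are hypothetical stand-ins):

[code]
# Sketch: start a recording when the incoming timecode reaches a preset
# trigger value, e.g. for a mocap system running as part of a live show.
# read_ltc_timecode() and start_recording() are illustrative placeholders.

def tc_to_frames(tc: str, fps: int = 30) -> int:
    """Convert "hh:mm:ss:ff" to an absolute frame count (non-drop counting)."""
    hh, mm, ss, ff = (int(x) for x in tc.replace(";", ":").split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def wait_for_trigger(read_ltc_timecode, start_recording, trigger_tc: str, fps: int = 30):
    target = tc_to_frames(trigger_tc, fps)
    while tc_to_frames(read_ltc_timecode(), fps) < target:
        pass  # keep polling the incoming LTC feed
    start_recording()  # fire once the trigger timecode has been reached
[/code]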
anyhow, that's a simple primer. I can get more in depth if you have any specific questions.
Re: Camera synchronization protocol
The Precision Time Protocol (PTP, IEEE 1588) is used to synchronise several cameras on an Ethernet network; it enables precise clock synchronisation of multiple devices across an Ethernet system.
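For reference, the clock correction in PTP's delay request/response exchange reduces to two simple formulas; a minimal sketch (timestamps t1..t4 as defined by IEEE 1588):

[code]
# Minimal sketch of the IEEE 1588 delay request/response math.
# t1: master sends Sync, t2: slave receives Sync,
# t3: slave sends Delay_Req, t4: master receives Delay_Req.
# All timestamps in seconds.

def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2.0
    offset_from_master = ((t2 - t1) - (t4 - t3)) / 2.0
    return offset_from_master, mean_path_delay

# Example: slave clock 1.5 ms ahead of the master, one-way path delay 0.5 ms.
offset, delay = ptp_offset_and_delay(t1=0.000, t2=0.002, t3=0.010, t4=0.009)
print(offset, delay)  # 0.0015 0.0005
[/code]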
The V100 camera sync mechanism works very well and I find it reliable. It also looks good, with an attractive logo. Using Tracking Tools would be a good idea as well.
Regards,
Jes81
Re: Camera synchronization protocol
But can you access those groupings from an application developed with the API? For example, can you set up 8 different cameras, each with a slightly different timing, from within Tracking Tools, and then run your own application with that saved setup and get data from each camera at the timing you specified previously?
Re: Camera synchronization protocol
Thanks for the info. It has some great features, but it doesn't do quite what I'm looking for.
I have an application requiring XY detection of a high volume of laser spots on a projection screen.
In some cases the volume is so high that a number of spots occur in a single frame.
We have a theory that if we could sync two of the V100 SLIM cameras 90 degrees out of phase with each other (i.e. a half-frame separation in timing), we could resolve some of our issues.
Is the sync timing for the cameras a simple clock pulse or is there data in the sync pulse?
How is the sync pulse activated? i.e. what determines which camera is the Master?
With the camera running, I connected an oscilloscope to the sync out/in lines on the V100 and only saw consistent high voltages of 4.7 V and 3.3 V (I don't remember which line was which),
but I did not see anything that resembled a clock pulse.
Should I see some sort of signal on these lines with the camera running?
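For reference, the offset we are after works out like this (assuming the V100's nominal 100 FPS):

[code]
# Quick arithmetic for the half-frame offset between the two cameras,
# assuming the V100's nominal 100 FPS frame rate.
fps = 100.0
frame_period_ms = 1000.0 / fps                # 10 ms per frame
half_frame_offset_ms = frame_period_ms / 2.0  # 5 ms offset between cameras

print(frame_period_ms, half_frame_offset_ms)  # 10.0 5.0
[/code]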
Re: Camera synchronization protocol
The camera sync signal should only be generated by the (master) cameras; feeding them a signal generated by another source will probably not work for C120s and will definitely not work for V100s. It is possible to tap the sync-out of the last camera in the chain if you want to try to utilize it.
We have heard from customers that the cameras do not interfere with shutter glasses if the cameras are not operating in strobed mode, but not running the V100s in strobed mode degrades their tracking performance.
V100 cameras
VSYNC is active low, the duration is about 100 microseconds. After the VSYNC period there are a couple of additional transitions and then the sync signal will remain high until the next VSYNC.
In strobe mode with the default exposure value of 55, the IR illumination starts at the end of VSYNC and remains on for 55 scanlines (about 1 millisecond). Reducing the exposure below 55 will reduce the strobe duration accordingly. Increasing the exposure above 55 will increase the strobe duration until around 70 lines, at which point the strobe will not have any power left to discharge until the next frame.
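Rough numbers implied by the above (the per-scanline time is derived from the "55 scanlines is about 1 millisecond" figure, and the 70-line cap follows the note above):

[code]
# Rough strobe-duration estimate from the figures above: 55 scanlines is about
# 1 ms, so one scanline is roughly 1/55 ms (~18 microseconds). Beyond roughly
# 70 lines the strobe has no charge left, so the duration is capped there.
MS_PER_SCANLINE = 1.0 / 55.0  # derived from "55 scanlines ~= 1 millisecond"
STROBE_LINE_CAP = 70          # approximate limit quoted above

def strobe_duration_ms(exposure_lines: int) -> float:
    effective_lines = min(exposure_lines, STROBE_LINE_CAP)
    return effective_lines * MS_PER_SCANLINE

print(round(strobe_duration_ms(55), 2))   # 1.0  (default exposure, ~1 ms)
print(round(strobe_duration_ms(100), 2))  # 1.27 (capped near 70 lines)
[/code]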
Re: Camera synchronization protocol
Fantastic to hear that the suggested approach aligns with your plans! Integrating timecode with a hardware interface, possibly into a camera, sounds like a promising step. This addition to the official list signifies a commitment to enhancing functionality and precision.