I'm trying to figure out the time delay between the actual "snapshot" (i.e. the precise moment of exposure) of my synced V100R2 cameras and the time at which I actually process the marker data in my application.
I am successfully reading the timestamps of individual frames (CameraLibrary::Frame). To work out how long ago the image was taken, I'd need some kind of synced/coordinated "global time" shared between my computer and the camera.
One way to do this would be to reset the camera's (or the camera library's) internal clock: I'd call this reset function and immediately reset my own time counter. Whenever I then process image data, I check my own time, subtract the frame's timestamp, and know how long it's been since the moment of exposure.
(I need this to extrapolate a trajectory, or at least to estimate where my tracked object is now, as opposed to where it was at capture time.)
Anyway, so here are my questions:
1. Can I reset a camera's internal time counter / timestamp, and does the reset take effect immediately?
2. Right now I'm working with CameraLibrary::FrameGroup objects, but I'm actually using the timestamps of the CameraLibrary::Frame objects (which I get via FrameGroup::GetFrame(i)). Should I switch to the FrameGroup::TimeStamp() method instead? Does it make a difference?
3. What do FrameGroup::TimeSpread() and SetTimeSpread() do (found in modulesync.h)?
4. What is FrameGroup::TimeSpreadDeviation(int index)?
5. What do EarliestTimeStamp() and SetEarliestTimeStamp() do?