Weird frame buffer data (TT_CameraFrameBuffer)

iko79
Posts: 11
Joined: Fri Apr 02, 2010 5:31 am

Weird frame buffer data (TT_CameraFrameBuffer)

Post by iko79 »

Hi,

I'm using the Tracking Tools API 2.1 and three FLEX:V100R2 cameras. I haven't yet placed the cameras at the suggested distance from each other, nor calibrated the setup -- for starters I'm simply trying to create three OpenGL rendering views showing the captured image from each camera by reading each camera's frame buffer and uploading it to an OpenGL texture -- much like the Tracking Tools application's Camera Investigation mode.

But I am not sure how to use TT_CameraFrameBuffer correctly. So far I have an array of 640x480 bytes and pass a pointer to it to TT_CameraFrameBuffer to be filled -- simple enough.

Code:

// Set the video mode to grayscale (video type 1); the remaining
// arguments are exposure, threshold, and intensity.
if( !TT_SetCameraSettings( this->index, 1, 0, 150, 15 ) )
	throw std::runtime_error( "setting TrackingTools camera settings failed!" );

// ...

// Rasterize the camera's view into 'data': 640x480 pixels, 8 bits per pixel.
if( !TT_CameraFrameBuffer( this->index, 640, 480, 0, 8, data ) )
{
	std::cerr << "[WARNING] Flex:V100R2 rasterization failed" << std::endl;
	return false;
}
If I use the data array to update my OpenGL texture I get some confusing results. Right after starting the application I see this:
[screenshot: only a narrow band at the top of the texture contains image data]

This may be due to latency from the TT_SetCameraSettings call (?). The dimensions seem to be okay so far (although I don't know why I only see a fraction of the frame buffer). After two seconds I get this:
[screenshot: image data confined to the upper-left 320x240 region of the 640x480 texture]

TT_CameraFrameBuffer seems to fill only the upper-left corner (320x240 px) of the array. A few seconds later (I cannot reliably reproduce this, however) I sometimes get this:
[screenshot: image data confined to the upper-left 160x120 region of the texture]

Here an area of 160x120 gets filled. My OpenGL code is working, the texture and quad dimensions are correct, and I double-checked the data array -- only the upper-left corner is touched by TT_CameraFrameBuffer.

What is wrong here?

I also saw in the Tracking Tools application ("Camera Investigation") that there are three quality modes available for grayscale mode. But how do I set these modes using the API?

Thanks a lot!

--iko79.
beckdo
Posts: 520
Joined: Tue Jan 02, 2007 2:02 pm

Re: Weird frame buffer data (TT_CameraFrameBuffer)

Post by beckdo »

Nice job putting this together. I can tell you exactly what's happening here. Unfortunately, when you try to run multiple video streams in raw grayscale mode, the USB bus can become overloaded with large packets of uncompressed data (or even a single raw stream, depending on what other USB devices you have connected and/or the performance of your machine). The Tracking Tools API automatically tries to 'decimate' the image by spatially downsampling the raw grayscale so that each frame is smaller.

I highly recommend running video mode 6 (MJPEG) instead of video mode 1 (raw grayscale). I think if you try this you'll start getting normal full-rate, full-frame video.
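
For reference, a minimal sketch of that switch using the same call from your first post (the exposure/threshold/intensity values 0, 150, 15 are just carried over from your snippet, not tuned recommendations):

Code:

// Sketch: switch the camera to MJPEG (video type 6). The trailing
// exposure/threshold/intensity values are carried over from the
// snippet above; tune them for your own setup.
if( !TT_SetCameraSettings( this->index, 6, 0, 150, 15 ) )
	throw std::runtime_error( "switching camera to MJPEG mode failed!" );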
iko79
Posts: 11
Joined: Fri Apr 02, 2010 5:31 am

Re: Weird frame buffer data (TT_CameraFrameBuffer)

Post by iko79 »

Thank you for your reply! I am somewhat out of action right now since I have trouble with my TT license, but I remember that I tried all modes on Saturday and all of them behaved the same.


Okay, now that my license problem is solved I was able to double-check this -- you were right, this only happens in grayscale mode.


Since I cannot do anything on my project right now, I would like to use the time to ask several questions, as there are many things I could not deduce from the documentation. Unfortunately I am in a hurry with my project, so I really have to understand what I'm doing; there is little time for experimentation. More extensive documentation would be very valuable...

1) Is there no way to reduce the capture frame rate in code, so that this automatic down-scaling does not happen?

2) When down-scaling is performed due to insufficient USB bandwidth, does this also affect the marker tracking on the camera? (As far as I understand, the camera does 2D tracking of blobs in hardware and the 3D tracking happens in software, right?)

3) Is there a way to get notified when down-scaling is performed, so I can react by simply upscaling my OpenGL texture?

4) In the Tracking Tools application you deliver with the SDK this does not seem to be an issue, so why does it happen in my software? (I am using an OptiHub, if that is of interest here.) In your application I can see the frames from all three cameras in full size (capturing every frame), in every available video type -- except for "Segment Mode", where I can only see the topmost tenth to third of the frame (at most), somewhat like in the first screenshot I posted above. Why is that?

5) Is the numbering of the video modes in the Tracking Tools 2.0 manual still up to date? (0=Segment, 1=Grayscale, 2=Object, 4=Precision, 6=MJPEG) In the Camera Investigation mode of the Tracking Tools application I noticed grayscale modes with high, medium, and low quality, but could not find anything about them in the documentation. Also, there is a "Lens Focus Mode". What's that?

6) What do the different video modes mean anyway? Does this setting affect the tracking process on the camera as well, or does it just specify the video information I get from TT_CameraFrameBuffer()? What exactly are Segment Mode, Precision Mode, and Object Mode? Section 5.2.6.11 TT_CameraFrameBuffer() says that "the resulting Image depends on what video mode the camera is in. If the camera is in grayscale mode, for example, a grayscale image is returned from this call." Well, judging from the Tracking Tools Camera Investigation view, a grayscale image is returned in all cases. I could take a guess at what Object Mode might be, although I'm not sure I'm right, but what is the difference between Segment Mode, Precision Mode, and Grayscale Mode?

7) Section 5.2.6.11 TT_CameraFrameBuffer() also says that the width and height parameters specify the dimensions of the frame buffer for rasterization, which suggests that a resampling step is performed (at least that is how I would interpret it). So if I pass a 320x240-byte buffer and set the width and height parameters to 320 and 240 respectively, I would expect the whole (downsized) frame buffer content in my 320x240 px image. However, it seems I get the upper-left quarter of the whole 640x480 frame buffer, with NO resampling. Did I misread the documentation, or is the problem elsewhere?

8) All these things must be documented somewhere, mustn't they? After all, I would not consider, e.g., the video modes common knowledge. Is there documentation available that answers questions of this kind? Unfortunately I could not find any. Maybe I am missing an important document?

Thanks for your help and kind regards,
--iko79
beckdo
Posts: 520
Joined: Tue Jan 02, 2007 2:02 pm

Re: Weird frame buffer data (TT_CameraFrameBuffer)

Post by beckdo »

1) Is there no way to reduce the capture frame rate in code, so that this automatic down-scaling does not happen?

You can reduce the camera frame rate, but full-frame grayscale is still problematic.
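
A sketch of what that might look like, assuming your NPTrackingTools build exposes a per-camera frame rate setter (TT_SetCameraFrameRate exists in later versions of the API; check your header before relying on it):

Code:

// Sketch, assuming the API exposes TT_SetCameraFrameRate (present in
// later NPTrackingTools versions; verify against your header). Drops
// the camera to 50 fps to ease the USB load.
if( !TT_SetCameraFrameRate( this->index, 50 ) )
	std::cerr << "[WARNING] setting camera frame rate failed" << std::endl;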

2) When down-scaling is performed due to insufficient USB bandwidth, does this also affect the marker tracking on the camera? (As far as I understand, the camera does 2D tracking of blobs in hardware and the 3D tracking happens in software, right?)

Since the down-scaling only occurs in grayscale, 2D marker data is not affected.

3) Is there a way to get notified when down-scaling is performed, so I can react by simply upscaling my OpenGL texture?

I added TT_CameraGrayscaleDecimation() to allow you to probe the level of down-scaling: 0=640x480, 1=320x240, 2=160x120.
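
A minimal sketch of using that probe to size your texture (resizeTexture() here is a hypothetical stand-in for however you (re)allocate your OpenGL texture):

Code:

// Probe the current grayscale decimation level and derive the
// effective frame size: 0 = 640x480, 1 = 320x240, 2 = 160x120.
int decimation = TT_CameraGrayscaleDecimation( this->index );
int width  = 640 >> decimation;
int height = 480 >> decimation;

// resizeTexture() is a hypothetical helper; upscale the decimated
// frame when rendering so the quad stays the same size on screen.
resizeTexture( width, height );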

4) In the Tracking Tools application you deliver with the SDK this does not seem to be an issue, so why does it happen in my software? (I am using an OptiHub, if that is of interest here.) In your application I can see the frames from all three cameras in full size (capturing every frame), in every available video type -- except for "Segment Mode", where I can only see the topmost tenth to third of the frame (at most), somewhat like in the first screenshot I posted above. Why is that?

This happens in the software as well, but the Tracking Tools application automatically scales the display, as you were planning to implement.

For Precision Mode there is a packet cap to ensure that output from one camera does not affect tracking on the other cameras. This is exposed in the Camera Group Settings pane under Advanced; you will see Precision Cap, which defaults to ~20K. Set it to 400,000 and hold your hand over the lens to confirm there is no cap.

5) Is the numbering of the video modes in the Tracking Tools 2.0 manual still up to date? (0=Segment, 1=Grayscale, 2=Object, 4=Precision, 6=MJPEG) In the Camera Investigation mode of the Tracking Tools application I noticed grayscale modes with high, medium, and low quality, but could not find anything about them in the documentation. Also, there is a "Lens Focus Mode". What's that?

Yes, that numbering is current, and I have added the enumeration to the Tracking Tools API. Please reference the OptiTrack SDK documentation for more information on the video modes.

6) What do the different video modes mean anyway? Does this setting affect the tracking process on the camera as well, or does it just specify the video information I get from TT_CameraFrameBuffer()? What exactly are Segment Mode, Precision Mode, and Object Mode? Section 5.2.6.11 TT_CameraFrameBuffer() says that "the resulting Image depends on what video mode the camera is in. If the camera is in grayscale mode, for example, a grayscale image is returned from this call." Well, judging from the Tracking Tools Camera Investigation view, a grayscale image is returned in all cases. I could take a guess at what Object Mode might be, although I'm not sure I'm right, but what is the difference between Segment Mode, Precision Mode, and Grayscale Mode?

Please check the OptiTrack SDK for more info. From there I can answer any more specific questions you have.

7) Section 5.2.6.11 TT_CameraFrameBuffer() also says that the width and height parameters specify the dimensions of the frame buffer for rasterization, which suggests that a resampling step is performed (at least that is how I would interpret it). So if I pass a 320x240-byte buffer and set the width and height parameters to 320 and 240 respectively, I would expect the whole (downsized) frame buffer content in my 320x240 px image. However, it seems I get the upper-left quarter of the whole 640x480 frame buffer, with NO resampling. Did I misread the documentation, or is the problem elsewhere?

The Tracking Tools API rasters into the buffer provided, in the format specified. Passing in the size just allows the API to validate that a buffer of sufficient size has been submitted. If the buffer is not large enough (say 320x240), the API will raster only the upper-left portion rather than stomping on memory outside the buffer.
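
In other words, to receive the full frame, hand in a buffer that actually covers 640x480 at 8 bits per pixel, for example:

Code:

#include <vector>

// Full-size 8-bit buffer: 640 * 480 bytes. The width/height arguments
// describe the buffer so the API can validate its size; no resampling
// is performed on the image.
std::vector<unsigned char> data( 640 * 480 );
if( !TT_CameraFrameBuffer( this->index, 640, 480, 0, 8, &data[0] ) )
	std::cerr << "[WARNING] rasterization failed" << std::endl;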

8) All these things must be documented somewhere, mustn't they? After all, I would not consider, e.g., the video modes common knowledge. Is there documentation available that answers questions of this kind? Unfortunately I could not find any. Maybe I am missing an important document?

Please reference the OptiTrack SDK for general camera information. Also, in the Tracking Tools application you can check Help->Topic Index.