Computer performance affected with SDK

maikelnai
Posts: 21
Joined: Thu May 22, 2014 1:25 am

Computer performance affected with SDK

Post by maikelnai »

Dear all,

We have upgraded our software to use the new SDK. Our system normally uses 8 cameras working in pairs, with a 2.5 ms delay between groups. This is the routine we use to set them up:


    /// Necessary to use the methods from the cameras
    //CameraLibrary_EnableDevelopment();
    CameraLibrary::CameraManager::X().WaitForInitialization();

    CameraLibrary::CameraList list;
    CameraLibrary::CameraManager::X().GetCameraList(list);
    LONG m_CamerasInSystem = (LONG)(list.Count());

    /// For each camera in the system, obtain a Camera object and store it in the vector
    for (int c = 0; c < m_CamerasInSystem; c++) {

        /// The camera object
        CameraLibrary::Camera *AuxCamera = CameraLibrary::CameraManager::X().GetCamera(list[c].UID());

        if (AuxCamera) {

            /// Read some information from the camera itself
            m_lSerial = (LONG)(AuxCamera->Serial());
            m_vCamerasSerial.push_back(m_lSerial);
            m_lImageHeight = (LONG)(AuxCamera->Height());
            m_lImageWidth = (LONG)(AuxCamera->Width());

            /// Create the frame object (CFrameObject inherits from CameraLibrary::cCameraListener)
            this->m_mTheFrameObjects[m_lSerial] = new CFrameObject(AuxCamera);

            /// Set the default values for correct operation of the cameras
            //AuxCamera->SetVideoType(CameraLibrary::SegmentMode);
            AuxCamera->SetVideoType(Core::ObjectMode);
            AuxCamera->SetGrayscaleDecimation(0);
            AuxCamera->SetFrameDecimation(0);

            /// The threshold
            AuxCamera->SetThreshold(this->AdquisitionConfig->m_iCameraThreslhold);

            /// The frame rate
            AuxCamera->SetFrameRate(this->AdquisitionConfig->m_iCameraFrameRate);

            /// Get the scale for the exposure
            int MinExposure = AuxCamera->MinimumExposureValue();
            int MaxExposure = AuxCamera->MaximumExposureValue();
            m_ExposureRate = (float)((MaxExposure - MinExposure) * this->AdquisitionConfig->m_iCameraFrameRate / 1000);

            /// The exposure, depending on whether there is a delay between the camera groups
            if (this->AdquisitionConfig->m_iDelaySystem == 1)
                AuxCamera->SetExposure((int)(m_ExposureRate * this->AdquisitionConfig->m_fDelayExposure));
            else
                AuxCamera->SetExposure(this->AdquisitionConfig->m_iCameraExposure);

            /// Enable the IR filter
            AuxCamera->SetIRFilter(true);

            /// Set the delay for each camera when the delay system is enabled
            if (this->AdquisitionConfig->m_iDelaySystem) {

                /// Gap between consecutive groups, in microseconds
                int DelayGapInUs = 1000000 / (this->AdquisitionConfig->m_iCameraFrameRate * this->AdquisitionConfig->m_iDelayNumberOfGroups);

                /// Apply the delay according to the camera's group
                std::map<LONG, int>::iterator iterDelayGroup = this->AdquisitionConfig->m_mDelayGroups.find(m_lSerial);
                if (iterDelayGroup != this->AdquisitionConfig->m_mDelayGroups.end()) {
                    DelayGapInUs = DelayGapInUs * ((*iterDelayGroup).second - 1);
                    AuxCamera->SetShutterDelay(DelayGapInUs);
                }
            }

            /// No delay otherwise
            else {
                AuxCamera->SetShutterDelay(0);
            }

            /// Switch off the LEDs
            if (this->AdquisitionConfig->m_iCameraLed == 0) {
                AuxCamera->SetIntensity(0);
            }

            /// Attach the callback for the camera frames
            AuxCamera->AttachListener(this->m_mTheFrameObjects[m_lSerial]);
            AuxCamera->SendEmptyFrames(false);
            AuxCamera->SendInvalidFrames(false);

            /// Start the camera
            AuxCamera->Start();

            /// Add the camera to the list
            m_listCamera.push_back(AuxCamera);
        }
    }
This is basically the same routine we used with the previous API (OptiTrack_1.3.037.B), translated to the new functions.

The other change we have made is the way we obtain new frames: we now use the callback implemented through a CameraLibrary::cCameraListener. This works fine, and the FrameAvailable() method is only called when a new frame is available.

However, the resources our application uses (basically processor time) have increased from 0-5% in the old version to 30-40% in the new version while no frames are being processed (a completely dark room). Just calling the Start method on the CameraLibrary::Camera object makes the application start consuming processor time, even though no calls to the FrameAvailable method are being made.

In the old version, we had one thread per camera calling GetFrame(TimeToWait, &AuxFrame).

Can the cameras be configured to reduce processor time consumption?
NaturalPoint-Dustin
Posts: 609
Joined: Tue Mar 19, 2013 5:03 pm

Re: Computer performance affected with SDK

Post by NaturalPoint-Dustin »

Hello,

I will research this and get back to you. Thank you for posting this.
Dustin
Technical Support Engineer
OptiTrack | TrackIR | SmartNav
beckdo
Posts: 520
Joined: Tue Jan 02, 2007 2:02 pm

Re: Computer performance affected with SDK

Post by beckdo »

Hi maikelnai,

This CPU utilization is unusually high. The Camera SDK is designed to be a very light layer and utilize the absolute minimum CPU in every possible configuration.

If you create a synchronizer, attach your cameras, and do nothing else, you should expect that the Camera SDK will hum along around 1% CPU utilization. Even with a few hundred objects per camera in Object Mode it should remain around 1% CPU--even in a Debug build.

If you have a simple project setup that you are convinced demonstrates otherwise please send it over and we will determine what is consuming CPU and why.

Thanks,
Doug
Attachments
2014-12-01 18_33_45-Windows Task Manager.png
2014-12-01 18_33_45-Windows Task Manager.png (78.51 KiB) Viewed 7447 times
maikelnai
Posts: 21
Joined: Thu May 22, 2014 1:25 am

Re: Computer performance affected with SDK

Post by maikelnai »

Hello beckdo,

Attached is a simple project where the problem can be seen; I ran the test with 8 cameras (and with a different set of 8 cameras as well). Also attached are screenshots showing the CPU utilization before the application runs, while it runs without light, while it runs with light, and after it finishes.

I have removed CameraLibrary2008S.dll and CameraLibrary2008S.lib from the project because of their size. I'm using the OptiTrack_Camera_SDK_1.7.0_Final.exe version.
Thanks.
Attachments
SDKCameraTest.rar
VS2008 Project
(98.02 KiB) Downloaded 307 times
Screenshots.rar
Screenshots
(190.95 KiB) Downloaded 302 times
beckdo
Posts: 520
Joined: Tue Jan 02, 2007 2:02 pm

Re: Computer performance affected with SDK

Post by beckdo »

Hey maikelnai,

Thanks for sending this over. When I compile & run this unmodified for either Debug or Release with my cameras in an attempt at a worst case scene (trying for hundreds of objects per camera) my CPU utilization barely tops 1% but most of the time it's listed at 0% (see attached). I think one of the most expensive things your application is doing is the printf statement with every frame received from every camera. I would recommend throttling or removing that. Judging by your screenshots it looks like you've already tried running with that.

I think we need to talk about the specs of the machine you are running this application on. One thing about USB 2.0 is that there is some noticeable CPU overhead simply from having data transfer over USB 2.0 through the host controller and into userland code. This has been reduced significantly on more modern machines.

Thanks,
Doug
Attachments
lowcpu.png
lowcpu.png (87.58 KiB) Viewed 7428 times
maikelnai
Posts: 21
Joined: Thu May 22, 2014 1:25 am

Re: Computer performance affected with SDK

Post by maikelnai »

Hello Beckdo,

The data sheet of the computer is attached. It runs 32-bit Windows 7 Professional. The same computer was used with the previous version and it worked properly.

We connect:
- 4 cameras to one OptiHub
- 4 cameras to the other OptiHub
- Both OptiHubs to the computer
- The OptiHubs to each other with an RCA cable for synchronization.

We have tested with two different computers and with the same results.

printf is only called when the callback is triggered; most of the time this doesn't happen, and the CPU consumption is still around 20%.

Regards
maikelnai
Posts: 21
Joined: Thu May 22, 2014 1:25 am

Re: Computer performance affected with SDK

Post by maikelnai »

Excuse me; I forgot to attach the file.

http://www.axiomtek.com/Download/Spec/e ... 831-fl.pdf

Regards,
NaturalPoint-Dustin
Posts: 609
Joined: Tue Mar 19, 2013 5:03 pm

Re: Computer performance affected with SDK

Post by NaturalPoint-Dustin »

I am not surprised by the excess CPU load with the system specifications you provided. For an 8 camera system we recommend using an Intel 2.5+ GHz i5.
Dustin
Technical Support Engineer
OptiTrack | TrackIR | SmartNav
beckdo
Posts: 520
Joined: Tue Jan 02, 2007 2:02 pm

Re: Computer performance affected with SDK

Post by beckdo »

Hey maikelnai,

I think you've demonstrated that while it's possible for Intel Atom-based machines to utilize the Camera SDK, they are very underpowered for this type of application.

I don't have many suggestions if you're required to use this as your platform. Among the few options you have are reducing the number of cameras or reducing the frame rate. It's very possible that our Ethernet cameras can run at lower CPU utilization, because they communicate over sockets instead of the USB host controller + USB driver architecture.

Good Luck,
Doug
maikelnai
Posts: 21
Joined: Thu May 22, 2014 1:25 am

Re: Computer performance affected with SDK

Post by maikelnai »

OK,
thank you for your support.

But with the COM interface version of the SDK, the same application ran on that computer with insignificant CPU consumption.

Regards,
Post Reply