Clarification on the default video buffer setting in the Smart Client and the smoothBufferSize parameter of the SDK's SetVideoQuality member function

Dear Milestone,

We are currently developing client software that uses the SetVideoQuality(int smoothBufferSize, int cpusToUtilize) method of the VideoOS.Platform.Client.ImageViewerWpfControl class, in the VideoOS.Platform.Client namespace of Milestone's SDK, to play live video.

The problem we currently face is that the streamed video suffers from severe jittering when played in our client program, regardless of how we set the smoothBufferSize.

Live video plays smoothly in the Smart Client when the default video buffer is set to Maximum (2 seconds), but setting smoothBufferSize in our SDK code seems to have no effect on the jittering. Since the Milestone server, the Smart Client, and our custom client software all run on the same network, network congestion seems unlikely to be the cause.
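For reference, a simplified sketch of how we set up the viewer and call SetVideoQuality. This is illustrative only: environment login and camera lookup are omitted, `cameraItem` is an assumed placeholder, and member names other than SetVideoQuality (Initialize, CameraFQID, Connect) are written from memory of the MIP SDK samples and may need adjusting.

```csharp
using VideoOS.Platform;
using VideoOS.Platform.Client;

// Simplified sketch; environment login and camera selection omitted.
// 'cameraItem' is assumed to be the Item for the camera to view.
var viewer = new ImageViewerWpfControl();
viewer.Initialize();
viewer.CameraFQID = cameraItem.FQID;

// We have tried a range of smoothBufferSize values here with no
// visible effect on the jitter, e.g. the suggested maximum:
viewer.SetVideoQuality(100, 1);   // smoothBufferSize = 100, cpusToUtilize = 1

viewer.Connect();                 // start live video
```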

Could Milestone assist us in troubleshooting this issue? Could you also elaborate on what the smoothBufferSize value is intended to affect, in case the problem stems from a misunderstanding on our part of what the SDK code is actually supposed to do?

Sincerely,

MPJ MiTAC

Please see this reference -

https://doc.developer.milestonesys.com/html/index.html?base=miphelp/class_video_o_s_1_1_platform_1_1_client_1_1_image_viewer_wpf_control.html&tree=tree_search.html?search=smoothbuffersize

Also, please try .SetVideoQuality(100, 1); this should be the optimal setting for the largest buffering and corresponds to the largest buffering setting in the Smart Client.

Hello,

Thanks for the prompt reply, but as mentioned in our first post, we have already tried tweaking the parameters of the SetVideoQuality() function, to no avail in alleviating the jittering and lagging problem. We pored over the SDK documentation thoroughly while writing our code and found some of its explanations insufficiently informative.

We have two follow-up questions which we hope Milestone could also assist in answering:

  1. In the SetVideoQuality() function, does setting the smoothBufferSize parameter to 100 equate to setting the default video buffer to Maximum (2 seconds) in the Smart Client? If not, are there methods available in the SDK that allow for setting a configuration corresponding to the Smart Client's default video buffer?
  2. The machine our client code runs on has a 10-core CPU. Should the cpusToUtilize parameter not be set accordingly? Would setting this parameter to 1 leave the other CPU cores unutilized and reduce performance?

Thanks in advance.

Sincerely,

MPJ MiTAC

  1.  Yes, and there are no other methods.
    
  2.  Setting it to 1 is best. In theory there are scenarios where sharing the load across as many CPUs as possible would increase performance, but I suspect the extra overhead of managing many CPUs and merging their data would be too costly in most scenarios, so my guess is that 1 is optimal for that reason. Note that if you have many Image Viewer controls, each can use one CPU; they will not try to use the same one.
    
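To illustrate that last point, a minimal sketch of two viewer controls each left at one decoding CPU. This is illustrative only; control initialization and camera selection are omitted, as in the question above.

```csharp
using VideoOS.Platform.Client;

// Each ImageViewerWpfControl decodes on its own core, so two controls
// with cpusToUtilize = 1 will not contend for the same CPU.
var viewerA = new ImageViewerWpfControl();
var viewerB = new ImageViewerWpfControl();

// smoothBufferSize = 100 (largest buffering), one decoding CPU each.
viewerA.SetVideoQuality(100, 1);
viewerB.SetVideoQuality(100, 1);
```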

Best advice: make sure you use a good graphics card and utilize hardware acceleration. This helps with viewing images smoothly, and you might not have to introduce a buffer and the latency the buffer causes.