You can reproduce this issue in the RGBVideoEnhancement plugin.
Remove the code in the RGBVideoEnhancementEvent so that it only retrieves the image and disposes of it. If we run one 4K stream at 25 fps on an i7-4770 (Nvidia GTX 1060), everything works fine and we get all the frames. But if we run two instances of the plugin, we don't get all the frames: typically 10 frames are dropped in a row and never reach the BitmapLiveSourceEvent. Instead of 25 fps we only get 15, even though the CPU load is only about 20%.
It doesn't seem to make a difference whether we set _bitmapLiveSource.SingleFrameQueue to true or false.
We have also tested the RawLiveSource event, and there we do get all the images, but then we have to do the decoding ourselves, with everything that entails (licensing etc.). That's something we want to avoid.
The question is: is there a way to prevent images from being dropped, or is there a better way to get the decoded images? Would we get better performance if we used the Media Toolkit in C++ or the DirectShow filter instead?
Best Regards
Mikael Ståle
Here is the code we run in the event:
private void BitmapLiveSourceLiveContentEvent(object sender, EventArgs e)
{
    var args = e as LiveContentEventArgs;
    if (args != null && args.LiveContent != null)
    {
        var bitmapContent = args.LiveContent as LiveSourceBitmapContent;
        if (bitmapContent != null)
        {
            // Dispose immediately so the SDK can reuse the buffer.
            bitmapContent.Dispose();
        }
    }
}
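For reference, here is the kind of pattern we are considering to keep the event handler as cheap as possible: copy the frame out inside the event and hand it to a bounded background queue, so the callback returns immediately. This is only a sketch, not verified against the MIP SDK; CopyFrame is a hypothetical placeholder, since we are not sure which LiveSourceBitmapContent member exposes the pixel data to copy.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class FrameQueueSketch
{
    // Bounded queue: if the consumer falls behind, new frames are
    // dropped at TryAdd instead of stalling the SDK callback.
    private readonly BlockingCollection<byte[]> _frames =
        new BlockingCollection<byte[]>(boundedCapacity: 8);

    public FrameQueueSketch()
    {
        // Single consumer doing the heavy work off the callback thread.
        Task.Run(() =>
        {
            foreach (var frame in _frames.GetConsumingEnumerable())
            {
                // Heavy per-frame processing goes here.
            }
        });
    }

    private void BitmapLiveSourceLiveContentEvent(object sender, EventArgs e)
    {
        var args = e as LiveContentEventArgs;
        var bitmapContent = args?.LiveContent as LiveSourceBitmapContent;
        if (bitmapContent != null)
        {
            try
            {
                // Drop the frame if the queue is full rather than block.
                _frames.TryAdd(CopyFrame(bitmapContent));
            }
            finally
            {
                // Release the SDK-owned buffer promptly in all cases.
                bitmapContent.Dispose();
            }
        }
    }

    private static byte[] CopyFrame(LiveSourceBitmapContent content)
    {
        // Placeholder: copy the bitmap bytes out of the SDK-owned buffer.
        return Array.Empty<byte>();
    }
}
```

Would a pattern like this help, or are the frames already dropped before the event fires?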