Hi! We at Axis use the BitmapLiveSource event to get the images into our dewarping plugin. We have noticed some strange behavior when trying to dewarp multiple high-resolution streams.

You can reproduce this issue in the RGBVideoEnhancement plugin.

Remove the code in the RGBVideoEnhancement event so that it only retrieves the image and disposes of it. If we run one 4K stream at 25 fps on an i7 4770 (NVIDIA 1060), everything works OK and we get all the frames. But if we run two instances of the plugin, we don't get all the frames. Typically 10 frames in a row are dropped and never reach the BitmapLiveSource event, so instead of 25 fps we only get 15. The CPU load is only about 20%.

It doesn't seem to make a difference whether we set _bitmapLiveSource.SingleFrameQueue to true or false.

We have tested the RawLiveSource event, and with it we get all the images, but then we have to do the decoding ourselves, with everything that brings (licensing, etc.). That's something we want to avoid.

The question is: is there a way to prevent images from being dropped?

Or is there a better way to get the decoded images? Would we get better performance if we instead used the Media Toolkit in C++ or the DirectShow filter?

Best Regards

Mikael Ståle

Here is the code we run in the event:

void BitmapLiveSourceLiveContentEvent(object sender, EventArgs e)
{
    var args = e as LiveContentEventArgs;
    if (args != null && args.LiveContent != null)
    {
        var bitmapContent = args.LiveContent as LiveSourceBitmapContent;
        if (bitmapContent != null)
        {
            bitmapContent.Dispose();
        }
    }
}
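To quantify the drops independently of what the Smart Client shows, a small counter can run inside that handler. A minimal sketch; FrameRateMeter is a hypothetical helper (not part of the MIP SDK), and OnFrame() would be called once per received frame:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Counts frames delivered to the event handler so the achieved fps
// can be compared against the expected 25 fps per stream.
class FrameRateMeter
{
    private readonly Stopwatch _clock = Stopwatch.StartNew();
    private long _frames;

    // Call from the live-content event; thread-safe.
    public void OnFrame() => Interlocked.Increment(ref _frames);

    // Average frames per second since the meter was created.
    public double AverageFps()
    {
        long frames = Interlocked.Read(ref _frames);
        return frames / _clock.Elapsed.TotalSeconds;
    }
}
```

Logging AverageFps() once a second per plugin instance makes it easy to see exactly when the second or third instance starts losing frames.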

I will attempt to reproduce your observations and get back with feedback.

Thanks, Bo!

However, I forgot to add that you need to change _bitmapLiveSource.Width and _bitmapLiveSource.Height to 2880 or whatever 4K resolution you test with.

We just tried with a better CPU, an i7 8700, and then we can run two plugin instances without problems, but when we add a third we no longer get all the frames. The CPU load is under 50%.

The GPU can decode 75 4K images per second, probably a lot more. However, using the BitmapLiveSource with no scaling of the images, you are transferring 75 4K images in bitmap format per second from GPU to CPU (a memory copy). I believe this is more than the PC can deliver, even if the GPU and CPU do not show high usage.
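That estimate is easy to sanity-check with quick arithmetic. A sketch assuming 3840x2160 frames, 4 bytes per pixel (32-bit bitmap), and 75 frames per second in total (the exact pixel format the SDK delivers is an assumption here):

```csharp
using System;

class BandwidthEstimate
{
    // Sustained bytes per second moved from GPU to CPU for uncompressed frames.
    public static double GbPerSecond(long width, long height, long bytesPerPixel, long fps)
        => width * height * bytesPerPixel * fps / 1e9;

    static void Main()
    {
        // Assumption: 3 streams x 25 fps = 75 fps of 4K 32-bit bitmaps.
        Console.WriteLine($"{GbPerSecond(3840, 2160, 4, 75):F2} GB/s");  // ~2.49 GB/s
    }
}
```

Roughly 2.5 GB/s of sustained copying before any processing happens, which can congest the GPU-to-CPU transfer path even while the CPU and GPU utilization counters stay low.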

Thanks for the answer, Bo. Is there a way we can prevent the frames from being dropped? Even if it takes too long, we would like to get all frames; then we can drop them ourselves if needed.
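Since the drops happen even with a near-empty handler, the bottleneck may well be in the SDK's transfer path rather than in your code, but one general mitigation worth trying is to keep the event handler as cheap as possible: copy the bitmap data out, hand it to a worker thread, and return immediately. A sketch under those assumptions; FrameWorker is a hypothetical helper, not an SDK type, and the frame would have to be copied before enqueueing since the SDK bitmap must be disposed in the handler:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Hands frames off to a background worker so the event handler returns
// immediately; a bounded queue makes any overflow drops explicit and
// countable instead of silent.
class FrameWorker<T> : IDisposable
{
    private readonly BlockingCollection<T> _queue;
    private readonly Task _consumer;
    public long Dropped;

    public FrameWorker(Action<T> process, int capacity = 50)
    {
        _queue = new BlockingCollection<T>(capacity);
        _consumer = Task.Run(() =>
        {
            foreach (var frame in _queue.GetConsumingEnumerable())
                process(frame);  // dewarping or other heavy work runs here
        });
    }

    // Called from the event handler; never blocks the SDK's thread.
    public bool TryEnqueue(T frame)
    {
        if (_queue.TryAdd(frame)) return true;
        Dropped++;  // queue full: we drop, but on our own terms
        return false;
    }

    public void Dispose()
    {
        _queue.CompleteAdding();
        _consumer.Wait();
        _queue.Dispose();
    }
}
```

This won't add bandwidth the machine doesn't have, but it rules out handler latency as a cause and moves the drop decision into your own code, which is what you asked for.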

What do you think about the Media Toolkit in C++ or the DirectShow filter? Would that give us better performance?