Can you provide an example of how RawLiveSource can get video data?

I used RawLiveSource to fetch the stream in my plug-in. The first time, fetching the video data for device 1 worked correctly.

But when I stop the stream and then open a new stream for device 2, the data I receive is not valid frame data.

private void OpenLiveSession()
{
    _selectedItem = Configuration.Instance.GetItem(_viewItemManager.Config.DeviceId, Kind.Camera);
    if (_selectedItem == null)
        return;

    _rawLiveSource = new RawLiveSource(_selectedItem);
    _rawLiveSource.Init();
    _rawLiveSource.LiveContentEvent += RawLiveSourceLiveContentEvent;
    _rawLiveSource.EnableMulticast = true;
    _rawLiveSource.LiveModeStart = true;
}
 
void RawLiveSourceLiveContentEvent(object sender, EventArgs e)
{
    try
    {
        if (!Dispatcher.CheckAccess())
        {
            // Make sure we execute on the UI thread before updating UI Controls
            Dispatcher.Invoke(new EventHandler(RawLiveSourceLiveContentEvent), new[] { sender, e });
        }
        else
        {
            var args = e as LiveContentRawEventArgs;
            if (args != null)
            {
                if (args.LiveContent != null)
                {
                    var rawContent = args.LiveContent as LiveSourceRawContent;
                    if (rawContent != null)
                    {
                        if (_stopLive || imageBox.Width == 0 || imageBox.Height == 0)
                        {
                            rawContent.Dispose();
                        }
                        else
                        {
                            int startIndex = 32;
                            int length = rawContent.Content.Length - startIndex;
                            byte[] dataSubset = new byte[length];
                            Array.Copy(rawContent.Content, startIndex, dataSubset, 0, length);
                            handle(dataSubset);
                            rawContent.Dispose();
                        }
                    }
                }
                else if (args.Exception != null)
                {
                    // A stream error occurred; surface it instead of silently ignoring it
                    EnvironmentManager.Instance.ExceptionDialog("RawLiveSourceLiveContentEvent", args.Exception);
                }
            }
        }
    }
    catch (Exception ex)
    {
        EnvironmentManager.Instance.ExceptionDialog("RawLiveSourceLiveContentEvent", ex);
    }
}
 
private void CloseLiveSession()
{
    if (_rawLiveSource != null)
    {
        _rawLiveSource.LiveContentEvent -= RawLiveSourceLiveContentEvent;
        _rawLiveSource.LiveModeStart = false;
        _rawLiveSource.Close();
        _rawLiveSource = null;
    }
}

private void ModeHandler(Mode newMode)
{
    if (_rawLiveSource != null)
    {
        CloseLiveSession();
    }
    switch (newMode)
    {
        case Mode.ClientLive:
            OpenLiveSession();
            break;

        case Mode.ClientSetup:
            break;
    }
}

As shown above: the first time I open the LiveSession, I get correct data in RawLiveSourceLiveContentEvent. Then I change the device, which triggers ModeHandler, and the data I now receive from RawLiveSourceLiveContentEvent is incorrect: the header is correct, but the body is not, and it cannot be played back.

Is this a plugin running in the Smart Client?

Are you testing with the same camera both when it worked and when it now doesn't?

Could you find out a little more and describe how the data is incorrect? Specifically, are you missing a key frame, or are you getting something that doesn't look like video data at all?

Finally, we have a hunch that enabling multicast might make a difference. As a test, please try running without multicast. Does it change anything?
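For reference, that test is a one-flag change in OpenLiveSession. A sketch based on the code above (the surrounding fields and handler are assumed unchanged; this fragment depends on the MIP SDK and is not standalone):

```csharp
// Same session setup as in OpenLiveSession, but with multicast disabled for the test.
_rawLiveSource = new RawLiveSource(_selectedItem);
_rawLiveSource.Init();
_rawLiveSource.LiveContentEvent += RawLiveSourceLiveContentEvent;
_rawLiveSource.EnableMulticast = false;   // was true; compare behavior without multicast
_rawLiveSource.LiveModeStart = true;
```

If the device-2 stream decodes correctly with multicast off, that narrows the problem to the multicast path.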