We need to get access to the audio coming out of Milestone and have tried the AudioPlayerControl in the audio demo. However, there is no way to access the buffer or the audio data; it seems the control only plays the audio. Is there any way to access it? There appears to be a non-.NET property for setting the default renderer, but we can't access it from .NET. Any suggestions?
Using the AudioPlayerControl you do not get access to the data; it is played directly. To my knowledge there is no workaround for this limitation of the control.
Using RawLiveSource you can get the data.
As a quick test I put the following code in the AudioDemo sample:
// _audioPlayerControl.Connect();   // original call in the sample, commented out
_rawLive = new RawLiveSource(_selectedMic);
_rawLive.LiveContentEvent += _rawLive_LiveContentEvent;
_rawLive.Init();
_rawLive.LiveModeStart = true;
private void _rawLive_LiveContentEvent(object sender, EventArgs e)
{
    LiveContentRawEventArgs le = e as LiveContentRawEventArgs;
    if (le == null || le.LiveContent == null)
        return;
    byte[] by = le.LiveContent.Content;
    Debug.WriteLine("length " + by.Length);
}
As you can see, I did not know how to interpret the audio data, but the payload sizes suggested it is the correct data.
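One quick way to check what the payload actually contains is to append it to a file and inspect it offline (for example with a hex editor, or an audio tool's raw import). This is a sketch only: the file name is arbitrary, and the assumption that `LiveContent.Content` holds one contiguous chunk per event is mine, not from the Milestone documentation.

```csharp
using System;
using System.IO;
using VideoOS.Platform.Live;

class RawAudioDump
{
    private static readonly object _lock = new object();

    public static void OnLiveContent(object sender, EventArgs e)
    {
        LiveContentRawEventArgs le = e as LiveContentRawEventArgs;
        if (le == null || le.LiveContent == null)
            return;                          // not a raw-content event

        byte[] payload = le.LiveContent.Content;
        lock (_lock)                         // events may arrive on a worker thread
        {
            // "dump.raw" is an arbitrary name chosen for this example.
            using (var fs = new FileStream("dump.raw", FileMode.Append))
            {
                fs.Write(payload, 0, payload.Length);
            }
        }
    }
}
```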
For the modified sample to work you also need to:
- Add VideoOS.Platform.SDK.Media.dll as a reference
- Copy the dependent DLLs using CopyMedia.bat
- Put the following initialization calls in the application:
  - VideoOS.Platform.SDK.Environment.Initialize();
  - VideoOS.Platform.SDK.UI.Environment.Initialize();
  - VideoOS.Platform.SDK.Media.Environment.Initialize();
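In context, the three calls above go at application startup, before any SDK objects are used. A minimal sketch, assuming a standalone app; only the three Initialize calls are from this thread, and whatever server/login code the sample already has must still run afterwards:

```csharp
using System;

static class Program
{
    [STAThread]
    static void Main()
    {
        // Order as listed above: core SDK first, then UI, then Media.
        VideoOS.Platform.SDK.Environment.Initialize();
        VideoOS.Platform.SDK.UI.Environment.Initialize();
        VideoOS.Platform.SDK.Media.Environment.Initialize();

        // ...add server / log in / run the AudioDemo form as the sample does...
    }
}
```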
Bo, thanks for the quick reply. We will go ahead and try this. Based on the documentation, I assume the payload will be the audio as sent directly from the camera; is that correct? Is there any timestamp information coming with it as well?
Also, can we do the same for live video? Do you have an example of that? For video, does it pull the primary stream, or can we access the secondary stream with this method? Again, I assume the payload will be whatever comes from the camera?
The Milestone design is that the camera communicates only with the recording server; all clients then get the video and audio data from the recording server, which protects and shields the cameras. The data is unaltered when you get it in the RawLiveSource class. This works for video as well as for audio.
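For video, the same RawLiveSource pattern from the audio snippet above applies, just with a camera Item instead of a microphone Item. A sketch under that assumption; `selectedCamera` is a hypothetical Item picked from the configuration, and the `Close()` call follows the usual MIP source lifecycle (verify against your SDK version):

```csharp
using System;
using System.Diagnostics;
using VideoOS.Platform;
using VideoOS.Platform.Live;

class RawVideoTap
{
    private RawLiveSource _rawVideo;

    public void Start(Item selectedCamera)   // a camera Item, not a microphone
    {
        _rawVideo = new RawLiveSource(selectedCamera);
        _rawVideo.LiveContentEvent += OnLiveContent;
        _rawVideo.Init();
        _rawVideo.LiveModeStart = true;
    }

    public void Stop()
    {
        if (_rawVideo == null) return;
        _rawVideo.LiveModeStart = false;
        _rawVideo.LiveContentEvent -= OnLiveContent;
        _rawVideo.Close();
        _rawVideo = null;
    }

    private void OnLiveContent(object sender, EventArgs e)
    {
        var le = e as LiveContentRawEventArgs;
        if (le == null || le.LiveContent == null) return;
        Debug.WriteLine("video payload length " + le.LiveContent.Content.Length);
    }
}
```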
There is some technical information in the MIP Documentation you might find interesting; try to look up: "The GenericByteData format".
My answer was not complete. On picking a stream, see this:
MIP Environment: VideoOS.Platform.Live.RawLiveSource Class Reference
Guid VideoOS.Platform.Live.RawLiveSource.StreamId [get, set]
Can be used to identify which stream to use, when more than one is available. Multiple streams are available in XProtect Corporate. When set to Guid.Empty, the default stream will be used.
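Putting that reference into practice, one could set StreamId before starting live mode. A sketch only: `secondaryStreamId` is a hypothetical Guid that would come from the camera's stream configuration in an XProtect Corporate system, and setting StreamId before `Init()` is my assumption since the reference does not state ordering requirements.

```csharp
using System;
using VideoOS.Platform;
using VideoOS.Platform.Live;

class StreamPicker
{
    public RawLiveSource Open(Item camera, Guid secondaryStreamId,
                              EventHandler onContent)
    {
        var raw = new RawLiveSource(camera);
        raw.StreamId = secondaryStreamId;   // Guid.Empty selects the default stream
        raw.LiveContentEvent += onContent;
        raw.Init();                         // StreamId set before Init() in this sketch
        raw.LiveModeStart = true;
        return raw;
    }
}
```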