We have written a driver that records audio and video from body-worn cameras, supporting both live streaming and playback. However, it is not obvious to me how to add the audio stream to the PlaybackManager.
I saw that the Microphone Stream Session is used for live audio, but for audio recorded while the device is out of Wi-Fi range, it's unclear how we will get it into the system.
Yes, you need to implement your PlaybackManager so that it can handle both video and audio, e.g. by using two utility classes, one for video and one for audio. We unfortunately don't have a sample demonstrating this, but all the navigation methods should be very similar to the video ones, and ReadData should return data in a similar manner to the GetLiveFrame of your microphone stream class.
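To illustrate the idea of one PlaybackManager delegating to per-media helpers, here is a rough structural sketch in Python. Only PlaybackManager, ReadData, and GetLiveFrame echo names from the discussion above; every other class, method, and field name is hypothetical and will differ from the real driver-framework API:

```python
# Hypothetical sketch: one PlaybackManager delegating to a video track
# and an audio track that share the same navigation/read logic.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Sample:
    timestamp: float   # e.g. seconds since epoch
    media: str         # "video" or "audio"
    payload: bytes


class MediaTrack:
    """Navigation and read logic reused by both video and audio."""

    def __init__(self, samples: List[Sample]):
        self._samples = sorted(samples, key=lambda s: s.timestamp)
        self._pos = 0

    def goto_time(self, t: float) -> None:
        # Seek to the first sample at or after time t.
        self._pos = next(
            (i for i, s in enumerate(self._samples) if s.timestamp >= t),
            len(self._samples),
        )

    def read_next(self) -> Optional[Sample]:
        # Returns recorded data in the same shape a live-stream
        # class (like GetLiveFrame) would produce, or None at the end.
        if self._pos >= len(self._samples):
            return None
        s = self._samples[self._pos]
        self._pos += 1
        return s


class PlaybackManager:
    """Owns one track per media type; read_data plays the role of
    ReadData, dispatching to the requested media's track."""

    def __init__(self, video: List[Sample], audio: List[Sample]):
        self._tracks = {
            "video": MediaTrack(video),
            "audio": MediaTrack(audio),
        }

    def goto_time(self, t: float) -> None:
        # Navigation methods seek both tracks so audio stays
        # aligned with video after a jump.
        for track in self._tracks.values():
            track.goto_time(t)

    def read_data(self, media: str) -> Optional[Sample]:
        return self._tracks[media].read_next()
```

The key design point is that the navigation calls fan out to both tracks, so a seek keeps audio and video aligned, while reads are per-media so the framework can pull each stream at its own pace:

```python
pm = PlaybackManager(
    video=[Sample(1.0, "video", b"v1"), Sample(2.0, "video", b"v2")],
    audio=[Sample(1.5, "audio", b"a1")],
)
pm.goto_time(1.2)
pm.read_data("video")  # the first video sample at or after t=1.2
pm.read_data("audio")  # the first audio sample at or after t=1.2
```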