Mobile SDK Audio Issues with Samples/Code

We are testing the Windows UWP sample from the Mobile SDK. We are able to get audio; however, we have two major issues.

#1. There is a considerable buffering delay before we get audio.

Playing the URL in Windows Media Player or VLC has the same issue.

#2. We are also getting a nasty echo and distortion.

We can confirm that in the web app and XProtect client the audio sounds fine.

Can you advise how we can get a more real-time audio feed and how we can address the above issues?

Can you provide an updated sample, or more importantly an iOS or Android sample, that gets live audio without the distortion?

Here is an example of our request parameters:

var audioParams = new AudioParams()
{
    ItemId = microphoneId,
    CompressionLvl = 99,
    StreamType = StreamParamsHelper.StreamType.Native,
    AudioEncoding = StreamParamsHelper.AudioEncoding.Mp3,
    SignalType = signalType,
    MethodType = StreamParamsHelper.MethodType.Push,
    StreamDataType = StreamParamsHelper.StreamDataType.Audio,
    StreamHeaders = StreamParamsHelper.StreamHeaders.AllPresent,
};

Hi Josh,

You are right, the audio is jerky.

To remove the distortion and echo, please try changing the parameters:

var audioParams = new AudioParams()
{
    ItemId = microphoneId,
    CompressionLvl = 99,
    StreamType = StreamParamsHelper.StreamType.Transcoded,      // changed from Native
    AudioEncoding = StreamParamsHelper.AudioEncoding.Mp3,
    SignalType = signalType,
    MethodType = StreamParamsHelper.MethodType.Push,
    StreamDataType = StreamParamsHelper.StreamDataType.Audio,
    StreamHeaders = StreamParamsHelper.StreamHeaders.NoHeaders, // changed from AllPresent
};

With those it should work well in VLC and WMP.

As for the delay, it depends on the particular player.

One thing to try is a different quality value.

It is tricky to change, because the meaning of the number is complex: valid values range from 10 to 99; everything under 10 is clamped to 10, and everything above 99 is clamped to 99.

The format of the parameter is a two-digit value AB.

A is the digit that represents the sampling rate:

1B - 8 kHz
2B - 11025 Hz
3B - 12 kHz
4B - 16 kHz
5B - 22050 Hz
6B - 24 kHz
7B - 32 kHz
8B - 44.1 kHz
9B - 48 kHz

B is the digit that represents the number of channels as well as the average bits per sample:

A0 - A4 - mono (one channel)
A5 - A9 - stereo (two channels)
B % 5 - average bits per sample quality: 0 is minimum, 4 is maximum
B < 2 - 8 bits per sample
B >= 2 - 16 bits per sample

Different players react differently to the MP3 stream parameters (bandwidth, channels, bit depth). You could experiment with them to achieve minimal delay.
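As a quick sanity check, the digit scheme above can be decoded in a few lines. This is a sketch of our reading of the description, not SDK code; in particular, it follows the bits-per-sample rule literally (B < 2), though B % 5 may have been intended:

```javascript
// Sketch: decode a CompressionLvl value "AB" into the MP3 stream
// properties described above. Not from the SDK - an illustration only.
function decodeCompressionLvl(level) {
  // values outside 10-99 are clamped, as described above
  const clamped = Math.min(99, Math.max(10, level));
  const a = Math.floor(clamped / 10); // first digit: sampling rate
  const b = clamped % 10;             // second digit: channels + quality

  const sampleRates = {
    1: 8000, 2: 11025, 3: 12000, 4: 16000, 5: 22050,
    6: 24000, 7: 32000, 8: 44100, 9: 48000
  };

  return {
    sampleRate: sampleRates[a],
    channels: b < 5 ? 1 : 2,          // A0-A4 mono, A5-A9 stereo
    quality: b % 5,                   // 0 = minimum, 4 = maximum
    // literal reading of "B < 2 - 8 bits per sample"; ambiguous in
    // the original description (possibly B % 5 was meant)
    bitsPerSample: b < 2 ? 8 : 16
  };
}
```

For example, under this reading, 99 decodes to 48 kHz stereo at maximum quality, while 85 (used later in this thread) decodes to 44.1 kHz stereo.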

Meanwhile, we will try to fix the sample for the upcoming release.

There are other things in it that could be made better and clearer.

Petar,

Thanks for your response; we will test this.

Can you shed more light on how the web interface handles audio? We don't see any delay there and would like to replicate that. I believe it also uses the Mobile SDK interface?

Hi Josh,

To play audio in our web client we use Fetch API (https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) and MediaSource (https://developer.mozilla.org/en-US/docs/Web/API/MediaSource) for browsers that support them.

We do something similar to:

var mediaSource = new MediaSource();
var sourceBuffer;

// attach the MediaSource to an <audio> element, e.g.:
// audioElement.src = URL.createObjectURL(mediaSource);

var sourceOpen = function (event) {
    sourceBuffer = mediaSource.addSourceBuffer("audio/mpeg");
    sourceBuffer.mode = "sequence";

    fetch(url)
        // return a reader for the 'ReadableStream' of 'response'
        .then((response) => { return response.body.getReader(); })
        .then(playStream)
        // handle error
        .catch(function () {});
};

// register the handler after it is defined
mediaSource.addEventListener("sourceopen", sourceOpen);

var playStream = function (reader) {
    var processStream = function (data) {
        if (data.done) {
            return;
        }

        if (mediaSource.readyState == "open") {
            sourceBuffer.appendBuffer(data.value);
        }
    };

    // append the next chunk each time the previous append finishes
    sourceBuffer.addEventListener("updateend", function () {
        reader.read().then(processStream).catch(signalEndOfStream);
    });

    // start processing the stream
    reader.read().then(processStream);

    if (reader.closed) {
        // read of stream is complete
        return reader.closed.then(signalEndOfStream);
    }
};

var signalEndOfStream = function () {
    // signal end of stream to 'mediaSource'
    if (mediaSource.readyState == "open") {
        mediaSource.endOfStream();
    }

    return mediaSource.readyState;
};

where 'url' is constructed from the response of the RequestAudioStream command.
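The read loop above is paced by 'updateend' events from the SourceBuffer. The same pull-based pattern can be sketched standalone, with the SourceBuffer replaced by a plain callback so it runs outside a browser; this illustrates the pattern only and is not client code:

```javascript
// Pull-based stream pump: read a chunk, hand it to a sink, repeat.
// In the web client the sink would be sourceBuffer.appendBuffer(),
// paced by "updateend" events; here it is a plain callback.
async function pumpStream(reader, appendChunk) {
  for (;;) {
    const { done, value } = await reader.read();
    if (done) {
      // stream ended; the client would call mediaSource.endOfStream() here
      return;
    }
    appendChunk(value);
  }
}

// Demo with an in-memory ReadableStream standing in for a fetch() body.
const received = [];
const demoStream = new ReadableStream({
  start(controller) {
    controller.enqueue(Uint8Array.from([1, 2]));
    controller.enqueue(Uint8Array.from([3]));
    controller.close();
  }
});

pumpStream(demoStream.getReader(), (chunk) => received.push(...chunk));
```

The important design point is that each read is issued only after the previous chunk has been consumed, which keeps the buffer small and the stream close to live.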

Petar - the settings you provided do seem to help with the distortion.

Did you see my request about Android or iOS samples with audio for us to review? Do you have anything you can provide for us to review for mobile iOS/Android?

Hi Josh,

Do you need samples for Xamarin (the .NET-based SDK), or do you need native API samples for the audio?

Petar,

Xamarin iOS and Android, please!

Hi Josh,

I asked around, but unfortunately no one has even tried to play audio through Xamarin.

I've asked our native devs if there are any tricky points when audio is played with the native iOS/Android players.

The answer was: in general no, just set the URL on the player. But when we got to the part about the delay, the answers were slightly different.

They confirmed there are some specifics to the players on both platforms. In general they tweaked some settings (on the players) concerning the players' buffering strategies, and it is a matter of fine-tuning to make them work stably and with minimal delay.

If those settings are not made, the UX is what you have seen - a big delay - as the players are not intended to play live streams.

So, if you are able to access the native players through Xamarin and change some of their parameters, you should be able to decrease the delay.

Petar, would it be possible to get more information on the "tweaks" your team made to the iOS player specifically? We are not having any luck fixing the delay. Can you provide some details on the specific components you are using on iOS, as well as the settings/tweaks for buffering and delay?

Also, as you can see, other Apple devs have similar issues:

https://forums.developer.apple.com/message/269161#269161

Here is part of the code that is used in the iOS application (Milestone Mobile app):

Player initialization:

audioPlayer = AVPlayer(playerItem: nil)
audioPlayer.automaticallyWaitsToMinimizeStalling = false

Playback start:

func playAudio(forURL audioURL: String) {
    let playerItem: AVPlayerItem = AVPlayerItem(url: URL(string: audioURL)!)
    playerItem.preferredForwardBufferDuration = 1.0
    audioPlayer.replaceCurrentItem(with: playerItem)
    self.perform(#selector(playAudioAfterDelay), with: nil, afterDelay: 0.8)
}

@objc private func playAudioAfterDelay() {
    audioPlayer.playImmediately(atRate: 1.0)
}

In the upcoming 2019 R1 release there will be samples for Android and iOS (both native) exactly for audio 🙂

Petar,

We have downloaded the new SDK, 2019 R1, and the audio doesn't work at all with it (we are testing against 2018 R3). We have looked through the samples and don't find any new samples for audio support. (There is a PushToTalk sample, but it sends audio to the server and doesn't play audio.) The old samples don't contain any new settings or changes at first glance.

The error we get is WrongInputParameter when calling the Connection.Audio.RequestAudioStream method. The AudioParams we used are the same as before. The audio works with the previous SDK (both on Android and iOS).

var audioParams = new AudioParams()
{
    ItemId = microphoneId,
    CompressionLvl = 85,
    StreamType = StreamParamsHelper.StreamType.Transcoded,
    AudioEncoding = StreamParamsHelper.AudioEncoding.Mp3,
    SignalType = StreamParamsHelper.SignalType.Live,
    MethodType = StreamParamsHelper.MethodType.Push,
    StreamDataType = StreamParamsHelper.StreamDataType.Audio,
    StreamHeaders = StreamParamsHelper.StreamHeaders.NoHeaders,
};

microphoneId is a valid GUID for audio streaming. Video works fine.
 
The error stack trace we get for the Connection.Audio.RequestAudioStream method is VideoOS.Mobile.Portable.VideoChannel.Params.AudioParams.ConvertToCommand (VideoOS.Mobile.Portable.MetaChannel.Command& tCommand) [0x0002b] in <e14cabce96534eb0b33e2c8990c829d6>:0

Hi Josh,

I can confirm that audio in the latest MIP SDK Mobile doesn't work against 2018 R3 and 2019 R1 mobile servers.

(I reproduced the same error - wrong input parameter. Actually, the audio encoding is not taken into account for some reason.)

The interesting part is that the code in the Main branch works fine.

I will continue the investigation and come back to you.

Hi Josh,

I think I've found the problem.

It has been fixed in the Main branch, but it seems the fix did not make it into the release branch.

It can be worked around with a parameter change:

var audioParams = new AudioParams()
           {
               // ItemId = microphoneId,
               ItemsIds = new List<Guid>() { microphoneId},
               CompressionLvl = 85,
               StreamType = StreamParamsHelper.StreamType.Transcoded,
               AudioEncoding = StreamParamsHelper.AudioEncoding.Mp3,
               SignalType = StreamParamsHelper.SignalType.Live,
               MethodType = StreamParamsHelper.MethodType.Push,
               StreamDataType = StreamParamsHelper.StreamDataType.Audio,
               StreamHeaders = StreamParamsHelper.StreamHeaders.NoHeaders,
           };

I've tested it against 12.3 and 13.1 servers, and it worked on my side.

Meanwhile, I will try to research how to make a hotfix for the just-released MIP SDK.

A patched version of the SDK can be downloaded from here:

http://download.milestonesys.com/Mobiles/XPMobileSdk_2019R1_SP1.zip

With it, the old working code does not need to be changed.

Petar,

in testing it is working but there is still a significant start delay and a delay from live of about 5-10 seconds. Can you please check into this?

also, there doesn’t appear to be an android sample with audio?

Hi Josh,

I cannot comment on the audio delay, especially when Xamarin is used.

We are testing with UWP internally.

Josh, it seems you are right.

For some reason, neither the Android nor the iOS audio samples are included in the SDK release.

We will investigate.

Thanks for the catch.

Petar,

Is there any update on getting the audio samples? Also, can you confirm whether those samples experience the 5+ second delay?

Hello,

We will add audio samples in the next SDK release; until then you can get some of them (for iOS and Android) here:

Hope that helps.

We have not experienced the 5-second delay with the samples.