Stream doesn't seem to decode on non-accelerated client machines.

Hi!

We have implemented a driver with the Driver Framework for a body-worn camera. The livestream pulls through correctly on the Smart Client only when hardware acceleration is enabled.

We have tested on a few machines that do not support hardware acceleration, and the feed does not show on any of them. On a machine that does support hardware acceleration, the feed stops working the moment we turn acceleration off, and the recorded stream stops working as well.

What could cause this?

Kind regards,

Peter Peiser

Please make a small export in Milestone format available to us, this will allow us to analyze the video data.

Good day!

Here’s the small export as requested :slight_smile:

[PJH Technologies 4-13-2025 7-10-04 AM.zip](PJH Technologies 4-13-2025 7-10-04 AM.zip “PJH Technologies 4-13-2025 7-10-04 AM.zip”)

Password is “password” without quotes

Kind regards,

Peter Peiser

Updated the link to include full path

The developer's analysis, in his words…

--

I have extracted the raw h264 data from the export the customer provided, so an MKV export is no longer needed.

I debugged the decoder, and now this is going to be a bit technical:

Any h264 video begins with a data structure (the sequence parameter set) that contains a byte (profile_idc) indicating the decoding profile the video adheres to, and for the video in question that value is 249.

Now, 249 is not a value that we recognize as a valid profile ID, so the decoder returns UMC_ERR_UNSUPPORTED. I looked in the specs for h264 to see if it is a new value that has been introduced since our decoder was implemented, and as far as I can tell it is not.
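To make the check concrete: profile_idc is the first byte of the SPS payload, i.e. the byte right after the NAL unit header of the NAL with type 7. A minimal sketch of how it can be pulled out of an Annex B stream (Python, for illustration only — not our decoder code; the sample byte strings are synthetic):

```python
def profile_idc_from_annexb(data: bytes):
    """Return profile_idc from the first SPS NAL unit in an Annex B stream, or None."""
    i, n = 0, len(data)
    while i < n - 3:
        # find a 00 00 01 start code (a 4-byte start code is just a leading 00 extra)
        if data[i:i + 3] == b"\x00\x00\x01":
            nal_header = data[i + 3]
            nal_unit_type = nal_header & 0x1F   # low 5 bits of the NAL header
            if nal_unit_type == 7:              # 7 = sequence parameter set
                return data[i + 4]              # profile_idc is the byte after the NAL header
            i += 3
        else:
            i += 1
    return None

# synthetic SPS starts: 0x67 = SPS NAL header, then profile_idc
good = b"\x00\x00\x00\x01\x67\x64\x00\x1f"   # 0x64 = 100 (High profile)
bad  = b"\x00\x00\x00\x01\x67\xf9\x00\x1f"   # 0xf9 = 249, the value in question
```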

That leads to the question of why the Intel and Nvidia hardware decoders will decode the video. There we don't have the luxury of debugging into the source code, but one could guess that they are simply less strict about validating the profile_idc parameter and make an assumption about what 249 means that happens to be correct.

We looked up the Yulong BWC-R3H camera, and it seems to come from a Chinese company that stopped posting news on their official web page after 2020; the page itself looks like a template that was never fully customized and finalized. I suspect it might be difficult to get in touch with them and ask what is going on.

Thanks so much for the feedback! They’re thankfully pretty responsive so I’ll do my best to work with them to get this sorted. Appreciate the help!

Is the format that Milestone expects in the H264 data field documented anywhere, ideally with an example? I'm de-encapsulating the stream and I suspect I'm doing it incorrectly.

Again in the words of the developer…

--

I’m only referring to the official specification. It’s an extremely dry document that defines the format with almost mathematical precision, and it doesn’t contain any examples (https://www.itu.int/rec/T-REC-H.264-202408-I/en). Details of video decoding standards are not exactly a topic that is well covered by examples and tutorials online. However, paragraph 7.4.2.1.1 on page 76 gives some insight into how the profile_idc parameter is constructed. Notice that the last two bits are reserved zero bits, and the value 249 that we got from the attached file is an odd number, so one of the mandatory zero bits is actually set!

It is interesting that that paragraph describes individual bits, while the rest of the document only mentions specific values the field can have; for example, on page 46 it says:

    if( profile_idc == 100 || profile_idc == 110 ||
        profile_idc == 122 || profile_idc == 244 || profile_idc == 44 ||
        profile_idc == 83  || profile_idc == 86  || profile_idc == 118 ||
        profile_idc == 128 || profile_idc == 138 || profile_idc == 139 ||
        profile_idc == 134 || profile_idc == 135 ) {

This suggests that only specific combinations of bits are allowed. The list of values that our decoder recognizes is this:

    enum
    {
        H264_PROFILE_CAVLC444           = 44,
        H264_PROFILE_BASELINE           = 66,
        H264_PROFILE_MAIN               = 77,
        H264_PROFILE_SCALABLE_BASELINE  = 83, // Annex G
        H264_PROFILE_SCALABLE_HIGH      = 86, // Annex G
        H264_PROFILE_EXTENDED           = 88,
        H264_PROFILE_HIGH               = 100,
        H264_PROFILE_HIGH10             = 110,
        H264_PROFILE_MULTIVIEW_HIGH     = 118, // Annex H
        H264_PROFILE_HIGH422            = 122,
        H264_PROFILE_STEREO_HIGH        = 128, // Annex H
        H264_PROFILE_HIGH444            = 244
    };

As you can see, there ARE values from the spec that are not in our list of supported profiles, but 249 is not one of them.
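The same membership check can be sketched outside the decoder (Python here rather than the decoder's C++; the set simply mirrors the enum above):

```python
# Mirrors the decoder's enum of recognized profile_idc values
SUPPORTED_PROFILES = {44, 66, 77, 83, 86, 88, 100, 110, 118, 122, 128, 244}

def is_supported_profile(profile_idc: int) -> bool:
    """True if the decoder recognizes this profile_idc; otherwise it
    bails out with UMC_ERR_UNSUPPORTED."""
    return profile_idc in SUPPORTED_PROFILES
```

With this, 100 (High) passes, while 249 does not.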

The developer told me that the details about the constraint_set flags in his reply are not correct, but he maintains that profile_idc value 249 is not valid.

Thanks so much for the feedback, I appreciate it!

Sorry for my ambiguity in the question.

I mean: do I just put the H264 NAL units into the byte[] arrays for Milestone, starting with 00 00 00 01 or 00 00 01? If that’s not the case, I need to revisit my code.

If it’s not, a single byte-array example of a correct binary keyframe as XProtect expects it would really help! I have parseable H264 on my end, but I am unsure what headers and other encapsulation should surround it when passing it to XProtect as H264 directly with the VideoCodecType.H264 flag.
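To make the question concrete, this is roughly the splitting logic I mean (a Python sketch of what my driver does, not the driver code itself; it accepts both 3- and 4-byte start codes):

```python
def split_nal_units(stream: bytes):
    """Split an Annex B byte stream into raw NAL units, start codes stripped."""
    units, start = [], None
    i, n = 0, len(stream)
    while i < n - 2:
        if stream[i] == 0 and stream[i + 1] == 0 and stream[i + 2] == 1:
            # a 4-byte start code is a 3-byte one with an extra leading zero;
            # trim that zero off the end of the previous unit if present
            if start is not None:
                end = i - 1 if i > 0 and stream[i - 1] == 0 else i
                units.append(stream[start:end])
            start = i + 3
            i = start
        else:
            i += 1
    if start is not None:
        units.append(stream[start:])
    return units

# synthetic example: an SPS followed by a PPS
stream = b"\x00\x00\x00\x01\x67\x64\x00\x1f\x00\x00\x01\x68\xee"
```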

Kind regards,

Peter Peiser

A guess from our side is that you are not putting the right Generic Byte Data header on this.

Data…

Frame 0: 00 0A 05 50 00 01 00 00 01

Frame 1: 00 0A 05 51 00 00 00 00 01

Frame 2: 00 0A 05 52 00 00 00 00 01

Frame 3: 00 0A 05 53 00 00 00 00 01
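Read as big-endian 16-bit words, each frame above starts with a constant 00 0A, a two-byte counter that increments per frame, and a flags word (00 01 only on frame 0), followed by the Annex B start code. A quick sketch of that decoding — the field names here are only guesses from the dump, not taken from documentation:

```python
import struct

def decode_frame_prefix(frame: bytes):
    """Split the 6 bytes before the start code into three big-endian 16-bit
    words. Names are guesses: codec type, sequence number, flags."""
    codec_type, sequence, flags = struct.unpack(">HHH", frame[:6])
    return codec_type, sequence, flags

frame0 = bytes.fromhex("000a05500001000001")  # flags 0x0001: keyframe?
frame1 = bytes.fromhex("000a05510000000001")
```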

Apologies for all the edits in the previous post.

This is an example of what I get for the first keyframe sent to XProtect. It seems correct on my end, making me think maybe I need another encapsulation rather than pure H264 NAL units? I have compared this to an ffmpeg remux to raw H264 as well.

The below is sent as a single entry in the byte array. I have tried many things, but prepending the SPS and PPS to keyframes seems to decode better with acceleration, which may hint at what I’m doing wrong. I am working in the dark, as I have no documentation on what XProtect expects with the H264 codec, which is why I’m asking.

As above, it seems I’m passing profile_idc 100 to it, which should be supported, so I’m assuming my headers are messed up.

An example hex dump of what I’m passing into XProtect’s byte array frame response for a keyframe’s start
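The keyframe assembly itself looks like this (a Python sketch of the logic with placeholder NAL payloads, not my actual driver code):

```python
START_CODE = b"\x00\x00\x00\x01"

def build_keyframe(sps: bytes, pps: bytes, idr: bytes) -> bytes:
    """Prepend SPS and PPS to the IDR slice, each with its own start code."""
    return START_CODE + sps + START_CODE + pps + START_CODE + idr

# placeholder NAL payloads, not real parameter sets
frame = build_keyframe(b"\x67\x64\x00\x1f", b"\x68\xee\x3c\x80", b"\x65\x88")
```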

I have accidentally found documentation for this.

I assume this is what I need before the data, as documented in the below link?

Given the example above, it seems I only need codec type / sequence number / flags?

https://doc.developer.milestonesys.com/html/reference/protocols/video_sub_formats.html

Perhaps I should have pointed to this - https://doc.developer.milestonesys.com/html/index.html?base=reference/protocols/genericbytedata.html&tree=tree_3.html

There should be a GenericByteHeader (32 bytes) first.

Prepending the GenericByteHeader did not work, and I note that the PlaybackReadResponse and GetLiveFrameResult methods in the driver development framework already have entries for the header structures.

I am unsure how to proceed here. If I could get a single working example of what you expect in the data field, I’m pretty sure it would get me through this problem.

Another option that might work: could you send me the extracted datastream? That might give me a hint as to what I’m passing to the system incorrectly. A working example would be preferable, though, as I then wouldn’t need to guess.

Hi, I am not sure exactly what is going wrong, but maybe the following can assist in finding the root cause of the issue. Here is the code from our H.264 test driver’s GetLiveFrameInternal:

            DateTime frameTime = DateTime.UtcNow;
            var tmpHeader = new VideoHeader()
            { CodecType = VideoCodecType.H264 };
            data = connection.GetLiveFrame(out bool isKeyFrame);
            if (isKeyFrame)
            {
                tmpHeader.SyncFrame = true;
                _lastSyncTime = frameTime;
            }
            else
            {
                tmpHeader.SyncFrame = false;
            }
            System.Threading.Thread.Sleep(250);     // try 4FPS
 
            tmpHeader.Length = (ulong)data.Length;
            H264FrameNumber++;
            tmpHeader.SequenceNumber = H264FrameNumber;  // fresh header each call, so use the running counter
            tmpHeader.TimestampSync = _lastSyncTime;
            tmpHeader.TimestampFrame = frameTime;
            header = tmpHeader;
            return true;
In addition, I have attached one of the frames in raw binary format, exactly as it is returned from this method, so you can see what we expect.

Let me know if this helps.

Best Regards,

Simon

Thanks SO much!! The raw video dump allowed me to figure out what was wrong with the stream and fix it: the level_idc parameter in the sequence parameter set of my H264 stream was incorrect. Updating it to the correct value fixed the problem.
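For anyone hitting the same thing: in the SPS payload, level_idc is the third byte, right after profile_idc and the constraint/reserved flags byte, so it is easy to inspect. A small Python sketch (synthetic bytes, not my driver code):

```python
def read_sps_fields(sps_payload: bytes):
    """First three bytes of the SPS payload (after the NAL header):
    profile_idc, the constraint/reserved flags byte, level_idc."""
    profile_idc = sps_payload[0]
    constraint_flags = sps_payload[1]
    level_idc = sps_payload[2]
    return profile_idc, constraint_flags, level_idc

# e.g. 0x64 0x00 0x1f -> profile 100 (High), level_idc 31 (Level 3.1)
```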