MIP driver

I want to implement custom event ingestion into Milestone using the MIP driver, but the sample doesn’t specify which communication protocol the MIP driver supports.

And the other thing: the events implemented are the most simple ones, with no class, just the bounding box (that’s what I have seen in the code; maybe there’s more, I don’t know).

So I also need to add more attributes, and I know it should be in an ONVIF-compatible format so it can be ingested into Milestone.

Is there more guidance I can get to do so?

It sometimes causes confusion that we have two different technologies with (almost) the same name.

We have a MIP Driver developed at Milestone, and samples like the Bounding Box Metadata Provider showcase how to use it to create metadata devices.

We have the MIP Driver Framework, where you instead develop your own driver; such a driver can support metadata devices.

Which of the two are you looking at?

PS. Outside the SDK this document might be highly relevant: https://doc.milestonesys.com/latest/en-US/portal/htm/chapter-page-metadatasearch-integration.htm

Okay, I’m now using the BoundingBoxMetadataProvider component sample, and it’s pushing the bounding boxes to live. I’ve made some edits according to the PDF description. It worked for the people class, but the attributes do not appear in search. Should the attributes be added as multiple ClassCandidate entries, or what?

Vehicles also do not appear in the search when I search.

And there is some sort of delay; it can take some time to appear in search. But when I play back the metadata in the metadata playback, it does appear.

I think we will need to understand the edits: what did you change?

I guess you made one kind of change for people and something else for vehicles, and I think we need you to describe or document what you end up sending.

When it works, after a delay, how big is the delay?

    const int objectId = 1;
    var metadata = new MetadataStream
    {
        VideoAnalyticsItems =
        {
            new VideoAnalytics
            {
                Frames =
                {
                    new Frame(DateTime.UtcNow)
                    {
                        Objects =
                        {
                            new OnvifObject(objectId)
                            {
                                Appearance = new Appearance
                                {
                                    Shape = new Shape
                                    {
                                        BoundingBox = boundingBox,
                                        CenterOfGravity = dummyCenterOfGravity
                                    },
                                    Class = new OnvifClass
                                    {
                                        ClassCandidates =
                                        {
                                            new ClassCandidate
                                            {
                                                Type = "People"
                                            }
                                        }
                                    },
                                    Description = description
                                }
                            },
                            staticObject
                        }
                    }
                }
            }
        }
    };
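
For reference, the sample then queues this stream on the provider channel. A minimal sketch of that call, reusing the sample’s _metadataProviderChannel field (check the sample for the exact overload):

    // Queue the assembled MetadataStream for live delivery; QueueMetadata
    // returns false when the channel could not accept the data.
    if (!_metadataProviderChannel.QueueMetadata(metadata, DateTime.UtcNow))
        Trace.WriteLine("Failed to queue metadata");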

I think it takes a couple of minutes, maybe 5-15 minutes, I’m not really sure.

These are the edits that worked to add people, but I don’t know where to add the people attributes. Vehicle also worked; it was "Vehicle", not "Vehicles". So the main point is how to add attributes like clothes, clothes color, and so on.

There is another sample, the Metadata Playback Viewer, for seeing metadata recorded on devices: https://doc.developer.milestonesys.com/html/samples/ComponentSamples/MetadataPlaybackViewer/README.html

I will also recommend getting StableFPS (https://doc.developer.milestonesys.com/html/gettingstarted/StableFPS.html), where you can see examples of metadata for humans/people and vehicles and search for these.

Yes, you will have multiple class/ClassCandidate entries, for example:

          <tt:Class>
            <tt:Type Likelihood="1.0">Human</tt:Type>
            <tt:Type Likelihood="1.0">HumanFace</tt:Type>
          </tt:Class>
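
If you are building the stream with the SDK classes rather than raw XML, a minimal sketch of multiple candidates could look like this (assuming the ClassCandidate type used in the sample also exposes a Likelihood property, as the ONVIF schema does):

    // Sketch: two class candidates on one object; Likelihood is an assumption
    // about the SDK class, mirroring the ONVIF ClassCandidate schema.
    Class = new OnvifClass
    {
        ClassCandidates =
        {
            new ClassCandidate { Type = "Human", Likelihood = 1.0f },
            new ClassCandidate { Type = "HumanFace", Likelihood = 1.0f }
        }
    }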

And an example of a complete object with attributes (like the shoe category "Sandal") in the metadata could be:

      <tt:Object ObjectId="56">
        <tt:Appearance>
          <tt:Shape>
            <tt:BoundingBox left="-0.6230390499288075" top="1.117771855296856" right="-0.3671225072725576" bottom="0.6628091127968561" />
            <tt:CenterOfGravity x="-0.49508077860068256" y="0.8902904840468561" />
            <tt:Extension>
              <BoundingBoxAppearance>
                <Fill color="#00FFFF00" />
                <Line color="#FF008000" />
              </BoundingBoxAppearance>
            </tt:Extension>
          </tt:Shape>
          <tt:Class>
            <tt:Type Likelihood="1.0">Human</tt:Type>
            <tt:Type Likelihood="1.0">HumanFace</tt:Type>
          </tt:Class>
          <tt:HumanFace>
            <fc:Gender>Female</fc:Gender>
            <fc:Age>
              <tt:Min>39</tt:Min>
              <tt:Max>49</tt:Max>
            </fc:Age>
            <fc:Hair>
              <fc:Style>Bald</fc:Style>
            </fc:Hair>
            <fc:Accessory>
              <fc:Opticals>
                <fc:Wear>True</fc:Wear>
                <fc:Color>
                  <tt:ColorCluster>
                    <tt:Color X="153" Y="102" Z="51" />
                  </tt:ColorCluster>
                </fc:Color>
              </fc:Opticals>
            </fc:Accessory>
          </tt:HumanFace>
          <tt:HumanBody>
            <bd:BodyMetric>
              <bd:Height>163</bd:Height>
              <bd:BodyShape>Thin</bd:BodyShape>
            </bd:BodyMetric>
            <bd:Clothing>
              <bd:Tops>
                <bd:Category>Sleeveless</bd:Category>
                <bd:Color>
                  <tt:ColorCluster>
                    <tt:Color X="144" Y="149" Z="100" />
                    <tt:Weight>1.0</tt:Weight>
                  </tt:ColorCluster>
                </bd:Color>
                <bd:Grain>Other</bd:Grain>
                <bd:Style>Dress</bd:Style>
              </bd:Tops>
              <bd:Bottoms>
                <bd:Category>Other</bd:Category>
                <bd:Color>
                  <tt:ColorCluster>
                    <tt:Color X="183" Y="224" Z="204" />
                  </tt:ColorCluster>
                </bd:Color>
                <bd:Grain>Plaid</bd:Grain>
                <bd:Style>Other</bd:Style>
              </bd:Bottoms>
              <bd:Shoes>
                <bd:Category>Sandal</bd:Category>
                <bd:Color>
                  <tt:ColorCluster>
                    <tt:Color X="119" Y="160" Z="63" />
                  </tt:ColorCluster>
                </bd:Color>
              </bd:Shoes>
            </bd:Clothing>
            <bd:Belonging>
            </bd:Belonging>
          </tt:HumanBody>
        </tt:Appearance>
        <tt:Behaviour>
          <tt:Speed>1.5898731499999998</tt:Speed>
        </tt:Behaviour>
      </tt:Object>

Mainly, the point is to deal with the smart search in Milestone.

So, this is the point that we want to focus on.

When I try the piece of code that I shared with you above, I got this XML. When I use the metadata playback it retrieves fine, and the people also appear in the smart search.

[The Output.XML]

    <?xml version="1.0" encoding="utf-8"?>
    <tt:MetadataStream xmlns:tt="http://www.onvif.org/ver10/schema">
      <tt:VideoAnalytics>
        <tt:Frame UtcTime="...">
          <tt:Object ObjectId="1">
            <tt:Appearance>
              <tt:Shape>
                <tt:BoundingBox ... />
                <tt:CenterOfGravity ... />
              </tt:Shape>
              <tt:Class>
                <tt:ClassCandidate>
                  <tt:Type>People</tt:Type>
                  <tt:Likelihood>0</tt:Likelihood>
                </tt:ClassCandidate>
              </tt:Class>
              <tt:Extension>A text

    with several

    line breaks</tt:Extension>
            </tt:Appearance>
          </tt:Object>
          <tt:Object ObjectId="2">
            <tt:Appearance>
              <tt:Shape>
                <tt:BoundingBox bottom="-0.25" top="0.25" right="0.25" left="-0.25" />
              </tt:Shape>
              <tt:Extension>This is just text</tt:Extension>
            </tt:Appearance>
          </tt:Object>
        </tt:Frame>
      </tt:VideoAnalytics>
    </tt:MetadataStream>

But the point now is how to add the attributes to the ClassCandidate,

because the OnvifClass contains only ClassCandidates, and ClassCandidates contains ClassCandidate elements.

So how do I create the properties in the XML for the people? I want to use smart search and then filter by Bag or something similar.

So please, can you provide a piece of code to make the solution clear?

I have looked into this, and it appears that the VideoOS.Platform.Metadata namespace does not include classes specifically designed to generate ONVIF-based metadata for people, vehicles, or their attributes.

That said, you can still make use of the sample provided. However, instead of passing an instance of the MetadataStream class, you will need to supply the ONVIF XML representing a frame as a string to the _metadataProviderChannel.QueueMetadata method.
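
For example, here is a minimal sketch of such a frame with HumanBody clothing attributes, sent as a raw string. The bd: namespace URI and the attribute values are my assumptions based on the ONVIF body-descriptor schema and the StableFPS example above, so verify them against actual StableFPS output:

    // Sketch only: one ONVIF frame with HumanBody clothing attributes as raw XML.
    // Assumes the sample's _metadataProviderChannel field is in scope; the bd:
    // namespace URI is an assumption, not confirmed by the SDK docs.
    private void SendHumanWithClothing(int objectId)
    {
        var utcTime = DateTime.UtcNow.ToString("yyyy-MM-ddTHH:mm:ss.fffZ");
        var xml =
            @"<?xml version=""1.0"" encoding=""UTF-8""?>" +
            @"<tt:MetadataStream xmlns:tt=""http://www.onvif.org/ver10/schema"" " +
            @"xmlns:bd=""http://www.onvif.org/ver20/analytics/humanbody"">" +
            @"<tt:VideoAnalytics>" +
            $@"<tt:Frame UtcTime=""{utcTime}"">" +
            $@"<tt:Object ObjectId=""{objectId}"">" +
            @"<tt:Appearance>" +
            @"<tt:Shape>" +
            @"<tt:BoundingBox left=""-0.55"" top=""0.75"" right=""-0.05"" bottom=""0.25"" />" +
            @"<tt:CenterOfGravity x=""-0.3"" y=""0.5"" />" +
            @"</tt:Shape>" +
            @"<tt:Class><tt:Type Likelihood=""1.0"">Human</tt:Type></tt:Class>" +
            @"<tt:HumanBody>" +
            @"<bd:Clothing>" +
            @"<bd:Tops>" +
            @"<bd:Category>Sleeveless</bd:Category>" +
            @"<bd:Color><tt:ColorCluster><tt:Color X=""144"" Y=""149"" Z=""100"" /></tt:ColorCluster></bd:Color>" +
            @"</bd:Tops>" +
            @"</bd:Clothing>" +
            @"</tt:HumanBody>" +
            @"</tt:Appearance>" +
            @"</tt:Object>" +
            @"</tt:Frame>" +
            @"</tt:VideoAnalytics>" +
            @"</tt:MetadataStream>";

        // Queue the raw XML string instead of a MetadataStream instance.
        if (!_metadataProviderChannel.QueueMetadata(xml, DateTime.UtcNow))
            Trace.WriteLine("Failed to queue human metadata");
    }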

Actually, I tried this solution, because I found something like you said in the samples that are provided. The sample is MultiChannelMetadataProvider, and I updated a few small things in it, like the string.

I also added a line to save this string as an XML file, to check the format.

When I run it, the file is saved successfully, but when I test the metadata through the metadata playback, nothing appears, as if nothing happened.

    // Updated XML template to match the ONVIF structure with Frame and Object
    private readonly string _peopleCountXml =
        @"<?xml version=""1.0"" encoding=""UTF-8""?>" + Environment.NewLine +
        @"<tt:MetadataStream xmlns:tt=""http://www.onvif.org/ver10/schema"">" + Environment.NewLine +
        @"  <tt:VideoAnalytics>" + Environment.NewLine +
        @"    <tt:Frame UtcTime=""{2}"">" + Environment.NewLine +
        @"      <tt:Object ObjectId=""{0}"">" + Environment.NewLine +
        @"        <tt:Appearance>" + Environment.NewLine +
        @"          <tt:Shape>" + Environment.NewLine +
        @"            <tt:BoundingBox bottom=""0.25"" top=""0.7500001"" right=""-0.04999997"" left=""-0.55""/>" + Environment.NewLine +
        @"            <tt:CenterOfGravity x=""0"" y=""0""/>" + Environment.NewLine +
        @"          </tt:Shape>" + Environment.NewLine +
        @"          <tt:Class>" + Environment.NewLine +
        @"            <tt:ClassCandidate>" + Environment.NewLine +
        @"              <tt:Type>People</tt:Type>" + Environment.NewLine +
        @"              <tt:Likelihood>0</tt:Likelihood>" + Environment.NewLine +
        @"            </tt:ClassCandidate>" + Environment.NewLine +
        @"          </tt:Class>" + Environment.NewLine +
        @"          <tt:Extension>" + Environment.NewLine +
        @"            <Description x=""-0.3"" y=""0.8200001"" size=""0.07"" fontFamily=""Times New Roman"" color=""#FFF0F8FF""></Description>" + Environment.NewLine +
        @"          </tt:Extension>" + Environment.NewLine +
        @"          <tt:Extension>" + Environment.NewLine +
        @"            <Properties>" + Environment.NewLine +
        @"              <Property name=""Age"">{1}</Property>" + Environment.NewLine +
        @"            </Properties>" + Environment.NewLine +
        @"          </tt:Extension>" + Environment.NewLine +
        @"        </tt:Appearance>" + Environment.NewLine +
        @"      </tt:Object>" + Environment.NewLine +
        @"    </tt:Frame>" + Environment.NewLine +
        @"  </tt:VideoAnalytics>" + Environment.NewLine +
        @"</tt:MetadataStream>";

    // Updated SendNonStandardData method to use the new template
    private void SendNonStandardData()
    {
        var peopleCount = _random.Next(0, 100); // note: not used by the template below
        var age = 24;
        var objectId = 20; // fixed for testing; could be _random.Next(1, 1000)
        var utcTime = DateTime.UtcNow.ToString("yyyy-MM-ddTHH:mm:ss.fffffffZ"); // ISO 8601 format

        // Format placeholders: {0} = ObjectId, {1} = Age, {2} = UtcTime
        var metadata = string.Format(_peopleCountXml, objectId, age, utcTime);

        // Save metadata to an XML file so the format can be inspected
        try
        {
            File.WriteAllText("latest_nonstandard_metadata.xml", metadata);
        }
        catch (Exception ex)
        {
            Trace.WriteLine($"Error saving non-standard metadata to file: {ex.Message}");
        }

        var result = _nonStandardProviderChannel.QueueMetadata(metadata, DateTime.UtcNow);
        if (result == false)
        {
            Trace.WriteLine(string.Format("{0}: Failed to write to non-standard channel", DateTime.UtcNow));
        }
        else
        {
            BeginInvoke(new MethodInvoker(() =>
            {
                if (sourceComboBox.SelectedIndex == 2)
                {
                    DisplayMetadata(metadata);
                }
            }));
        }
    }
 

The MultiChannelMetadataProvider sample suggests inspecting the metadata using the Metadata Live Viewer. Do you get any result there?

If you do get metadata in the live viewer but not in the playback viewer, this indicates that the metadata is not being recorded, which is a requirement for finding it in search.

Additionally, search can only search on metadata devices related to cameras, which might also be a restriction when using this sample. You can relate the metadata to a camera from the Client tab of the camera device properties in the Management Client.