Image Server's Goto endpoint always returns application/x-genericbytedata-octet-stream even though I'm specifying <alwaysstdjpeg>yes</alwaysstdjpeg>, <multipartdata>no</multipartdata> and a <transcode> element that resizes the image.

Context:

I want to get a series of JPEG images from the Image Server by specifying different times. For example, I may call the Goto endpoint 6 times with 6 different values, typically 1000 ms apart, to get 6 different images.
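The calling pattern I have in mind can be sketched like this. `fetchGotoImage` is a hypothetical helper (not part of the Image Server API) that wraps a single goto call and resolves with the image bytes:

```javascript
// Sketch of the calling pattern: request `count` snapshots `stepMs` apart.
// `fetchGotoImage` is a hypothetical helper wrapping one goto call.
async function fetchSeries(fetchGotoImage, startMs, count = 6, stepMs = 1000) {
  const images = []
  for (let i = 0; i < count; i++) {
    // Each call targets a timestamp 1000 ms after the previous one
    images.push(await fetchGotoImage(startMs + i * stepMs))
  }
  return images
}
```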

My testing environment is XProtect Essential+ 2020 R2.

Problem:

The endpoint call is successful and I get a valid response, but the problem is that the content-type is always application/x-genericbytedata-octet-stream

This is what my connect request looks like:

<?xml version="1.0" encoding="UTF-8"?>
<methodcall>
<requestid>0</requestid>
<methodname>connect</methodname>
<username>dummy</username>
<password>dummy</password>
<cameraid>${cameraID}</cameraid>
<alwaysstdjpeg>yes</alwaysstdjpeg>
<connectparam>id=${cameraID}&amp;connectiontoken=${token}</connectparam>
<clientcapabilities>
<privacymask>no</privacymask>
<privacymaskversion>0</privacymaskversion>
<multipartdata>no</multipartdata>
</clientcapabilities>
<transcode>
<allframes>no</allframes>
<width>640</width>
<height>480</height>
<keepaspectratio>no</keepaspectratio>
<allowupsizing>yes</allowupsizing>
</transcode>
</methodcall>

And the goto request itself:

<?xml version="1.0" encoding="UTF-8"?>
<methodcall>
<requestid>1</requestid>
<methodname>goto</methodname>
<time>${milisecondsSinceUnixEpoch}</time>
<compressionrate>75</compressionrate>
</methodcall>

Given all these parameters, the docs seem to suggest that I’ll always get an image/jpeg response, not application/x-genericbytedata-octet-stream.

I also confirmed that the response is not JPEG by checking the first two bytes of the body (for JPEGs they should always be 0xFF and 0xD8, which they are not in my case).
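The check I'm doing is a minimal sketch like the following, assuming `body` is a Node.js Buffer holding the response payload:

```javascript
// Minimal sketch: test for the JPEG Start-of-Image marker (FF D8).
// `body` is assumed to be a Node.js Buffer with the raw response payload.
function looksLikeJpeg(body) {
  return body.length >= 2 && body[0] === 0xFF && body[1] === 0xD8
}
```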

The response is always something like this:

ImageResponse
RequestId: 1
PrivacyMask: none
Prev: 1605662176129
Current: 1605662176129
Next: 1605662176169
Content-length: 67073
Content-type: application/x-genericbytedata-octet-stream
…rest of binary data

Perhaps I’m missing something?

Hello,

I’m actually facing exactly the same problem :slight_smile:

No matter what I change in the connect or live command (alwaysstdjpeg, compression, …), I always get this response type and don't know how to transform it into a video or anything else I can view.

I’m aware of the ‘binary format’ from the documentation but still cannot see how to read it.

I hope you'll get an answer :slight_smile:

Thanks

Charly.

Hey Charly, yes the real issue is what you mentioned in the end.

Actually, I really don’t care if I get generic byte data as long as I can get a solid reliable method/algorithm for converting that generic byte data into a JPEG in my Node.js code.

The documentation on GenericByteData is unfortunately not very helpful, and the TcpVideoViewer Protocol sample that does this is in C#, which I'm having a very difficult time translating to Node.js.

Getting generic byte data is actually better since it doesn’t put a lot of CPU load on the VMS while transcoding.

Regardless, the bottom line is that I need JPEG images by hook or by crook :slight_smile:

Yes, you are right :slight_smile:

We are working on that, in Node.js too; I'll keep you informed if I find any relevant solution.

Much appreciated!

Just FYI here’s where I am right now:

let { headers, imageBuffer } = await getGotoResponseBuffer(imageServerConnectedSocket, milisecondsSinceUnixEpoch - 1000)

const { contentType, contentLength } = parseGotoResponseHeaders(headers)

// IF it's a JPEG - JPEGs always start with FF D8
if (imageBuffer[0] == 0xFF && imageBuffer[1] == 0xD8) {
  console.log('JPEG FOUND!')
}
else {
  // getSubBuffer(buf, offset, length): small helper whose definition is omitted from this post
  const dataType = getSubBuffer(imageBuffer, 0, 2)
  const length = getSubBuffer(imageBuffer, 2, 4)
  const codec = getSubBuffer(imageBuffer, 6, 2)
  const seqNo = getSubBuffer(imageBuffer, 8, 2)
  const flags = getSubBuffer(imageBuffer, 10, 2)
  const timeStampSync = getSubBuffer(imageBuffer, 12, 8)
  const timeStampPicture = getSubBuffer(imageBuffer, 20, 8)
  const reserved = getSubBuffer(imageBuffer, 28, 4)

  console.log({
    dataType,
    length,
    codec,
    seqNo,
    flags,
    timeStampSync,
    timeStampPicture,
    reserved,
    imageBufferFirst32: imageBuffer.subarray(0, 60)
  })

  const lengthInGenericHeader = length.readUIntBE(0, length.byteLength)
  console.log('lengthInGenericHeader: ', lengthInGenericHeader)

  const fs = require('fs')
  const jpegBuffer = imageBuffer.subarray(32, lengthInGenericHeader)
  fs.writeFileSync('final3.jpg', jpegBuffer)
}

headers is the string containing this part:

ImageResponse
RequestId: 1
PrivacyMask: none
Prev: 1605662176129
Current: 1605662176129
Next: 1605662176169
Content-length: 67073
Content-type: application/x-genericbytedata-octet-stream

imageBuffer initially contains all the data that’s supposed to be after the header i.e. the generic byte data, which I’m trying to parse based on the TcpVideoViewer C# sample.

I can confirm that contentType, contentLength are parsed successfully from headers.

But beyond this I just can’t seem to get the actual image. jpegBuffer is always written as an invalid image file.

The reasons can be many, as you can imagine… for example, I'm not sure at all whether the length I'm reading from the imageBuffer is valid, or whether I'm using the length value correctly.
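For what it's worth, the field offsets above can be folded into a single parser. This is only a sketch: the 32-byte layout comes from my reading of the TcpVideoViewer C# sample, and the little-endian byte order is an assumption (based on .NET's BitConverter defaulting to little-endian), not something the docs confirm:

```javascript
// Sketch of a GenericByteData header parser, assuming the 32-byte fixed
// layout inferred from the TcpVideoViewer C# sample. Little-endian reads
// are an assumption based on .NET BitConverter defaults.
function parseGenericHeader(buf) {
  return {
    dataType:         buf.readUInt16LE(0),
    length:           buf.readUInt32LE(2),    // payload length in bytes (assumed)
    codec:            buf.readUInt16LE(6),
    seqNo:            buf.readUInt16LE(8),
    flags:            buf.readUInt16LE(10),
    timeStampSync:    buf.readBigUInt64LE(12),
    timeStampPicture: buf.readBigUInt64LE(20),
    reserved:         buf.readUInt32LE(28),
  }
}
```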

You always get GenericByteData (https://doc.developer.milestonesys.com/html/index.html?base=mipgenericbytedata/main.html&tree=tree_3.html), but if you remove the header then you will have the JPEG format.

An alternative option, that you might find better, is to use Mobile SDK and you will get jpeg directly.

https://doc.developer.milestonesys.com/mipsdkmobile/index.html?base=content_0.html&tree=tree_home.html

That’s not what the documentation here says. It explicitly asks to use the parameters, that I listed, to retrieve jpeg in the response with the correct image/jpeg header.

Also, which header are you referring to? The HTTP response formatted header or the GenericByteData header?

I’ve tried both incidentally. And since it is all just a stream of bytes, how does one know where the GenericByteData header ends, any delimiter?

If you please have a look at the TcpVideoViewer sample, they’re following a different approach: parsing the GenericByteData header, getting a length value and using that somehow to get the JPEG. The code is in C# of course and a bit confusing, but more problematic is the lack of documentation. Again, the GenericByteData header properties and how to parse & use them is not mentioned in any doc I’ve seen so far.

Thank you for the Mobile SDK reference; I'm having a look at that as I write this. I'm guessing, though, that it can only be used to access the XProtect Mobile Server? If so, from a reliability standpoint, would you say it's wise to use the Mobile SDK? What if the Mobile Server is down or shut off simply because the VMS users don't need it?

The Image Server, it seems to me (as I'm sure you'll agree), is much more central to XProtect and will likely always be available and running.

Or maybe I’m missing something.

Hello!

The mobile server is another way to get the video stream (in a web page, for example), and the SDK helps do that and provides "converted" packets. I have an example if you want.

For the image server I have, I hope, good news:

I received the same messages from the socket server as you described, starting with "ImageResponse, RequestId: …"

These are the specific parameter values I set in the connect query (the XML tag names were stripped when posting, so only the values remain; I haven't tried changing them individually to see the impact): no, yes, 200, 200, yes, no.

I still receive “Content-type: application/x-genericbytedata-octet-stream” even with this configuration.

But… :slight_smile:

I wrote those messages to a text file. I opened them with Notepad++, removed the first lines (the header part) and the first blank line.

I saved the file, renamed it to 'xxx.jpg' and… I got my picture :slight_smile:

I also tried converting it to base64 and back, and could display it in a web page.

I think/believe that we receive JPEG content even if the content-type doesn’t say so.

Taking a closer look at the C# example (TcpVideoViewer), specifically the "Live" method, I understand that if the content starts with FF and D8, it means we don't need to read the header bytes to get the data out of the packet.

FYI, I've tried to attach the received packet and another file with the header removed.

Hope this helps.

Charly.

Thanks very much for this. Your Source message.txt worked similarly for me, I followed your steps and got a valid JPEG :slight_smile:

Unfortunately though it’s still not working for my code/environment.

Attaching the text file I’ve generated from this code:
(The snippet assumes these declarations, elided from the post:)

const net = require('net')
const CRLF = '\r\n' // assumed delimiter, based on the protocol framing used above

async function getImageServerConnectedSocket(ip, port, token, cameraID, instanceIdUuid) {
  const connect = `<?xml version="1.0" encoding="UTF-8"?>
<methodcall>
<requestid>0</requestid>
<methodname>connect</methodname>
<username>dummy</username>
<password>dummy</password>
<cameraid>${cameraID}</cameraid>
<alwaysstdjpeg>yes</alwaysstdjpeg>
<connectparam>id=${cameraID}&amp;connectiontoken=${token}</connectparam>
<clientcapabilities>
<multipartdata>no</multipartdata>
</clientcapabilities>
<transcode>
<allframes>yes</allframes>
<width>640</width>
<height>480</height>
<keepaspectratio>yes</keepaspectratio>
<allowupsizing>no</allowupsizing>
</transcode>
</methodcall>`.replace(/(\r\n|\n|\r)/gm, "") + "\r\n\r\n"
 
  const imageServerClient = new net.Socket()
  imageServerClient.setTimeout(10000)
  imageServerClient.setKeepAlive(true)
  imageServerClient.setEncoding('binary')
 
  return new Promise((resolve, reject) => {
    imageServerClient.on("data", (data) => {
      const imageServerConnectResponseData = data
      console.log(`IMAGE SERVER - CONNECT RESPONSE DATA RECEIVED`)
  
      if (imageServerConnectResponseData.includes('<methodresponse>') &&
          imageServerConnectResponseData.includes('<methodname>connect</methodname>') &&
          imageServerConnectResponseData.includes('<connected>yes</connected>')) {
 
        imageServerClient.removeAllListeners("data")
        imageServerClient.removeAllListeners("error")
        imageServerClient.removeAllListeners("timeout")
        resolve(imageServerClient)
      }
      else {
        reject(`Error, could not connect to image server - ${imageServerConnectResponseData}`)
      }
    })
 
    imageServerClient.on('error', (err) => reject(`Error connecting to Image Server - ${err.stack.slice(0, 250)}`))
    imageServerClient.on("timeout", () => reject('Timeout connecting to Image Server'))
 
    imageServerClient.connect(port, ip, () => {
      console.log('IMAGE SERVER - CONNECTED')
      imageServerClient.write(connect, () => console.log('IMAGE SERVER - CONNECT BODY WRITTEN'))
    })
  })
}
 
async function getGotoResponseBuffer(imageServerConnectedSocket, milisecondsSinceUnixEpoch) {
  const goto = `<?xml version="1.0" encoding="UTF-8"?>
<methodcall>
<requestid>1</requestid>
<methodname>goto</methodname>
<time>${milisecondsSinceUnixEpoch}</time>
</methodcall>`.replace(/(\r\n|\n|\r)/gm, "") + "\r\n\r\n"
 
  let imageBuffer = null
  let headers = ''
  
  return new Promise((resolve, reject) => {
    imageServerConnectedSocket.on("data", (data) => {
      if (imageBuffer === null) {
        imageBuffer = data
      }
      else
      {
        imageBuffer = imageBuffer + data
      }
 
      console.log(`IMAGE SERVER - GOTO RESPONSE DATA BYTES RECEIVED`)
      
      if (data.endsWith(CRLF)) {
        console.log(`IMAGE SERVER - GOTO RESPONSE DATA BYTES ENDED`)
        resolve({ headers, imageBuffer })
        return
      }
    })
 
    imageServerConnectedSocket.on('error', (err) => reject(`Error connecting to Image Server while GOTOing - ${err.stack.slice(0, 250)}`))
    imageServerConnectedSocket.on("timeout", () => reject('Timeout connecting to Image Server while GOTOing'))
 
    imageServerConnectedSocket.write(goto, () => console.log('IMAGE SERVER - GOTO BODY WRITTEN'))
  })
}
 
 
// --------------------------------------------------------------------------------------------
 
 
const imageServerConnectedSocket = await getImageServerConnectedSocket(host, portImageServer, token, cameraID, instanceIdUuid)
let { headers, imageBuffer } = await getGotoResponseBuffer(imageServerConnectedSocket, milisecondsSinceUnixEpoch - 1000)

let bodyPart = imageBuffer.split(CRLF)
console.log(bodyPart.length)
bodyPart = bodyPart[1] + CRLF + bodyPart[2]
// IF it's a JPEG - JPEGs always start with FF D8
if (bodyPart[0] == 0xFF && bodyPart[1] == 0xD8) {
  console.log('JPEG FOUND!')
}
else {
  const fs = require('fs')

  fs.writeFileSync('full-body-without-compression.txt', imageBuffer)
}

By adding the compressionrate element afterwards, I generated a separate file.

Strangely enough, they were exactly the same.

I tried different things like removing the header, removing the header plus the next blank line, using 'utf8' encoding rather than 'binary', etc. But still, after renaming it with a .jpg extension, it's an invalid/corrupt JPEG image that can't be opened on my MacBook.

And yes, please do post the Mobile SDK example if you can.

Here is the other file :slight_smile:

If you remove the GenericByteData header you are back to the encapsulated media data. There is no delimiter, but there is length information.
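That length-based approach can be sketched as follows. The 32-byte header size and the offset of the length field are assumptions taken from the TcpVideoViewer sample discussed earlier, as is the little-endian byte order:

```javascript
// Sketch: strip an assumed 32-byte GenericByteData header and keep
// `length` payload bytes. HEADER_SIZE and the length-field offset are
// assumptions inferred from the TcpVideoViewer C# sample.
const HEADER_SIZE = 32

function stripGenericHeader(packet) {
  const length = packet.readUInt32LE(2)              // assumed payload-length field
  return packet.subarray(HEADER_SIZE, HEADER_SIZE + length)
}
```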

I’m assuming you mean that all the bytes after the GenericByteData header are the media data? If so, how may I remove the GenericByteData header?

Hi Usama,

Please find attached an example for using web SDK.

Edit the “app.js” file to set your credentials, server and camera ID.

I think you need to edit the mobile server config to avoid cross-origin problems (see "How to host a sample" in the Mobile SDK docs).

Then just open the index page and you might see the video from the camera you defined.

Charly.

Thank you very much!

Hi Usama,

I finally solved the problem of reading the message and sending it back to the front page using this help: https://github.com/aruntj/mjpeg-readable-stream/blob/master/index.html

I read the packet on the server side (small video resolution, so one packet per frame; it may be different for bigger resolutions), do the processing to read all the bytes, remove the header, and send a buffer back to the front end, which displays it as an image.
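The core of that mjpeg-readable-stream approach boils down to scanning the byte stream for the JPEG start (FF D8) and end (FF D9) markers. A minimal, naive sketch (those marker byte pairs can in principle also occur inside the compressed data, so this is not bulletproof):

```javascript
// Sketch: extract the first complete JPEG frame from a chunk of bytes by
// locating the SOI (FF D8) and EOI (FF D9) markers, as in the
// mjpeg-readable-stream example linked above. Naive marker scan.
function extractJpegFrame(chunk) {
  const soi = chunk.indexOf(Buffer.from([0xFF, 0xD8]))
  if (soi === -1) return null
  const eoi = chunk.indexOf(Buffer.from([0xFF, 0xD9]), soi + 2)
  if (eoi === -1) return null               // frame not complete yet
  return chunk.subarray(soi, eoi + 2)       // include the EOI marker
}
```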

Hope this helps

Charly.

Charly, I just tried this and it works great.

I basically just reused the loop in my Node server and fed the data buffers to it that I was receiving from the TCP socket.

Thanks so much for sticking in this conversation. Really appreciate your time and effort. You’ve gotten rid of a major roadblock for me.

Hi @Usama Ashraf and @Charlies Boulan,

I am facing a similar issue while trying to implement a Java application.

I saved the data as a string to a txt file and am attaching it here. I am unable to figure out what sort of binary data it is, and it doesn't seem to be JPEG data, going by the sample files attached in your conversation above. Can you please help me out?