I’m trying to retrieve archive video using the protocol integration to obtain raw camera data. I’ve set the value to no and have the set to 100. However, when using the TcpVideoViewer as an example, I do not know how to read the response. Following the example, the Total length header field is returning 0. Can you provide a good example of how to do this?
Note: For live video, there is no problem. It is just for archive video when reading a response for a GoTo command that is the issue.
Please be aware of this design: when receiving live video, there is one frame in each response, while in playback, the response to a GoTo will contain an entire GOP.
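A reader loop therefore has to be prepared for more than one frame per response in playback mode. Purely as a sketch of that idea (the 4-byte big-endian length prefix below is illustrative only, not the documented wire format):

```python
import struct

def split_frames(buf: bytes) -> list:
    """Split a playback response that may carry several frames.

    Hypothetical framing: each frame is preceded by a 4-byte
    big-endian length. A live response would yield exactly one
    frame; a GoTo response may yield a whole GOP.
    """
    frames = []
    pos = 0
    while pos + 4 <= len(buf):
        (length,) = struct.unpack_from(">I", buf, pos)
        pos += 4
        frames.append(buf[pos:pos + length])
        pos += length
    return frames
```

The point is only that the parser must keep consuming frames until the response buffer is exhausted, instead of assuming one frame per response as in the live case.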
If no data is in the database, the length will be zero. Please as a first step ensure that the camera has been recorded. Per default there is recording on motion, if no motion no recording. You can change the rules that govern recording. Verification that there is recording can be done with the Smart Client in playback mode.
From your response that it will provide an entire GOP, I’m assuming that the data will be in Video Blocks instead of a Video Stream Packet, is that correct? If so, can the data in the Video Blocks be extracted and sent to FFmpeg to play the data?
Yes, your assumption is correct: in playback you get video blocks rather than video stream packets.
Regarding the second question, I have asked Milestone Development, and they said that if you strip off the 32-byte generic byte data header, then the FFmpeg library should work, as far as they know. We cannot validate this because we have not tested it ourselves, but could you please check whether it works for you?
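A quick way to try this could look like the following. The 32-byte header length is the figure from Development; everything else (an H.264 camera, ffplay on the PATH, piping payloads over stdin) is an assumption for the sake of the sketch:

```python
import subprocess

HEADER_LEN = 32  # per Milestone Development; you also tried 36

def strip_generic_header(frame: bytes, header_len: int = HEADER_LEN) -> bytes:
    """Drop the generic byte data header, leaving the raw codec payload."""
    return frame[header_len:]

def play_raw_stream(frames, codec: str = "h264") -> None:
    """Pipe stripped payloads to ffplay for a quick visual check.

    Requires ffplay on the PATH; '-f h264' tells it the input is a
    raw elementary stream (adjust for your camera's actual codec).
    """
    proc = subprocess.Popen(
        ["ffplay", "-f", codec, "-i", "-"],
        stdin=subprocess.PIPE,
    )
    for frame in frames:
        proc.stdin.write(strip_generic_header(frame))
    proc.stdin.close()
    proc.wait()
```

If ffplay renders the stream, the header length and codec guess are right; if not, dump a few stripped payloads to a file and inspect the first bytes for a known start code.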
I’ve tried this, but I do not think it is working correctly. I’ve tried to strip the header off (both 32 and 36 bytes), and it seems to read the first batch correctly, but when it reads the next archive frame from the response, the first byte received does not match “I”, so it does not parse the data. I am using the TcpVideoViewer example as a guide for processing the responses.
Perhaps I should ask this in a different post, but the reason I am asking is that when I play archive video from an HD stream as individual video streams (compressionRate of 75), the response time ranges from 50 ms to almost 3000 ms per frame. This makes playback very slow, so I thought that reading raw video data might be faster.
Note: Live video of the HD stream is fine and has no slowdown.
Please note that the GenericByteData header includes a length field; you will have to use this field to know how much to strip off.
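In sketch form, that would mean something like the following. The offsets here are placeholders; take the real ones from the documented GenericByteData layout, and check whether the length field counts the header or only the payload:

```python
import struct

# Placeholder layout - substitute the documented GenericByteData offsets.
LENGTH_OFFSET = 2   # hypothetical position of the length field
HEADER_LEN = 32     # hypothetical fixed header size

def next_payload(buf: bytes, pos: int = 0):
    """Return (payload, next_pos) using the header's own length field.

    Reading the length from each header, rather than assuming a fixed
    stride, keeps the parser in sync with the frame that follows -
    which is where a hard-coded strip tends to go wrong on frame two.
    Assumes the length field counts only the payload, not the header.
    """
    (length,) = struct.unpack_from(">I", buf, pos + LENGTH_OFFSET)
    payload = buf[pos + HEADER_LEN : pos + HEADER_LEN + length]
    return payload, pos + HEADER_LEN + length
```

If the second frame still fails to parse with the documented offsets, the desync is likely elsewhere (for example, a response-level header between frames).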
Can you get closer by analysing the data against the GenericByteData format? You should be able to determine whether what you receive is video data, but it may require a closer look at the data at hand, perhaps tracing or dumping it for analysis.
I am curious whether you see the same delay when testing with the Smart Client. If the Smart Client shows the same results, you will not be able to do better with your own protocol integration.
Yes, I’ve tried using the length field to strip off the header. However, the TcpVideoViewer sample application does not provide an example of how to read Video Block data, only Video Stream data, so what I have is based on what I could decipher from the documentation. From what I can tell, it reads the first response and obtains video correctly, but subsequent responses do not contain the expected data. I am hoping you can provide a solid example to follow for this approach.
As for the HD video archive delay, we do not see it with the Smart Client, but we do see it in the TcpVideoViewer sample application. We have to use the protocol integration because we have our own video player that renders camera data directly; the component integration is not an option for us.
Actually, the TcpVideoViewer does support playback: click the times in the list on the left-hand side while in playback mode. Note, however, that the sample does not support the raw video format, so it may still not show you this; the sample requests JPEGs transcoded server-side.
Can you document what data you get? I may have colleagues who can advise, but the raw data depends on the camera device, so I think it needs to be analyzed.
I’ve attached a wireshark capture of trying to play back archive video.
If you look at packets 545 and 546, these are the first request and response for a video frame. Packets 2098 and 2099 are then the request and response for the next frame. However, using the sample code to read and parse the responses, the second response fails to parse correctly.
We asked Milestone Development to look at your log, and they said that, as far as they can see, the content is correct – but the RequestID is ‘0’ in both requests. We expect this number to increase with every request.
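A minimal way to avoid the stuck-at-zero RequestID is a single monotonically increasing counter shared by every request builder. The XML tag names below are assumptions modelled on the protocol's XML style, not copied from the documentation, so check them before use:

```python
import itertools

# One shared counter so no two requests ever reuse an ID.
_request_ids = itertools.count(1)

def build_goto_request(time_ms: int) -> str:
    """Build a GoTo request carrying a fresh RequestID.

    Hypothetical tag names; the key point is that <requestid> is
    taken from the counter instead of being hard-coded to 0.
    """
    request_id = next(_request_ids)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        f"<methodcall><requestid>{request_id}</requestid>"
        "<methodname>goto</methodname>"
        f"<time>{time_ms}</time></methodcall>"
    )
```

If your code builds requests from a fixed template, this is the first thing to change before re-testing the second GoTo.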