Currently, I’m retrieving images from the Image Server one at a time as JPEGs, and pushing these to an ffmpeg process to convert to an MP4. This seems a bit daft, given that the original stream is MP4. Is it possible to gain access to the original recorded stream via the Image Server API?
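For reference, the JPEG-to-ffmpeg pipeline described above looks roughly like this sketch (the Image Server fetch itself is elided; the frame rate and output codec are assumptions):

```python
import subprocess

# Sketch of the current approach: fetch JPEGs one at a time and pipe
# them into an ffmpeg process that re-encodes the sequence to MP4.
# The frame rate is an assumed value, not something the protocol reports.
FFMPEG_CMD = [
    "ffmpeg",
    "-f", "image2pipe",   # read a concatenated stream of images on stdin
    "-framerate", "25",   # assumed camera frame rate
    "-i", "-",
    "-c:v", "libx264",    # full re-encode: this is the wasteful step
    "out.mp4",
]

def encode_jpegs(jpegs):
    """Pipe an iterable of JPEG byte strings through ffmpeg."""
    proc = subprocess.Popen(FFMPEG_CMD, stdin=subprocess.PIPE)
    for frame in jpegs:
        proc.stdin.write(frame)
    proc.stdin.close()
    return proc.wait()
```

Every frame gets decoded and re-encoded, which is the redundant work I am trying to avoid.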
Using the Image Server protocol, if you ask for footage with the … setting set to "no", and make sure to omit the … node, you should get the raw camera stream. It will then depend on the camera setup whether you get JPEG, H.264, H.265 or something else.
The data will be encapsulated in the GenericByteData format.
Please read - https://doc.developer.milestonesys.com/html/index.html?base=mipgenericbytedata/main.html&tree=tree_3.html
Many thanks. It’s not trivial is it!?
It’s a long shot, but is there somewhere that describes how to reconstruct a video stream for a particular time period (using protocol integration)?
I haven’t yet managed to work out how to tell whether I’ve been returned a “Video stream as individual coded pictures” or “Video blocks (sequences of coded pictures)”, so this could take a while…
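As a crude first pass, I'm thinking of sniffing the leading bytes of each payload. This only uses well-known magic numbers (JPEG starts with FF D8 FF; H.264/H.265 Annex-B elementary streams use 00 00 01 / 00 00 00 01 start codes); telling H.264 from H.265, or a single picture from a block, properly would need NAL-header inspection, which is omitted here:

```python
def sniff_payload(data: bytes) -> str:
    """Guess the payload type from its first bytes (magic numbers only)."""
    if data[:3] == b"\xff\xd8\xff":
        return "jpeg"
    if data[:4] == b"\x00\x00\x00\x01" or data[:3] == b"\x00\x00\x01":
        return "annexb"  # H.264 or H.265 elementary stream
    return "unknown"

def count_nal_units(data: bytes) -> int:
    """Rough count of Annex-B start codes: a video block (sequence of
    coded pictures) will contain many NAL units, a single coded picture
    only a few."""
    n, i = 0, 0
    while True:
        j = data.find(b"\x00\x00\x01", i)
        if j < 0:
            return n
        n += 1
        i = j + 3
```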
Not trivial, no…
I believe that in playback (goto-next) you will get a GOP (if the camera is set up to use a codec that uses GOPs), and in live you get a frame at a time.
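Assuming the camera delivers H.264 and the GenericByteData payload can be unwrapped to a plain Annex-B elementary stream (an assumption; see the linked format documentation), those GOPs could in principle be remuxed into MP4 with a stream copy, avoiding the re-encode entirely:

```python
import subprocess

# Remux rather than re-encode: treat stdin as a raw H.264 elementary
# stream and copy it into an MP4 container. The -framerate value is an
# assumption; a bare Annex-B stream carries no authoritative timing.
REMUX_CMD = [
    "ffmpeg",
    "-f", "h264",        # input is a bare H.264 elementary stream
    "-framerate", "25",  # assumed camera frame rate
    "-i", "-",
    "-c", "copy",        # no transcoding
    "out.mp4",
]

def remux_gops(gops):
    """Pipe an iterable of Annex-B GOP byte strings through ffmpeg."""
    proc = subprocess.Popen(REMUX_CMD, stdin=subprocess.PIPE)
    for gop in gops:
        proc.stdin.write(gop)
    proc.stdin.close()
    return proc.wait()
```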
I think the complexity of understanding the possible formats you can receive is the reason there is no sample that does this; the TCPVideoViewer sample asks for JPEGs for exactly that reason. So, apart from the documentation I have already linked to, Milestone Technical Support will not be able to help.
Thanks for the extra information - that’s useful to know. I’ll see how far I get!