var videoParams = new VideoParams
{
    CameraId = cameraId,
    DestWidth = destinationWidth,
    DestHeight = destinationHeight,
    CompressionLvl = 83,
    FPS = 15,
    RequestSize = true,
    MethodType = StreamParamsHelper.MethodType.Push,
    SignalType = StreamParamsHelper.SignalType.Live,
    StreamType = StreamParamsHelper.StreamType.FragmentedMP4,
    KeyFramesOnly = false
};
When I request a stream with StreamType = StreamParamsHelper.StreamType.FragmentedMP4, DestWidth and DestHeight are ignored and I get the full resolution from the camera.
When I request a stream with StreamType = StreamParamsHelper.StreamType.Transcoded, it works as expected: the stream resolution is set from DestWidth and DestHeight.
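For comparison, the only thing I change in the working case is the stream type; everything else stays identical (sketch of my actual code):

```csharp
var transcodedParams = new VideoParams
{
    CameraId = cameraId,
    DestWidth = destinationWidth,   // respected in Transcoded mode
    DestHeight = destinationHeight, // respected in Transcoded mode
    CompressionLvl = 83,
    FPS = 15,
    RequestSize = true,
    MethodType = StreamParamsHelper.MethodType.Push,
    SignalType = StreamParamsHelper.SignalType.Live,
    StreamType = StreamParamsHelper.StreamType.Transcoded, // only difference
    KeyFramesOnly = false
};
```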
Is this expected behavior, or am I doing something wrong for FragmentedMP4?
Also, when I call the changeStream method to change the video resolution, it works only for transcoded video; it does nothing for FragmentedMP4.
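Roughly what I am doing for the resolution change (the exact changeStream signature below is illustrative, not the real one; I pass the new dimensions the same way in both cases):

```csharp
// Works when the stream was opened with StreamType.Transcoded,
// has no visible effect when it was opened with StreamType.FragmentedMP4.
// (Hypothetical call shape - my real code uses the SDK's changeStream overload.)
changeStream(streamId, newWidth, newHeight);
```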