How does the TCPViewer render images?

I’ve been trying to replicate the TCPViewer example in Node.js (more specifically, Electron), and I’ve got quite far, to the point where I can actually show some images on the screen. My question is: how does the canvas element in the TCPViewer render images?

So far, I’ve been able to establish a TCP socket connection with the recording server and start a live request, which sends me one of two possible responses: a LivePackageMessage or a GoTo response. After that I split the data on the byte sequence 13-10-13-10 (CRLF CRLF) so I know where each response ends, and, following the TCPViewer example, whenever the data starts with a 255 byte (0xFF) followed by a 216 byte (0xD8) I can be sure that what follows is a JPEG image. At that point I try to draw the images inside an HTML5 canvas element using an HTML5 img element.
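For reference, the framing step described above can be sketched in Node.js roughly as follows. This is my own illustration, not code from the TCPViewer sample; `extractMessages` and `isJpeg` are made-up names, and splitting raw stream data on CRLF CRLF assumes that sequence never occurs inside a message body.

```javascript
// Delimiter described above: bytes 13-10-13-10 (CRLF CRLF).
const DELIM = Buffer.from([0x0d, 0x0a, 0x0d, 0x0a]);

// Split buffered socket data into complete messages. Returns the complete
// messages plus any trailing partial data to keep for the next 'data' event,
// since TCP gives no guarantee a response arrives in one chunk.
function extractMessages(pending) {
  const messages = [];
  let idx;
  while ((idx = pending.indexOf(DELIM)) !== -1) {
    messages.push(pending.subarray(0, idx));
    pending = pending.subarray(idx + DELIM.length);
  }
  return { messages, rest: pending };
}

// JPEG data begins with the SOI marker: 0xFF followed by 0xD8.
function isJpeg(buf) {
  return buf.length >= 2 && buf[0] === 0xff && buf[1] === 0xd8;
}
```

In the socket's `'data'` handler you would concatenate the new chunk onto `rest` from the previous call before calling `extractMessages` again.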

This works, kind of, but nowhere near as well as the TCPViewer example; it’s really choppy and not fluid at all.

As far as I can tell, the TCPViewer example does something quite similar, but it works a lot better; the video looks really smooth.

Can someone explain why? How does the canvas element in the TCPViewer example work?

Thanks in advance

The TCPVideoViewer sample just assigns the received JPEG image to a standard WPF Image control, so the rendering is done entirely by .NET. Unfortunately we are not experts on how to do rendering in Node.js, so we cannot provide more guidance on that part.
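Since the sample offers no Node.js guidance, here is one assumption about why the Electron version stutters: creating a new img element per network chunk draws frames as fast as they arrive, including stale ones. A common smoothing technique (my suggestion, not what TCPVideoViewer does) is to keep only the newest frame and paint once per animation tick. `FramePipeline` is a hypothetical name:

```javascript
// Keep only the newest undrawn frame and draw at most once per tick,
// so a burst of frames from the socket never queues up stale paints.
class FramePipeline {
  constructor(draw) {
    this.draw = draw;       // callback that actually paints a decoded frame
    this.latest = null;     // newest undrawn frame; older ones are dropped
    this.scheduled = false; // whether a paint tick is already pending
  }

  push(frame) {
    this.latest = frame; // overwrite: stale frames are silently discarded
    if (this.scheduled) return;
    this.scheduled = true;
    // In the Electron renderer this would be requestAnimationFrame;
    // setImmediate keeps the sketch runnable in plain Node as well.
    const tick = typeof requestAnimationFrame === 'function'
      ? requestAnimationFrame
      : setImmediate;
    tick(() => {
      this.scheduled = false;
      if (this.latest !== null) {
        this.draw(this.latest);
        this.latest = null;
      }
    });
  }
}
```

In the renderer process, `draw` could decode the JPEG bytes off the img-element path, e.g. `async (bytes) => { const bmp = await createImageBitmap(new Blob([bytes], { type: 'image/jpeg' })); ctx.drawImage(bmp, 0, 0); }`, where `ctx` is the canvas 2D context.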