Description
In our product, which uses the Zoom Video SDK, we use a separate canvas element for each peer's video stream. We recently noticed that Zoom's own example renders all peers to the same canvas.
The question is: do multiple canvases have any known ramifications? Should we rewrite our code to use a single canvas?
Jumping in on Seth’s behalf…
While trying to render multiple videos on a single canvas, we ran into a problem we can't seem to overcome: renderVideo does not appear to respect the coordinates passed to it and renders the video across the whole canvas.
See the image below. The canvas is 800x800, and we expected the video to be rendered in its top-left quarter. In reality it takes up the whole canvas, as you can see.
Could you point out what we are doing wrong?
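For illustration, here is roughly how we compute the target rectangle before calling renderVideo. The helper and all names in it are our own, not part of the SDK, and the math assumes x/y are measured from the top-left corner (the SDK may use a bottom-left origin, which could itself explain an unexpected placement):

```javascript
// Compute the pixel rectangle for one cell of a cols x rows grid on a canvas.
// Illustrative helper only - not part of the Zoom Video SDK.
function gridCell(canvasWidth, canvasHeight, cols, rows, index) {
  const cellWidth = Math.floor(canvasWidth / cols);
  const cellHeight = Math.floor(canvasHeight / rows);
  const col = index % cols;
  const row = Math.floor(index / cols);
  return {
    x: col * cellWidth,          // assumes top-left origin
    y: row * cellHeight,
    width: cellWidth,
    height: cellHeight,
  };
}

// Top-left quadrant of an 800x800 canvas in a 2x2 grid:
const rect = gridCell(800, 800, 2, 2, 0);
// rect is { x: 0, y: 0, width: 400, height: 400 }

// The rect would then be passed along these lines (userId and quality
// are placeholders for the values from our session):
// stream.renderVideo(canvas, userId, rect.width, rect.height, rect.x, rect.y, quality);
```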
(edit: we updated to 1.1.4 before trying this)
(edit 2: we found the following property, and it returns false in our code. Any ideas?)
/**
Whether the browser supports rendering multiple videos simultaneously
*/
function isSupportMultipleVideos(): boolean;
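As a rough approximation (a guess at the conditions involved, not the SDK's actual code), a check like this must come down to whether SharedArrayBuffer is available, which in browsers requires the page to be cross-origin isolated:

```javascript
// A stand-in for what a multi-video support check must verify.
// SharedArrayBuffer is only exposed when the page is cross-origin isolated,
// i.e. served with the right COOP/COEP headers.
function multipleVideosLikelySupported(globalObject) {
  const hasSAB = typeof globalObject.SharedArrayBuffer === 'function';
  const isolated = globalObject.crossOriginIsolated === true;
  return hasSAB && isolated;
}

// Simulated environments (in a real page you would pass globalThis):
const isolatedPage = { SharedArrayBuffer: function () {}, crossOriginIsolated: true };
const plainPage = { crossOriginIsolated: false };
// multipleVideosLikelySupported(isolatedPage) → true
// multipleVideosLikelySupported(plainPage)    → false
```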
We had pretty much exactly the setup you described. As I mentioned, we later noticed that this function (Stream | @zoom/videosdk)
returns false for us, and rendering does not work. With the Zoom demo React app from GitHub, rendering to a single canvas works and isSupportMultipleVideos returns true.
These two things seem related; could you clarify?
@vic.yang, do you know why stream.isSupportMultipleVideos() is true in the React sample app but false in purejs-demo and other implementations (on Chrome)?
In the React sample app, webpack-dev-server returns the cross-origin isolation headers (COOP/COEP) that SharedArrayBuffer requires, but the purejs demo doesn't. Rendering multiple videos requires SharedArrayBuffer.
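For anyone else hitting isSupportMultipleVideos() === false: serving the page with the two cross-origin isolation headers makes crossOriginIsolated true and exposes SharedArrayBuffer. A minimal webpack-dev-server fragment (the header names and values are the standard ones; the rest of the config is illustrative):

```javascript
// webpack.config.js (fragment) - headers needed for SharedArrayBuffer
module.exports = {
  // ...existing config...
  devServer: {
    headers: {
      'Cross-Origin-Opener-Policy': 'same-origin',
      'Cross-Origin-Embedder-Policy': 'require-corp',
    },
  },
};
```

Note that with Cross-Origin-Embedder-Policy: require-corp, any cross-origin subresources (scripts, images, etc.) must themselves be served with CORP headers or CORS, or the browser will refuse to load them.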