Render multiple videos on iOS (Safari, Chrome), Android (Chrome), and Desktop (Chrome, Edge, Safari)

Video SDK Type and Version

Video SDK (Web): 1.11.10

Description

I am working on rendering multiple videos on various devices, including mobile and desktop web platforms:

On mobile: myself + 2 participants

  • iOS: Safari, Chrome
  • Android: Chrome

On Desktop: myself + 4 participants

  • Windows: Chrome, Edge
  • macOS: Chrome, Safari

I aim to achieve the best performance when rendering these participant videos. The documentation suggests rendering all participant videos on a single canvas to optimize CPU usage and browser performance. However, it seems I can’t render multiple videos on a single canvas on some devices, as mentioned in these threads:

Can you help me determine the best methods to use for each platform?

  1. Which platforms support rendering multiple videos on the same canvas? (Is there a way to programmatically check this capability?)

  2. For browsers that do not support rendering multiple videos on a single canvas, what are the best practices for achieving optimal performance? (Should I use multiple canvases or multiple HTML video elements instead?)

I would greatly appreciate any assistance you can provide as soon as possible. Thank you!

Devices

  • iOS: Safari, Chrome
  • Android: Chrome
  • Windows: Chrome, Edge
  • macOS: Chrome, Safari

Hey @lmtruong1512

Thanks for your feedback.

We provide a dedicated method to check whether rendering multiple videos is supported: stream.isSupportMultipleVideos().

Our browser support page includes a table detailing this support across platforms.

Currently, support depends on whether SharedArrayBuffer is enabled on the page and whether the device is low-performance (2 or fewer logical CPU cores). If SharedArrayBuffer is not enabled, you can opt in by setting the enforceMultipleVideos option in the client.init method.
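
Here is a minimal sketch of the flow (assuming the usual @zoom/videosdk client setup; the 'Global' asset path, session name, token, and user name are placeholders, and the exact init signature may vary by SDK version):

import ZoomVideo from '@zoom/videosdk';

async function setUpVideoRendering(sessionName: string, token: string, userName: string) {
  const client = ZoomVideo.createClient();

  // enforceMultipleVideos asks the SDK to support multi-video rendering
  // even when SharedArrayBuffer is not enabled on the page.
  await client.init('en-US', 'Global', { enforceMultipleVideos: true });
  await client.join(sessionName, token, userName);

  const stream = client.getMediaStream();

  if (stream.isSupportMultipleVideos()) {
    // Render all remote participants onto one shared canvas.
  } else {
    // Fall back to rendering each participant in its own canvas or element.
  }
}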

Thanks
Vic

@vic.yang

Thanks for your reply.

1. Do you mean I can use stream.isSupportMultipleVideos() to check whether I can render multiple videos on the same canvas, as shown here: https://developers.zoom.us/docs/video-sdk/web/video/#render-multiple-participant-videos.

Additionally, are other conditions, such as whether SharedArrayBuffer is enabled (or if enforceMultipleVideos is set to true) and if the device is low-performance (CPU logical cores <= 2), handled within isSupportMultipleVideos()?

Or do I need to check these manually like this:

stream.isSupportMultipleVideos() &&
  typeof SharedArrayBuffer !== 'undefined' && // or enforceMultipleVideos: true in client.init
  navigator.hardwareConcurrency > 2

2. Also, I want to ask: when rendering multiple videos, should we use individual HTML elements or an HTML canvas? Do both methods offer the same performance?

Hey @lmtruong1512

Do you mean I can use stream.isSupportMultipleVideos() to check whether I can render multiple videos on the same canvas?

Correct. Just use this method to check.

As for the SharedArrayBuffer and logical CPU cores I mentioned, I was just describing the internal checks we perform. You don’t need to do any additional checks in your code.

When rendering multiple videos, should we use individual HTML elements or an HTML canvas? Do both methods offer the same performance?

The performance is the same. HTML elements are a newer approach we’ve introduced to simplify rendering calculations. It’s more convenient to use, and in the future we plan to add new capabilities to it, such as WebRTC-based video solutions.
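
For reference, here is a rough sketch of the two approaches (the element IDs and quality constant are illustrative, and the attachVideo-based video-player approach may require a newer SDK release than 1.11.10):

// Inside an async function, after client.join() has resolved:
const stream = client.getMediaStream();

// Canvas approach: draw the participant into a region of one shared canvas.
const canvas = document.getElementById('participants-canvas') as HTMLCanvasElement;
await stream.renderVideo(canvas, userId, 320, 180, 0, 0, 2); // width, height, x, y, quality

// HTML element approach: the SDK returns a player element you place in your own layout.
const player = await stream.attachVideo(userId, 2); // 2 = 360p quality
document.querySelector('video-player-container')?.appendChild(player as HTMLElement);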

Thanks
Vic

@vic.yang Thank you very much for your response.