I want to render multiple videos across all devices, and I’ve enforced this by setting enforceMultipleVideos=true in the client.init method. After that, I create a video-player-container and attach multiple video-player elements to it.
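For reference, my setup looks roughly like this (a simplified sketch; the `startSession` wrapper, the loop over participants, and the `VideoQuality.Video_360P` constant are just for illustration):

```js
import ZoomVideo, { VideoQuality } from '@zoom/videosdk';

async function startSession(sessionTopic, sessionToken, userName) {
  const client = ZoomVideo.createClient();

  // Ask the SDK to render multiple remote videos even on devices
  // where it would otherwise limit rendering to a single video.
  await client.init('en-US', 'Global', { enforceMultipleVideos: true });
  await client.join(sessionTopic, sessionToken, userName);

  const stream = client.getMediaStream();
  console.log('isSupportMultipleVideos:', stream.isSupportMultipleVideos());

  // One <video-player-container> in the page; append one attached
  // <video-player> element per remote user whose video is on.
  const container = document.querySelector('video-player-container');
  const selfId = client.getCurrentUserInfo().userId;
  for (const user of client.getAllUser()) {
    if (user.bVideoOn && user.userId !== selfId) {
      const playerElement = await stream.attachVideo(user.userId, VideoQuality.Video_360P);
      container.appendChild(playerElement);
    }
  }
}
```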
However, on an iPad Air 2 running iOS 15.6.1 with Safari, even after initializing the video client, stream.isSupportMultipleVideos still returns false. As a result, only one video is displayed at a time, with videos flashing between each other.
Is this a bug in the VideoSDK, or do you have any suggestions for rendering multiple videos? (I’m considering creating multiple video-player-containers, each with a single video-player.)
However, in versions after 1.12.0, we’ve made optimizations that allow rendering up to 2 videos.
I upgraded to SDK versions 1.12.0 and 1.12.1 and tried rendering two video-player instances within the same video-player-container, but the issue persisted: only one video is displayed, with the videos flashing between each other.
This is my meeting session id: “eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhcHBfa2V5IjoiN3ZBUExTZ3FuZkp2dTN5S1dVZlVHaUE5M0NmVlZsMzZVOThwIiwicm9sZV90eXBlIjowLCJ0cGMiOiI0NjU4NzMzMDI4IiwidmVyc2lvbiI6MSwidXNlcl9pZGVudGl0eSI6IjQ3MjAwNTg5NDkiLCJzZXNzaW9uX2tleSI6IjQ2NTg3MzMwMjgiLCJpYXQiOjE3MjQyMzE0NDMsImV4cCI6MTcyNDI0MjI0M30.e955CJsvxMkuNJGOf3ZZp4PbkwRpdvlR-t-L-oa7TWE”
We analyzed the log: the platform doesn’t support the OffscreenCanvas API while SharedArrayBuffer is enabled, so it hits an unsupported scenario.
A temporary solution is to disable SharedArrayBuffer, which will prevent the SDK from using OffscreenCanvas to render the video.
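If it helps, you can confirm whether a device falls into this scenario with a quick runtime check using standard web APIs (the snippet below is only an illustration, not an SDK call; SharedArrayBuffer is normally exposed only on cross-origin-isolated pages, i.e. pages served with COOP/COEP headers):

```js
// Rough runtime check for the unsupported combination described above.
const hasOffscreenCanvas = typeof OffscreenCanvas !== 'undefined';
const hasSharedArrayBuffer = typeof SharedArrayBuffer !== 'undefined';

if (hasSharedArrayBuffer && !hasOffscreenCanvas) {
  console.warn(
    'SharedArrayBuffer is enabled but OffscreenCanvas is unavailable - ' +
    'this is the unsupported multiple-video scenario.'
  );
}

// SharedArrayBuffer is typically enabled by serving the page with
// Cross-Origin-Opener-Policy / Cross-Origin-Embedder-Policy headers,
// so serving the page without them is one way to disable it.
console.log('crossOriginIsolated:', self.crossOriginIsolated);
```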
Sorry for the many current limitations. We are working on a new video solution, currently in beta and based on WebRTC Video, which will eliminate these constraints and be easier to work with. It is expected to be released around October.