Capability to stream two or more user videos + screens simultaneously

Video SDK Type and Version
Video SDK Web v1.10.5

Description
Due to the recent EOL announcement for Twilio's Programmable Video product, we are evaluating the Zoom Video SDK as part of our migration plan.

We are stuck at a point where we see that in the Zoom Video SDK we cannot share more than two streams (one video stream and one screen share) from a single user simultaneously.
In our use case, a user can have multiple screens and video input devices connected, and we should be able to identify and share all of those screens and video streams simultaneously.

This use case was possible with Twilio, as they allowed publishing multiple (two or more) local tracks (video and screen streams) simultaneously.

// Twilio code snippet - connect() takes an array of local tracks as input
Twilio.Video.connect(token, { tracks: [cameraTrack1, cameraTrack2, screenTrack] })
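
Roughly, this is how we create and publish those tracks with twilio-video.js today (device IDs come from navigator.mediaDevices.enumerateDevices(); the track names here are just placeholders):

// Sketch of our current Twilio setup: two cameras plus one screen published together
const cameraTrack1 = await Twilio.Video.createLocalVideoTrack({ deviceId: { exact: cameraId1 }, name: 'camera-1' });
const cameraTrack2 = await Twilio.Video.createLocalVideoTrack({ deviceId: { exact: cameraId2 }, name: 'camera-2' });
const displayStream = await navigator.mediaDevices.getDisplayMedia({ video: true });
const screenTrack = new Twilio.Video.LocalVideoTrack(displayStream.getVideoTracks()[0], { name: 'screen-1' });
// Remote participants can subscribe to and render each published track independently
const room = await Twilio.Video.connect(token, { tracks: [cameraTrack1, cameraTrack2, screenTrack] });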

However, with the Zoom SDK, when we tried customizing the zoom-websample project, we found that there is a single media stream object per client, and it does not allow more than one video stream plus one screen share stream.

// Zoom Video SDK code snippet
import ZoomVideo from '@zoom/videosdk';

const client = ZoomVideo.createClient();
await client.init('en-US', 'Global');
await client.join(topic, signature, userName);
const stream = client.getMediaStream();
// I cannot call startVideo or startShareScreen more than once on this single stream instance
await stream.startVideo();
await stream.startShareScreen(shareCanvasElement);

Does anyone know if we can achieve multistream in the Zoom Video SDK from a single user?

Hey @chunnukv07 ,

Looks like I may have answered your post over on Stack Overflow:

Could you provide some insight into your use case around needing to share multiple videos so we can evaluate this? What does the sender-side experience look like, and what does the remote user experience look like?

Best,
Tommy

@chunnukv07 right now, at the time of this post, you can use the Video SDK client to send multiple video streams. The Web SDK can receive multiple video streams.
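
For reference, a minimal receive-side sketch with the Web SDK, rendering each remote user's video as a tile on one canvas (the tile sizes/positions and quality constant here are placeholder choices):

// Render every remote user's video onto a shared canvas as their video starts/stops
const canvas = document.querySelector('#participants-canvas');
client.on('peer-video-state-change', async ({ action, userId }) => {
  if (action === 'Start') {
    // width, height, x, y define this user's tile; the last argument is the video quality
    await stream.renderVideo(canvas, userId, 320, 180, 0, 0, 2);
  } else if (action === 'Stop') {
    await stream.stopRenderVideo(canvas, userId);
  }
});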

Hi @tommy

We use a second video stream for video presentations while the presenter remains visible on screen. We attempted to use the Screen Share channel, but it only accepts a media device or a second camera ID. It would be logical for the Screen Share channel to also accept a Media Playback File, similar to the video channel. Additionally, participants should be able to pause and unpause the video. Is that functionality currently available?
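
To illustrate, here is roughly what we have in mind; the exact option shape for the video channel's media playback file is our assumption, and the commented-out share-channel and pause calls are hypothetical, i.e. the feature we are asking for:

// What the video channel supports today (option shape assumed): play a pre-uploaded file
await stream.startVideo({ mediaFile: { url: 'https://example.com/presentation.mp4' } });
// What we are asking for (hypothetical, not in the SDK today):
// accept a media playback file on the screen share channel as well...
// await stream.startShareScreen(shareCanvasElement, { mediaFile: { url: 'https://example.com/presentation.mp4' } });
// ...and let the presenter pause/resume playback for everyone
// await stream.pauseMediaPlayback();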

If not, could you please consider adding this feature? It’s crucial for our adoption. Thank you.

Hey @hrayr ,

Since you are looking to play a video with control over it, and show the presenter, have you considered just screensharing the video with sound and optimizing it for video playback?
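
In case it helps, a minimal sketch of that approach, assuming the video is played in a browser tab and that tab is what gets shared (in Chromium the picker's "Also share tab audio" checkbox is what carries the sound; that is a browser behavior assumption on my part, not an SDK guarantee):

// Minimal sketch: the presenter plays the file in another tab, then starts a screen share
// and picks that tab with audio sharing enabled in the browser dialog
const shareEl = document.querySelector('#screen-share-canvas');   // placeholder element from your layout
await stream.startShareScreen(shareEl);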

Best,
Tommy

Hi @tommy,

Certainly, we do use screen share for video as well. However, our platform serves as a general-purpose virtual events and webinar platform. The specific use case we aim to implement is for a more managed session within a virtual event. Here, video and slide presentations are uploaded beforehand by a producer. Speakers won’t need to concern themselves with sharing their screen; they simply start/pause/stop their video presentations.

Regarding implementation, I don’t believe a full-fledged video player is necessary for the play/pause feature. As long as you allow us to assign a MediaStream for the video and screenshare channels, it should suffice.

Best,
Hrayr

Hey @hrayr ,

Makes sense.

I also wanted to ask: why not deliver the video in its own player, separate from the Video SDK? I am wondering what the use case is for needing to play the video through the Video SDK itself.

Best,
Tommy

Hi @tommy ,

We’re considering this as our workaround if we can’t find a solution through Zoom video channels.

Here are the issues with this approach:

  1. Synchronizing the player state for all participants may pose challenges. We were considering utilizing Zoom's command channel if we opt for this route (see the sketch after this list).
  2. Users pausing or seeking ahead might be problematic. While we can disable player controls, this solution may not be entirely reliable; we aim for this to function akin to a live stream.
  3. Echo and feedback loops may occur when users play video with audio on their clients while potentially being unmuted. This situation could lead to echoes similar to everyone speaking at once in a meeting. To mitigate this, if the video were to pass through the local participant’s audio/video channel, Zoom could properly process the audio for noise and echo cancellation. This may also pose an issue for Zoom in determining the active speaker if all participants are playing the same audio slightly out of sync through their speakers into their microphones.
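
To make item 1 concrete, a rough sketch of syncing an out-of-band player over the command channel (the message shape and player element are placeholders, not an established protocol):

// The presenter broadcasts play/pause commands plus a timestamp over Zoom's command channel;
// every participant applies them to a local <video> element (message shape is a placeholder)
const commandChannel = client.getCommandClient();
const playerEl = document.querySelector('#presentation-player');

// Presenter side
function broadcastPlayerState(action) {
  commandChannel.send(JSON.stringify({ action, time: playerEl.currentTime }));
}

// Participant side
client.on('command-channel-message', ({ text }) => {
  const { action, time } = JSON.parse(text);
  playerEl.currentTime = time;
  if (action === 'play') playerEl.play();
  if (action === 'pause') playerEl.pause();
});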

Another workaround we’ve considered (though it’s not ideal) is to swap the video and screen share channels when a user wants to present a video. This means the user’s camera feed would go through the screen share channel via secondaryCameraId, while the video presentation would utilize the video channel’s MediaPlaybackFile.
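
Roughly, that swap would look like this (secondaryCameraId on the share channel is what the SDK already accepts; the mediaFile option shape on the video channel is our assumption):

// Channel-swap workaround while a video is being presented:
// the presenter's camera goes out over the screen share channel as a second camera,
// and the pre-uploaded video plays through the video channel
await stream.startShareScreen(shareCanvasElement, { secondaryCameraId: presenterCameraId });
await stream.startVideo({ mediaFile: { url: 'https://example.com/presentation.mp4' } });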

Is there any reason why the Zoom Video SDK doesn't directly support a native MediaStream for both the video and Screen Share channels? This would allow developers to do their own video processing if needed, in addition to passing any compatible video/canvas/camera/screen stream to whichever channel we choose.
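
To show what we would feed in: producing the MediaStream itself is easy with standard browser APIs; only the final call below is hypothetical, it is the kind of hook we are asking the SDK to expose:

// Build a MediaStream from a playing <video> element (standard HTMLMediaElement.captureStream())
const playerEl = document.createElement('video');
playerEl.src = 'https://example.com/presentation.mp4';
await playerEl.play();
const presentationStream = playerEl.captureStream();
// Hypothetical: hand any compatible stream to whichever channel we choose
// await stream.startVideo({ mediaStream: presentationStream });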

Best,
Hrayr

Hey @hrayr ,

Thanks for the additional context.

I was also thinking this and was about to suggest it. Even so, you wouldn't be able to pause/play the video.

We are planning to expose the MediaStream in an upcoming release.

Would that meet your use case, or would you like us to evaluate the play/pause video file approach?

Best,
Tommy
