Having trouble with event handler for "peer-video-state-change"

I’ve followed the directions in the SDK to set up the canvas render code as such:

this.client.on('peer-video-state-change', async (payload) => {
    const { action, userId } = payload;
    if (action === 'Start') {
        await this.stream.renderVideo(canvas, userId, width, height, x, y, quality);
    } else if (action === 'Stop') {
        await this.stream.stopRenderVideo(canvas, userId);
    }
});
I’ve found that, after successful .init(), .join(), and .getMediaStream() calls, the code inside the event handler simply never runs. (I verified this with unique console.log() statements throughout the code; execution never reaches the interior of the .on() callback.) I’m testing with a client app with only one connected user.

How can I trigger this in normal use? Are there properties of the init and join responses I should inspect to see whether something is restricting the client from receiving a “peer-video-state-change” event? Or does this event simply never fire unless more than one user joins the connected session?

Web Video SDK version 1.0.3
Running in Angular 11.0.2, Testing in latest Google Chrome


Hey @brianWD

Thanks for sharing your questions about Video SDK.

The peer-video-state-change event is triggered when other users start or stop their video. Please attach your listeners with client.on as soon as you get the client instance, to ensure you don’t miss any events.


Ah, so this is the right event to use in a production scenario. But is there a client event that works in early development when only one user is connected to the meeting? E.g., how do I show the video feed from my own computer with only one user connected?

Because this event is being watched and never fires, I can confirm audio and camera activation but can’t render a video stream. I could set up a second computer, camera, and microphone with our local dev setup (API and front-end) just to sit connected as a second user, but I’d prefer to avoid that in the early dev stages.

Hey @brianWD
There are two options:

  1. Wait for stream.startVideo() to resolve, then call stream.renderVideo() to render your own video. Note that this sends your stream to the Zoom server.

  2. Use the MediaDevices.getUserMedia() method provided by the browser; it renders video from the local video track without involving Zoom. See MediaDevices.getUserMedia() - Web APIs | MDN.
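The two options above can be sketched roughly as follows. This is a non-authoritative sketch: it assumes `client` is an initialized Video SDK client that has already joined a session, `canvas` is a <canvas> element and `videoEl` a <video> element in the page, and the width/height/x/y/quality values are illustrative numbers, not SDK requirements.

```javascript
// Option 1 (sketch): render your own feed through the SDK.
// Note: startVideo() sends your video to the Zoom server.
async function renderOwnVideo(client, canvas) {
  const stream = client.getMediaStream();
  await stream.startVideo();
  const myUserId = client.getCurrentUserInfo().userId;
  // width, height, x, y, quality — tune these for your layout
  await stream.renderVideo(canvas, myUserId, 640, 360, 0, 0, 2);
  return stream;
}

// Option 2 (sketch): local-only preview via the browser; nothing reaches Zoom.
async function showLocalPreview(videoEl) {
  const mediaStream = await navigator.mediaDevices.getUserMedia({ video: true });
  videoEl.srcObject = mediaStream;
  await videoEl.play();
  return mediaStream; // keep a reference so you can stop its tracks later
}
```

In an Angular component like the one described above, either function would typically be called from ngAfterViewInit, once the canvas or video element actually exists in the DOM.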

