Unknown errors when integrating the Zoom Video SDK into a Next.js project

The integration was done following the Zoom Video SDK documentation.
I'm still running into some issues that the documentation doesn't resolve; I'll list them below with details:

1 - Video from participants' cameras does not appear, even though camera initialization and attachment to a container succeed.
Below is the snippet from the documentation that I used:

stream.startVideo().then(() => {
  stream.attachVideo(client.getCurrentUserInfo().userId, RESOLUTION).then((userVideo) => {
    document.querySelector('video-player-container').appendChild(userVideo)
  })
})
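
For reference, this is roughly how I read the docs for hosting that container in a Next.js component (a simplified sketch, not my exact markup; `video-player-container` is the SDK's own web component, and `attachVideo()` resolves with a `video-player` element that gets appended inside it):

// Simplified sketch (assumption: this is how I understand the docs, not the exact code from my project).
export default function SelfView() {
  // The SDK's <video-player-container> custom element must exist in the DOM
  // before attachVideo() is called and its result is appended.
  return <video-player-container></video-player-container>;
}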

In my code, I created a function to display the other participants' cameras, creating a container for each participant who joins the session:
const startParticipantVideos = async (bVideoOn, userId) => {
    if (!isJoined || !stream) return;

    console.log(`DEV TEST. CONTAINER FOR USER ${userId}`, `#participantBox-${userId}`);

    if (bVideoOn && userId !== currentZoomUser?.userId) {
        let attempts = 0;
        const maxAttempts = 5;
        const interval = 1000;

        while (attempts < maxAttempts) {
            try {
                const videoContainer = document.querySelector(`#participantBox-${userId}`);

                if (!videoContainer) {
                    console.error(`Video container not found for participant: ${userId}. Attempt ${attempts + 1}`);
                    await new Promise(resolve => setTimeout(resolve, interval));
                    attempts++;
                    continue;
                }

                // Check whether a `video-player` already exists in the container
                const existingVideoPlayer = videoContainer.querySelector('video-player');

                if (existingVideoPlayer) {
                    console.warn(`Participant ${userId}'s video is already attached.`);
                    break;
                }

                // No `video-player` yet, so attach the video (3 = VideoQuality.Video_720P)
                const userVideo = await stream.attachVideo(userId, 3);

                videoContainer.appendChild(userVideo);
                console.log(`Participant ${userId}'s video attached to the container.`);
                break;
            } catch (error) {
                if (error.reason === "Camera is starting, please wait.") {
                    attempts++;
                    console.warn(`Attempt ${attempts} to attach participant ${userId}'s video failed: camera is still starting.`);
                    await new Promise(resolve => setTimeout(resolve, interval));
                } else {
                    console.error(`Error attaching participant ${userId}'s video:`, error);
                    break;
                }
            }
        }

        if (attempts === maxAttempts) {
            console.error(`Failed to attach participant ${userId}'s video after ${maxAttempts} attempts.`);
        }
    } else {
        console.log(`Participant ${userId} does not have video on.`);
    }
};

I don't get any errors in the console, but the video still doesn't appear; there's just a black box in its place.
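
For context, the per-participant containers come from my React markup, roughly like this (a simplified sketch; `participants` and the component shape are illustrative, not my exact code). The containers are created on a React re-render, which is why the function above retries while waiting for them to show up in the DOM.

// Simplified sketch (assumption: `participants` and this component are illustrative names).
function ParticipantGrid({ participants }) {
  return (
    <div className="participant-grid">
      {participants.map((user) => (
        // One container per participant; startParticipantVideos() looks these up by id.
        <div key={user.userId} id={`participantBox-${user.userId}`} />
      ))}
    </div>
  );
}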

2 - The host is unable to turn off other users' cameras. Below is the snippet from the documentation that I used:
stream.stopVideo().then(() => {
  stream.detachVideo(USER_ID)
})

In my code I created the following function, which receives the stream and the ID of the user whose camera I want to stop:
const stopVideo = async (mediaStream, userId) => {
    setHiddenCamera(true);
    setVideoStarted(false);
    await mediaStream.stopVideo();
    await mediaStream.detachVideo(userId);
}
I always get an error back in the console ("camera is closed").
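
For context, the host UI calls this helper roughly like so (a simplified sketch; the button and `participant` come from my own state and are illustrative):

// Simplified sketch (assumption: the surrounding UI names are illustrative, not my exact code).
function StopCameraButton({ stream, participant }) {
  // Calls the stopVideo() helper above for the selected participant.
  return (
    <button onClick={() => stopVideo(stream, participant.userId)}>
      Turn off camera
    </button>
  );
}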

3 - Screen sharing is unstable: sometimes the screen is displayed, sometimes an error is returned in the console and nothing is displayed.
The error shown in the console gives no reason:
{
  "type": "INVALID_OPERATION",
  "reason": ""
}
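
For reference, I start the share roughly the way the docs describe (a simplified sketch; `shareVideoRef` and `shareCanvasRef` are illustrative refs from my component, not exact code):

// Simplified sketch (assumption: the ref names are illustrative).
// isStartShareScreenWithVideoElement() tells you whether to pass a <video> or a <canvas> element.
const startShare = async () => {
  if (stream.isStartShareScreenWithVideoElement()) {
    await stream.startShareScreen(shareVideoRef.current);
  } else {
    await stream.startShareScreen(shareCanvasRef.current);
  }
};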

Below is more information about the versions and system I am using:
Video SDK version: 1.12.5
React version: 18
Next JS version: 14.2.2
OS: Windows 11
Browser: Google Chrome
Browser version: 129.0.6668.70 (Official Build) (64-bit)

Hi @dev_lxpead

If you’re rendering a remote video, you don’t need to call startVideo:

client.on("peer-video-state-change", (event) => {
  if (event.action === 'Start') {
    const mediaStream = client.getMediaStream();
    const userVideo = await mediaStream.attachVideo(event.userId, VideoQuality.Video_360P);
    videoContainer.appendChild(userVideo);
...

Do you have SAB (SharedArrayBuffer) enabled? I see you're trying to render multiple 720p streams, but you can only render one 720p video on a webpage at a time. For example, in a speaker view, render the speaker's video at 720p and the self view and other users' videos at 360p or less. I think this could be causing the other issues you're seeing with video not rendering correctly.
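
Something along these lines would work (a simplified sketch; the `isSpeaker` flag and the container ids are placeholders, not from your code):

// Simplified sketch (assumption: container ids and the isSpeaker flag are placeholders).
import { VideoQuality } from '@zoom/videosdk';

const renderUser = async (userId, isSpeaker) => {
  const mediaStream = client.getMediaStream();
  // Only the active speaker gets 720p; everyone else is capped at 360p or lower.
  const quality = isSpeaker ? VideoQuality.Video_720P : VideoQuality.Video_360P;
  const userVideo = await mediaStream.attachVideo(userId, quality);
  document.querySelector(`#participantBox-${userId}`).appendChild(userVideo);
};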


It's a bit difficult to offer help without seeing how the code is used and what errors you're getting in the console. I can see a few issues from the snippets you shared, but I'd recommend looking through this reference app built with Next.js to better understand any framework-specific integration: GitHub - zoom/VideoSDK-Web-Telehealth: This starter-kit uses the Zoom Video SDK to build a telehealth app on Web
Here’s a simple hello-world type application that would also be helpful: GitHub - zoom/videosdk-web-helloworld


Thanks for the reply! SAB was not configured, and I'm already setting it up so I can test it. I've also changed the code to render the participants' cameras at 360p.

I also cloned the sample application and am analyzing it.

I will close this topic, since the Zoom team instructed me to create a separate one for each error.