Using attachVideo on version 2.1.10

I am updating the Zoom Video SDK for Web to 2.1.10, but without success.

I always get the attached error.

The code:

try {
  await mediaStream.attachVideo(
    remoteParticipant?.userId,
    VideoQuality.Video_720P,
    videoPatientTeste.current,
  );
} catch (e) {
  console.error('attachVideo failed', e);
}

The JSX:

<>
  <video-player-container>
    <video-player class="video-player" ref={videoPatientTeste} />
  </video-player-container>
</>

In App.tsx:

import { DetailedHTMLProps, DOMAttributes, HTMLAttributes } from 'react';
import { VideoPlayer, VideoPlayerContainer } from '@zoom/videosdk';

type CustomElement<T> = Partial<T & DOMAttributes<T> & { children: any }>;

declare global {
  interface Window {
    webEndpoint: string | undefined;
    zmClient: any | undefined;
    mediaStream: any | undefined;
    crossOriginIsolated: boolean;
    ltClient: any | undefined;
    logClient: any | undefined;
  }
  // eslint-disable-next-line @typescript-eslint/no-namespace
  namespace JSX {
    interface IntrinsicElements {
      ['video-player']: DetailedHTMLProps<
        HTMLAttributes<VideoPlayer>,
        VideoPlayer
      > & { class?: string };
      ['video-player-container']: CustomElement<VideoPlayerContainer> & {
        class?: string;
      };
    }
  }
}

Hey @josemaurodl

Thanks for your feedback.

Could you share the error you encountered?

Thanks
Vic

Can I add a screenshot?

Hi @josemaurodl

Could you share some problematic session IDs with us for troubleshooting purposes?

You can retrieve them using the client.getSessionInfo().sessionId property.

Thanks
Vic

const teste = await mediaStream.attachVideo(
  remoteParticipant?.userId,
  VideoQuality.Video_720P,
  videoPatientTeste.current,
);

console.log(
  'zmClient.getSessionInfo().sessionId',
  zmClient.getSessionInfo().sessionId,
);
console.log('remoteParticipant?.userId', remoteParticipant?.userId);
console.log('videoPatientTeste.current', videoPatientTeste.current);
console.log('teste', teste);

zmClient.getSessionInfo().sessionId WGncwgZTQxSMwzeNpUL0yA==
remoteParticipant?.userId 16779264


Hi @josemaurodl

WGncwgZTQxSMwzeNpUL0yA==

After analyzing the logs, we found that when using stream.attachVideo to render a remote video, if the remote user hasn’t started their video yet, you’ll encounter the 'user is not send video' error. You can check the bVideoOn property of the user object to determine if the user is sending video.

As for the error Expected to accept HTMLCanvasElement or HTMLVideoElement, but actual it is null, it’s likely due to calling the stream.renderVideo method without passing the correct parameter; the code snippet isn’t complete, so this is our best guess.

For video rendering, attachVideo is sufficient and preferred; its functionality overlaps with renderVideo, so there’s no need to use both.
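
For example, a minimal sketch of both checks, reusing the names from your snippets (a guide, not exact production code):

// Only attach when the remote user is actually sending video and the
// target <video-player> element has been mounted.
const user = zmClient.getUser(remoteParticipant?.userId);
const element = videoPatientTeste.current;
if (user?.bVideoOn && element) {
  await mediaStream.attachVideo(user.userId, VideoQuality.Video_720P, element);
}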

Thanks
Vic

const remoteCameraOn = useCallback(async () => {
    console.log('videoPatientTeste', videoPatientTeste);
    if (mediaStream && videoPatientTeste.current && isStarted) {
      if (remoteParticipant?.userId !== participantUserID) {
        await mediaStream.detachVideo(participantUserID);
        sendLogs({
          action: 'stop-video',
          info: {
            remoteID: remoteParticipant?.userId,
          },
        });
      }
      const adjustVideo = async () => {
        await mediaStream.adjustRenderedVideoPosition(
          canvasRef.current as HTMLCanvasElement,
          remoteParticipant?.userId || participantUserID,
          canvasDimension.width,
          canvasDimension.height,
          0,
          0,
        );
        sendLogs({
          action: 'adjust-video',
          info: {
            canvasDimension,
          },
        });
      };
      if (remoteParticipant?.bVideoOn) {
        try {
          const teste = await mediaStream.attachVideo(
            remoteParticipant?.userId,
            VideoQuality.Video_360P,
            videoPatientTeste.current,
          );

          console.log('videoPatientTeste', videoPatientTeste);
          // console.log(
          //   'zmClient.getSessionInfo().sessionId',
          //   zmClient.getSessionInfo(),
          // );
          console.log('remoteParticipant?.userId', remoteParticipant?.userId);
          console.log('videoPatientTeste.current', videoPatientTeste.current);
          console.log('teste', teste);
          console.log('video-player', document.querySelector('video-player'));
          console.log(
            'video-player-container',
            document.querySelector('video-player-container'),
          );

          await adjustVideo();
        } catch (e) {
          // best-effort reposition even when attach fails
          await adjustVideo();
          sendLogs({
            action: 'error-render-video',
            info: {
              error: e,
            },
          });
        }
      }
      setRemoteCameraDisabled(false);
    }
  }, [
    mediaStream,
    isStarted,
    participantUserID,
    sendLogs,
    canvasDimension,
    remoteParticipant?.userId,
    remoteParticipant?.bVideoOn,
    videoPatientTeste,
  ]);

I still can’t see the video.

I’d like to add a few things.

1 - My application is a telemedicine solution, and the error I am getting is on the doctor side; the patient side (a React Native app) is still using version 1.12.14.

2 - I am using styled-components.

3 - The element is created in the DOM, but the video is not showing.

export const VideoPlayerContainer = styled('video-player-container')`
  flex: 1;
  width: 100%;
  height: 100%;
  border: 3px solid black;
`;

export const VideoPlayer = styled('video-player')`
  flex: 1;
  width: 100%;
  height: 100%;
  aspect-ratio: 16/9;
  display: block;
  position: absolute;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
`;
<div
  style={{
    borderColor: 'red',
    borderWidth: 4,
    borderStyle: 'solid',
    width: '100%',
    height: '100%',
  }}
>
  <p>TESTE</p>

  <S.VideoPlayerContainer>
    <S.VideoPlayer class="video-player" ref={videoPatientTeste} />
  </S.VideoPlayerContainer>
</div>


Hi @josemaurodl

Thanks for sharing the detailed code snippets with us.

In the code, both stream.attachVideo and stream.adjustRenderedVideoPosition are used. If you’re using video-player, the video position is adjusted automatically, so additional API calls aren’t necessary.
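
For reference, a minimal attach-only sketch with the names from your snippet (assuming videoPatientTeste points at the video-player element):

if (remoteParticipant?.bVideoOn) {
  // video-player handles sizing and positioning automatically, so no
  // adjustRenderedVideoPosition call is needed afterwards.
  await mediaStream.attachVideo(
    remoteParticipant.userId,
    VideoQuality.Video_360P,
    videoPatientTeste.current,
  );
}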

Regarding the styling issue with video-player, could you create a test app and share it with us for further investigation?

Thanks
Vic

So, you can use the videosdk-web-sample from the Zoom GitHub. Just add styled-components and use it like this:

import styled from 'styled-components';

export const VideoTeste = styled('video-player')`
  display: block;
  position: absolute;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  background-color: red;
  border-radius: 10px;
`;

export const DivTeste = styled.div`
  display: block;
  position: absolute;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  border-color: red;
  border-radius: 10px;
  background-color: green;
`;

<>
  <VideoTeste
    class="video-player"
    ref={(element) => {
      setVideoPlayerRef(user.userId, element);
    }}
  />
  <DivTeste>
    <p>TESTE</p>
  </DivTeste>
</>

Hi @josemaurodl

Thanks for sharing the code snippets with us.

export const VideoTeste = styled('video-player')`
  display: block;
  position: absolute;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  background-color: red; // remove this property
  border-radius: 10px;
`;

You cannot set the background-color, because the video is rendered on a canvas inside the shadow DOM of the video-player-container. If a background color is applied, it will cover the video.

Similarly, you also cannot overlay a div element on top of the video-player, as it will cover the video rendered within the shadow DOM.
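
For example, a version of the styled component that keeps the box transparent (a sketch; adjust to your layout):

export const VideoTeste = styled('video-player')`
  display: block;
  position: absolute;
  top: 0;
  left: 0;
  right: 0;
  bottom: 0;
  /* no background-color: the video canvas lives in the shadow DOM and an
     opaque background here would cover it */
  border-radius: 10px;
`;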

Thanks
Vic

Man. It’s an example. It’s not real code.
