INVALID_OPERATION: no audio joined while muting user

Description
I have microphone permissions, and I await startAudio after starting the meeting, which unmutes the user. After that, if I try to mute the user, about 1 in 10 times it throws an error: INVALID_OPERATION: no audio joined. How can I confirm that audio is joined before muting the user?

Browser Console Error
{type: 'INVALID_OPERATION', reason: 'no audio joined'}

Which Web Video SDK version?
Video SDK 1.5.5

Video SDK Code Snippets

const muteLocalTile = useCallback(async () => {
    try {
      await zmStream.current.muteAudio();
    } catch (e) {
      console.log(e);
    }
  }, []);

  const startAudio = useCallback(async () => {
    if (isSafari) {
      if (audioDecode.current && audioEncode.current) {
        await zmStream.current.startAudio();
        isAudioStarted.current = true;
      } else {
        logger.info('desktop safari audio init has not finished');
      }
    } else {
      await zmStream.current.startAudio();
      isAudioStarted.current = true;
    }
  }, [isSafari, logger]);

  const startMeeting = useCallback(
    async ({ meetingInfo }) => {
      try {
        await zmClient.current.join(meetingInfo.sessionName, meetingInfo.token, name, meetingInfo.passcode);
      } catch (error) {
        console.log(error);
      }
      zmStream.current = zmClient.current.getMediaStream();
      if (!isSafari) {
        try {
          await startAudio();
        } catch (e) {
          console.error(e);
        }
      }
      await validateConfigOnMeetingInit();
      try {
        zmClient.current.getAllUser().forEach(user => {
          if (
            user.bVideoOn &&
            user.userId !== zmClient.current.getCurrentUserInfo().userId &&
            remoteCanvasRef.current
          ) {
            zmStream.current?.renderVideo(remoteCanvasRef.current, user.userId, 960, 540, 0, 0, 2);
          }
        });
      } catch (e) {
        console.error(e);
      }
    },
    [ isSafari, name, startAudio, validateConfigOnMeetingInit]
  );

  const validateConfigOnMeetingInit = useCallback(async () => {    
    if (isAudioStarted.current && !micEnabled) {
      try {
        await muteLocalTile();
      } catch (e) {
        updateConfiguration({ type: ACTION_TYPES.UNMUTE });
      }
    }
  }, [ updateConfiguration, muteLocalTile, micEnabled]);

To Reproduce (if applicable)
The issue is intermittent; it happens when starting a meeting with the mute configuration.

Screenshots
NA

Troubleshooting Routes
NA

Device (please complete the following information):

  • Device: MacBook Pro
  • OS: 12.2.1 (21D62)
  • Browser: Safari
  • Browser Version: 15.3 (17612.4.9.1.8)

Additional context
sessionId: "bPNiByxHQFeUNViiCAAVJw=="
participantId: 83887104

@sprinklrqa4 ,

It seems possible that in the 1-in-10 failure cases, startAudio() has not actually completed yet.

An alternative implementation would be to use the current-audio-change event (event_current_audio_change) to listen for users successfully joining audio before proceeding to mute them.
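
For reference, here is a minimal sketch of that pattern. It assumes the current-audio-change payload carries an action field whose value is 'join' once the local user's audio has connected; check the payload shape against your SDK version:

// Sketch only: mute once the local user's audio has actually joined.
// Assumes `client` and `stream` are the already-initialized Video SDK client and media stream.
client.on('current-audio-change', async (payload) => {
  if (payload.action === 'join') {
    try {
      // Audio is connected now, so muting should no longer throw INVALID_OPERATION.
      await stream.muteAudio();
    } catch (err) {
      console.warn('muteAudio failed', err);
    }
  }
});

await stream.startAudio();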

Hi
I have the same problem as the author of this topic.

According to the documentation, startAudio is started this way (i.e. I need to wait for these calls to complete):

await client.init();
await client.join(topic, signature, username, password);
const stream = client.getMediaStream();
await stream.startAudio();

There is another version of the implementation for Safari. I used to have it implemented this way, but problems began to arise with this method too:

// Safari browser is different:

let audioDecode, audioEncode;
// wait until the encoding and decoding process is ready for the audio
client.on("media-sdk-change", (payload) => {
   const { action, type, result } = payload;
   if (type === "audio" && result === "success") {
     if (action === "encode") {
       audioEncode = true;
     } else if (action === "decode") {
       audioDecode = true;
     }
     if (audioDecode && audioEncode) {
       try {
         // start audio automatically in Safari
         stream.startAudio({ autoStartAudioInSafari: true });
       } catch (err) {
         console.warn(err);
       }
     }
   }
 });

 // Start audio in 'click' callback in Safari
 joinAudioButton.addEventListener("click", () => {
   if (audioDecode && audioEncode) {
     try {
       stream.startAudio();
     } catch (err) {
        console.warn(err);
     }
   }
 });

But the problem with this method is that after disconnecting from the session and reconnecting, the necessary media-sdk-change events do not all arrive; only some of them come.

Each time the media streams are initialized, two events (encode and decode) should fire in the session for each stream type: audio, video, and screen sharing. In total, six events should arrive, but when reconnecting to a session without reloading the page only four come, and none of them relate to audio.
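
A quick way to verify this is to tally the media-sdk-change events as they arrive. This is just a debugging sketch that reuses the payload shape from the snippet above:

// Debug sketch: count which media-sdk-change events actually arrive after a (re)join.
// Six entries (audio/video/share x encode/decode) are expected when everything initializes.
const seenMediaEvents = {};

client.on('media-sdk-change', ({ action, type, result }) => {
  seenMediaEvents[`${type}-${action}`] = result;
  console.log('media-sdk-change', type, action, result,
    'received so far:', Object.keys(seenMediaEvents).length);
});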

I tried both of these methods but could not find an optimal solution to this problem.
In both cases, I wait for startAudio to execute and, upon successful completion, I call muteAudio:

await stream.startAudio();
await stream.muteAudio();
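
Until the SDK timing is improved, one possible stopgap is to retry muteAudio when it rejects with INVALID_OPERATION. This is only a sketch; the retry count and delay are arbitrary, not SDK recommendations:

// Workaround sketch: retry muteAudio a few times while audio is still joining.
const muteWithRetry = async (stream, attempts = 5, delayMs = 500) => {
  for (let i = 0; i < attempts; i += 1) {
    try {
      await stream.muteAudio();
      return;
    } catch (err) {
      // Only retry the "no audio joined" case; rethrow anything else.
      if (err?.type !== 'INVALID_OPERATION') throw err;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw new Error('muteAudio still failing after retries');
};

await stream.startAudio();
await muteWithRetry(stream);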

A participant on MacBooks hits this problem regularly, roughly 1 time in 5: they just join the conference and their microphone cannot be automatically muted.

Web Video SDK 1.6.0

@chunsiong.zoom, hi!
My logs show that this event fires even before startAudio occurs. Are you sure this is how it should work?

In addition, I have tested the method you proposed, but the problem, unfortunately, remains =(

Hey @efron.vit @sprinklrqa4

Thanks for your feedback.

We will improve the timing of startAudio and muteAudio in the next version.

Thanks
Vic


Hi @vic.yang

Does this item in the changelog relate to this issue?

CHANGELOG v1.6.2

Enhanced

  • Timing of the Promise resolve call returned by the stream.startAudio method

Hey @efron.vit

Yes. We improved the timing of stream.startAudio; that timing was the root cause of this issue.

Thanks
Vic


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.