Module:audio/video not enabled

My integration works perfectly on desktop (Mac/PC) and Android. But on iOS devices, I randomly get a "Module:audio not enabled" or "Module:video not enabled" error when I try to use stream.startAudio() or stream.startVideo().

When I use stream.getCameraList() I get an empty array.

When I call client.getMediaStream(); I can see in my log:
TypeError: a.addStream is not a function. (In 'a.addStream(h)', 'a.addStream' is undefined) — js_media.min.js:1:18795

I do have permissions enabled, and I tried from different browsers and devices.

I call those functions on the main thread when the user clicks the "join" button. I can't understand why it would throw those errors. Sometimes the camera permission prompt does appear for video, but audio seems to fail every time. Again - only on iOS.
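In the meantime I wrap each start call in a guard so a failure in one module doesn't abort the other and I can log exactly which one failed. A minimal sketch - the wrapper is my own helper, not part of the Instant SDK:

```javascript
// Hedged sketch: wrap an async media start call so a failure in one
// module (audio/video) is captured and reported instead of escaping.
// `tryStartMedia` is an illustrative helper, not an SDK API.
async function tryStartMedia(label, startFn) {
  try {
    await startFn();
    return { label, ok: true };
  } catch (err) {
    // e.g. "Module:audio not enabled" is caught and surfaced here
    return { label, ok: false, error: String(err) };
  }
}

// Usage, assuming `stream` came from client.getMediaStream():
// const results = await Promise.all([
//   tryStartMedia('audio', () => stream.startAudio()),
//   tryStartMedia('video', () => stream.startVideo()),
// ]);
```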

Hey @treedis ,

Can you please provide steps to reproduce the issue so we can test on our end? Also, what iOS device are you using, and what iOS version?


It happens on any iOS device. But I’m using iPhone SE (2020) with iOS 14.

I have a button that, when pressed, calls the following function:

join = async () => {
      const signature = generateInstantToken(topic);
      await this.client.join(topic, signature, name);
      this.stream = this.client.getMediaStream();
};

And then I get the error "Module:audio is not enabled". The same happens with video, and only on iOS. It happens both on localhost and in production.

Sorry for the misleading messages on the console. Because of the poor Audio Worklet support on iOS devices, the audio module is disabled when the Instant SDK is initialized.
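If you want to check this yourself, you can feature-detect Audio Worklet availability before deciding whether to offer audio in your UI. A minimal sketch - the function name is illustrative, not an SDK API; pass in the global object (window in a browser):

```javascript
// Hedged sketch: detect Audio Worklet availability before offering
// audio. On iOS versions without Audio Worklet, AudioWorkletNode is
// undefined. `supportsAudioWorklet` is an illustrative helper.
function supportsAudioWorklet(globalLike) {
  return typeof globalLike.AudioContext === 'function' &&
         typeof globalLike.AudioWorkletNode === 'function';
}

// In a browser: supportsAudioWorklet(window)
```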

So the Instant SDK does not support the audio module on iOS at all? Or is there a way to enable it?
I’m afraid that without support on iOS the whole SDK will be useless for us.

We cannot enable the audio module until iOS supports Audio Worklet, because audio quality is poor without it.

What about the video module? Is it supported on iOS?
This is very disappointing. Not being able to use audio on iOS makes this whole SDK unusable for us… you should consider enabling it even with poor quality.

Yes, iOS supports sending and rendering video, but for rendering it only supports a single video at a time.
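Given that limit, your app has to decide which one participant's video to render. A small sketch of that selection logic - the helper name and the shape of the user objects ({ userId, bVideoOn }) are assumptions, not the SDK's actual participant type:

```javascript
// Hedged sketch: choose the single user to render when only one remote
// video can be shown (e.g. on iOS). Prefers the active speaker if their
// video is on, otherwise falls back to the first user with video on.
// The `users` shape ({ userId, bVideoOn }) is an assumption.
function pickSingleVideoUser(users, activeSpeakerId) {
  const videoOn = users.filter((u) => u.bVideoOn);
  const speaker = videoOn.find((u) => u.userId === activeSpeakerId);
  return speaker || videoOn[0] || null;
}
```

The returned user would then be the one passed to whatever render call your integration uses.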

And is this single-video rendering also a limitation on Apple's side, or is it one you plan to lift soon?

It's a technical limitation of iOS: rendering multiple videos can cause high CPU usage.

That seems odd. With plain P2P WebRTC we managed a 4-way video chat (four videos rendering, with audio too) without any problems on iOS devices.

I understand that the Zoom SDK is bound by iOS technical limitations, but allowing a few people to use video/audio on iOS is still better than allowing none at all (from our business side). You should consider adding this possibility and simply warn in your documentation that more than X participants might cause issues. Give users the freedom to choose. As it stands, we just can't use this SDK. Shame.

Hey @treedis ,

Thank you for your feedback and suggestion of having this called out in the docs.

We will update our docs to show this, and once there is no longer a technical limitation of iOS, we will support this. :slight_smile:

I will also talk to the team about allowing the developer to have more flexibility over this. (CS-3257)


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.