I’m reaching out because we’re facing a problem in our iOS app that uses the Zoom Meeting SDK. The app also has features that access the microphone through SFSpeechAudioBufferRecognitionRequest. On iOS 17 we found an issue: when a user joins a call and mutes their audio, SFSpeechAudioBufferRecognitionRequest stops receiving audio. The same thing happens if they leave the call while still muted.
Can you help us understand how to use the microphone correctly when a user is muted during a call?
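For context, here is a minimal sketch of the kind of recognition setup we use. The class and method names are illustrative, not our production code; the point is that the recognition request is fed from a tap on the audio engine's input node, which stops receiving buffers in the scenario described above.

```swift
import AVFoundation
import Speech

// Minimal sketch of our voice-recognition setup (names are illustrative).
final class VoiceCommandService {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        request = SFSpeechAudioBufferRecognitionRequest()
        guard let request, let recognizer else { return }

        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            // On iOS 17, this tap stops receiving buffers once the user
            // mutes their audio in the Zoom call.
            request.append(buffer)
        }

        audioEngine.prepare()
        try audioEngine.start()
        task = recognizer.recognitionTask(with: request) { result, error in
            // Handle partial results / errors here.
        }
    }
}
```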
Here are some updates that may help clarify the situation.
We took the Meeting SDK demo project and implemented voice-command recognition on top of it.
Our findings:
- The bug reproduces on a real device running iOS 17.
- The bug does not reproduce on the iOS 17 simulator.
- The bug does not reproduce on iOS versions before 17.
- On iOS 17, when the user mutes their audio in Zoom while our service is running, sound is disabled system-wide (for example, Instagram plays no sound).
- On iOS 17, when the user mutes their audio in Zoom and our service is not running, system sound keeps working.
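To gather these findings we logged the shared audio session state before and after muting in Zoom. A sketch of the diagnostic helper (the label parameter is just for tagging log lines):

```swift
import AVFoundation

// Diagnostic sketch: dump the shared audio session state so the
// "before mute" and "after mute" configurations can be compared.
func logAudioSessionState(_ label: String) {
    let session = AVAudioSession.sharedInstance()
    print("[\(label)]",
          "category=\(session.category.rawValue)",
          "mode=\(session.mode.rawValue)",
          "options=\(session.categoryOptions.rawValue)",
          "otherAudioPlaying=\(session.isOtherAudioPlaying)",
          "inputAvailable=\(session.isInputAvailable)")
}
```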
Apparently something changed in iOS 17: when a service that uses the microphone is running, muting the microphone in Zoom now mutes both the microphone and system audio. Perhaps Zoom uses some low-level audio code that behaves differently on iOS 17.
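One workaround we are considering, on the assumption that Zoom's mute deactivates or interrupts the shared AVAudioSession: listen for interruption notifications and reactivate our own session configuration when the interruption ends. The category, mode, and options below are assumptions for a mixed speech + VoIP scenario, and `restartRecognition()` is a hypothetical helper standing in for re-installing the tap and recreating the SFSpeechAudioBufferRecognitionRequest.

```swift
import AVFoundation

// Hedged workaround sketch, assuming Zoom's mute interrupts the
// shared audio session on iOS 17: reactivate our configuration
// once the interruption ends.
NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { note in
    guard let info = note.userInfo,
          let raw = info[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }

    if type == .ended {
        let session = AVAudioSession.sharedInstance()
        // Category/mode/options are assumptions for our use case.
        try? session.setCategory(.playAndRecord, mode: .measurement,
                                 options: [.mixWithOthers, .defaultToSpeaker])
        try? session.setActive(true)
        // restartRecognition()  // hypothetical: rebuild the tap + request
    }
}
```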
We would be glad to receive any ideas or suggestions.