Anyone have a tutorial or sample code that shows how to do this? I'm trying to show both the front and back cameras in my Zoom session, and I can't figure out how to combine the two into a single feed to send into Zoom. Any help or pointers would be appreciated!
Has anyone tried to do something like this on iOS? https://stackoverflow.com/questions/77897527/can-zoom-video-sdk-share-multiple-local-video-tracks-like-twilio-programmable-vi
@michael.zoom Do you know of any documents or examples on how I can do this on iOS? Thanks!
Currently, the Zoom Video SDK for iOS does not provide a built-in way to merge two live camera feeds, such as the front and back cameras, into a single stream that you can feed directly into a Zoom session.
The practical workaround is to use Apple's AVFoundation framework to run both cameras simultaneously and render their outputs into a single UIView or layer hierarchy. Note that on iOS you cannot do this with two separate AVCaptureSessions; simultaneous front and back capture requires a single AVCaptureMultiCamSession (iOS 13+ on supported hardware). You can arrange the two outputs as a picture-in-picture overlay or a side-by-side layout, depending on your design. Once you have the combined view, you can use Zoom's screen-sharing capability to share that view with the other participants.

This approach essentially turns your iOS device into a live video mixer, compositing both feeds before sending them out. While it's not as seamless as a dedicated API for custom video sources, it works well if you design the capture and rendering pipeline properly. If Zoom adds direct support for custom video buffers in the future, you could switch from screen sharing to feeding the merged video in directly. For now, however, manual composition plus screen sharing is the only practical way to achieve this setup.
Thanks for your reply. So just to be clear: the iOS Video SDK cannot do what the web Video SDK can, i.e. share a second camera via the shared-screen channel as discussed here? https://stackoverflow.com/questions/77897527/can-zoom-video-sdk-share-multiple-local-video-tracks-like-twilio-programmable-vi/77910476#77910476