Are there any end-to-end tutorials for integrating the custom UI into an iOS application? I've found it difficult to follow the iOS SDK documentation and the code in the GitHub sample app. I'm working in Swift with storyboards, so I'm used to a somewhat different workflow.
When I looked up “custom UI” in the devforum, these were part of the top results:
Neither points to documentation or a tutorial. One suggests that custom UI is a paid feature, so I wonder if there is a separate dev forum for paid features?
Anyway, I'd love to integrate the custom UI, but I'm looking for more detailed documentation.
Thanks for using the Zoom SDK. Custom UI is now available to all SDK users. As long as you have a non-free license and a valid SDK key pair, you should be able to use Custom UI. (And there is no separate dev forum.)
Regarding Custom UI, our documentation and our sample code are the best resources:
We understand that you are developing in Swift, which differs from the Objective-C used in both our documentation and demo. Currently, our SDK does not explicitly support Swift, but thanks to the interoperability between Objective-C and Swift, it should work in Swift projects as well.
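For readers following along, here is a minimal sketch of what the bridging-header approach can look like in a Swift project. The class and method names below follow the Objective-C MobileRTC documentation, and the key/secret fields are placeholders; verify both against the version of the SDK you are using (newer releases authenticate with a JWT token instead of a key/secret pair):

```swift
// In your bridging header (e.g. MyApp-Bridging-Header.h), expose the
// Objective-C SDK to Swift:
//   #import <MobileRTC/MobileRTC.h>

import UIKit

// Sketch: initialize the SDK once at app launch (e.g. from the AppDelegate).
func initializeZoomSDK() {
    let context = MobileRTCSDKInitContext()
    context.domain = "zoom.us"   // Zoom web domain
    context.enableLog = true     // helpful while integrating

    // Returns false if initialization fails (e.g. bad context).
    guard MobileRTC.shared().initialize(context) else { return }

    // Authenticate with your SDK key pair (placeholders, not real credentials).
    if let authService = MobileRTC.shared().getAuthService() {
        authService.clientKey = "YOUR_SDK_KEY"
        authService.clientSecret = "YOUR_SDK_SECRET"
        authService.sdkAuth()
    }
}
```

Once the bridging header is in place, the Objective-C types are visible to Swift directly, so the demo app's flow can usually be translated line by line.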
We are improving our documentation as well as our demo app. Please post any questions here in the forum and we will provide assistance to help you integrate our SDK into your app.
Thank you very much for your response! I'm using a bridging header right now, and it's really simple to get set up without the custom UI.
Is there any way to grab video frames directly? I’m hoping to incorporate some augmented reality overlays into the video but haven’t found a clear way to do so.
Thanks for the reply. Do you mean the raw video stream data? That's a really cool use case, by the way. Unfortunately, we do not have any interfaces for raw data at this point. I will file this as a feature request and let our engineers know. Please follow our GitHub repo for any updates.
Yes, the raw video stream data would be perfect! I'm hoping to run iOS's AR libraries, which have stabilized and perform very well, on top of the Zoom video data.
I'll follow the GitHub repo for updates and would love to test anything out if this makes it into a future release!
One other question: I just upgraded from the Basic plan to the Pro plan so I can use the custom UI. Will that change be reflected immediately? Can I begin using the custom UI?
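For reference, once the license is active, opting into the custom UI is a settings flag set before joining a meeting. This is a sketch based on the Objective-C documentation; the property and delegate names may differ by SDK version, so treat them as assumptions to verify:

```swift
// Sketch: enable the custom meeting UI before starting/joining a meeting.
if let settings = MobileRTC.shared().getMeetingSettings() {
    settings.enableCustomMeeting = true
}

// With the flag set, the SDK stops presenting its default meeting UI and
// instead calls back into your MobileRTCCustomizedUIMeetingDelegate
// (e.g. onInitMeetingView / onDestroyMeetingView), which is where you
// attach your own storyboard-based views.
```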