Sharing custom video

Something that’s not clear to me in the SDK is whether it’s possible to share custom video rather than the camera. I know that screen sharing is an option, but what if I wanted to share just a single UIView? I’m much more of a Swift developer, so I’m having a slightly hard time with the Objective-C, but it looks like it might be possible using the screen sharing API? Has anyone tried anything like this?
thanks,
michael

Hello @mji83,

Good question and thanks for using the dev forum!

Before providing some sample code, I just wanted to make sure I am understanding your use case. Are you trying to have a user share a video with the meeting, or is the desired behavior to replace a user’s live camera stream with something like a video?

Thanks!
Michael

Hi Michael,
Thanks for the response. Actually, either would work. I’m trying to understand what is possible. Think of it this way: I’m pulling in video from an external camera connected to the phone (not one of the built-in cameras). I’d like to be able to share it with other people in a Zoom meeting. I could do that with screen share and not write a single line of code, but it seems like there must be a cleaner way of getting the frame buffer directly into Zoom.
Thanks!
michael

Hello @mji83,

There is not a very direct way to accomplish this, unfortunately.
However, with some work I think it would be possible!

The screensharing framework provided in this SDK can broadcast your entire screen.

If you implemented a custom meeting UI, you could create a custom view containing the feed you want, and then share your screen so that view is broadcast to the meeting. This would not provide the view directly to the SDK; it would screenshare your whole screen, which happens to contain your custom view.

Finally, using AVFoundation you could set up your external camera to provide the video data to your custom view: https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture.
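
For the AVFoundation side, a minimal sketch might look like the following. It assumes the external camera is already exposed to the app as an AVCaptureDevice (on iOS 17+ you can discover one with the .external device type; on earlier versions it depends on how the hardware is exposed), and ExternalCameraView is just an illustrative name:

import AVFoundation
import UIKit

// Illustrative view that renders an external camera's feed via a preview layer.
class ExternalCameraView: UIView {
    private let session = AVCaptureSession()

    // Back this view with a preview layer so the session renders straight into it.
    override class var layerClass: AnyClass { AVCaptureVideoPreviewLayer.self }

    private var previewLayer: AVCaptureVideoPreviewLayer {
        layer as! AVCaptureVideoPreviewLayer
    }

    func startFeed(from device: AVCaptureDevice) throws {
        let input = try AVCaptureDeviceInput(device: device)
        guard session.canAddInput(input) else { return }
        session.addInput(input)

        previewLayer.session = session
        previewLayer.videoGravity = .resizeAspectFill

        // startRunning() blocks, so keep it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}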

If you decide to go that route, we would be more than happy to provide some Swift code snippets for the Zoom SDK side of things 🙂

Hope that helps! Let me know if you have any more questions!
Michael

Hi Michael,
I’d be interested in giving it a try. Some Swift sample code would be extremely helpful. Since I don’t need to share other applications’ screens (just the screen of the app I’m developing), I’m also wondering if this impacts what the user must do to initiate the share. For example, sharing one’s screen in the standard Zoom app is a bit involved and requires initiating the screen share from outside of the application. Does this still hold true, or is more automation possible since I’m just trying to share my own Zoom app’s screen from within that same app?
Thanks,
Michael

Hi @mji83,

It is not automated and is very similar to how the standard Zoom app behaves. Apple’s security practices limit how screensharing can behave. If you do explore this option, I would recommend getting the screensharing to behave correctly in your app before implementing a custom meeting UI because the custom UI is a bit involved.

How to get screensharing implemented:
Once you have installed the frameworks into your app, follow the steps for creating a screenshare extension. Those instructions are in Objective-C but also apply to Swift. After creating a broadcast extension, Xcode will create a file called SampleHandler.swift. This file controls the screensharing behavior.

To interact with Zoom’s SDK, SampleHandler must conform to the protocol MobileRTCScreenShareServiceDelegate like this:

import ReplayKit
import MobileRTCScreenShare

class SampleHandler: RPBroadcastSampleHandler, MobileRTCScreenShareServiceDelegate {

    // Zoom's service that forwards ReplayKit sample buffers into the meeting.
    let screenShareService = MobileRTCScreenShareService()

    override init() {
        super.init()

        screenShareService.delegate = self
        screenShareService.appGroup = "your group id"
    }

    // Forward each ReplayKit lifecycle event to the Zoom service.
    override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) {
        screenShareService.broadcastStarted(withSetupInfo: setupInfo)
    }

    override func broadcastPaused() {
        screenShareService.broadcastPaused()
    }

    override func broadcastResumed() {
        screenShareService.broadcastResumed()
    }

    override func broadcastFinished() {
        screenShareService.broadcastFinished()
    }

    // Hand every captured frame to the Zoom service.
    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        screenShareService.processSampleBuffer(sampleBuffer, with: sampleBufferType)
    }

    // Called by the Zoom service when the broadcast should end (e.g. the meeting ended).
    func mobileRTCScreenShareServiceFinishBroadcastWithError(_ error: Error!) {
        finishBroadcastWithError(error)
    }
}
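
One important detail: the appGroup value must match an App Group identifier that is enabled for both your main app target and the broadcast extension target, since that is how the SDK communicates across the two processes.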

For screen sharing to be enabled in the SDK, you must also implement MobileRTCMeetingServiceDelegate’s onClickShareScreen(_ parentVC: UIViewController) method.

Here is an example of this using a ViewController:

import UIKit
import MobileRTC

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Register as the meeting service delegate so the SDK notifies
        // us when the user taps the share-screen button.
        if let meetingService = MobileRTC.shared().getMeetingService() {
            meetingService.delegate = self
        }
    }
}

extension ViewController: MobileRTCMeetingServiceDelegate {
    func onClickShareScreen(_ parentVC: UIViewController) {
        print("share screen button tapped")
    }
}
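
If you want to kick off the broadcast from inside the app (rather than from Control Center), one option is ReplayKit’s RPSystemBroadcastPickerView. This is a general ReplayKit sketch rather than Zoom-specific API, and the extension bundle identifier below is a placeholder:

import ReplayKit

extension ViewController {
    func addBroadcastPicker() {
        // The system-provided button that presents the broadcast sheet.
        let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
        // Limit the sheet to this app's broadcast extension.
        // "com.example.app.ScreenShare" is a placeholder bundle identifier.
        picker.preferredExtension = "com.example.app.ScreenShare"
        picker.showsMicrophoneButton = false
        view.addSubview(picker)
    }
}

The user still has to tap the button and confirm, so this does not fully automate the share, but it does keep the whole flow inside your app.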

Thanks!
Michael

I cannot find this MobileRTCScreenShareServiceDelegate in SampleHandler.swift. Need help.

Hey @prazwolsbasnyat

Thanks for using the dev forum!

Have you imported the MobileRTCScreenShare framework?
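
Assuming the framework is also linked against the broadcast extension target (not just the main app), the top of SampleHandler.swift should look something like this:

import ReplayKit
import MobileRTCScreenShare

class SampleHandler: RPBroadcastSampleHandler, MobileRTCScreenShareServiceDelegate {
    // ... as in the earlier snippet
}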

Thanks!
Michael