ZoomVideoSDKVideoSender.sendVideoFrame not working as expected

Description
I am trying to send raw frames using the sendVideoFrame function of the ZoomVideoSDKVideoSender class. To keep things simple, I built a small standalone sample app that tests this with the 1280 x 720 video file referenced in this post. Here is my code for the sending side (the reader setup is sketched separately after it):

    func startProcessing() {
        self.assetReader?.startReading()
        
        while self.videoIsPlaying > 0 {
            if let sampleBuffer = self.videoTrackOutput?.copyNextSampleBuffer() {
                // Get the YUV buffer data
                if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                    print(imageBuffer.getFormat())   // getFormat() is my own helper that logs the pixel format (see log output below)
                    let width = CVPixelBufferGetWidth(imageBuffer)
                    let height = CVPixelBufferGetHeight(imageBuffer)
                    
                    let frameLen = CVPixelBufferGetDataSize(imageBuffer)
                    
                    // Access the YUV data (ensure the memory block is locked for reading)
                    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags.readOnly)
                    let rawPointer = CVPixelBufferGetBaseAddress(imageBuffer)
                    
                    // Convert UnsafeMutableRawPointer to UnsafeMutablePointer<CChar>
                    let frameBuffer = rawPointer?.assumingMemoryBound(to: CChar.self)
                    
                    // Send the YUV buffer to the video sender
                    sendVideoFrame(frameBuffer, width, height, frameLen, 0)
                    
                    CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags.readOnly)
                }
                CMSampleBufferInvalidate(sampleBuffer)
            } else {
                // End of file or an error occurred
                self.videoIsPlaying = -1
            }
        }
    }

    func sendVideoFrame(_ frameBuffer: UnsafeMutablePointer<CChar>?, _ width: Int, _ height: Int, _ frameLen: Int, _ timestamp: Int64) {
        if let sender = frameSender, let buffer = frameBuffer {
            sender.sendVideoFrame(buffer, width: UInt(width), height: UInt(height), dataLength: UInt(frameLen), rotation: .rotationNone, format: .I420)
            print("Frame sent")
        } else {
            print("Sender is not initialized")
        }
    }
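
The AVAssetReader setup isn't shown above; it is roughly along these lines (simplified sketch; assetReader, videoTrackOutput and videoIsPlaying are the same properties used in startProcessing(), and the track output requests bi-planar full-range 4:2:0 buffers, which matches the pixel-format log further down):

    import AVFoundation

    func setUpReader(with url: URL) throws {
        let asset = AVAsset(url: url)
        guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

        let reader = try AVAssetReader(asset: asset)
        // Ask the track output for NV12 (bi-planar, full range) pixel buffers,
        // matching the "YUV420 Bi-Planar (full range)" log output below.
        let outputSettings: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String:
                kCVPixelFormatType_420YpCbCrBiPlanarFullRange
        ]
        let trackOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)
        reader.add(trackOutput)

        self.assetReader = reader
        self.videoTrackOutput = trackOutput
        self.videoIsPlaying = 1
    }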

When this runs, I see the following log output for each frame that is sent:

Pixel format: YUV420 Bi-Planar (full range)
Frame sent
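
One thing I'm not sure about: the reader gives me bi-planar (NV12) buffers, as the log shows, but I pass .I420 to sendVideoFrame. If the SDK really expects a single contiguous planar I420 buffer, I assume the two NV12 planes would first have to be repacked into Y + U + V, roughly like this (untested sketch; makeI420Data is my own helper, not an SDK call):

    import CoreVideo
    import Foundation

    // Repack an NV12 (bi-planar 4:2:0) CVPixelBuffer into one contiguous
    // I420 buffer: full Y plane, then U plane, then V plane.
    func makeI420Data(from pixelBuffer: CVPixelBuffer) -> Data? {
        guard CVPixelBufferGetPlaneCount(pixelBuffer) == 2 else { return nil }

        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        let width  = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let chromaWidth  = width / 2
        let chromaHeight = height / 2

        guard
            let yBase  = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)?
                            .assumingMemoryBound(to: UInt8.self),
            let uvBase = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)?
                            .assumingMemoryBound(to: UInt8.self)
        else { return nil }

        let yStride  = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
        let uvStride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)

        var i420 = [UInt8]()
        i420.reserveCapacity(width * height + 2 * chromaWidth * chromaHeight)

        // Copy the Y plane row by row so any per-row padding (stride > width) is dropped.
        for row in 0..<height {
            i420.append(contentsOf: UnsafeBufferPointer(start: yBase + row * yStride, count: width))
        }

        // De-interleave the CbCr plane into separate U and V planes.
        var uPlane = [UInt8](); uPlane.reserveCapacity(chromaWidth * chromaHeight)
        var vPlane = [UInt8](); vPlane.reserveCapacity(chromaWidth * chromaHeight)
        for row in 0..<chromaHeight {
            for col in 0..<chromaWidth {
                uPlane.append(uvBase[row * uvStride + col * 2])
                vPlane.append(uvBase[row * uvStride + col * 2 + 1])
            }
        }
        i420.append(contentsOf: uPlane)
        i420.append(contentsOf: vPlane)
        return Data(i420)
    }

The result would be width * height * 3 / 2 bytes (1,382,400 for 1280 x 720), and that length would be passed as dataLength instead of CVPixelBufferGetDataSize, which can include row padding. I haven't confirmed that this is what the SDK expects, though.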

Additionally, I have subscribed each of the two participants to the other's video pipe as follows:
    other.getVideoPipe()?.subscribe(with: self, resolution: ._720)
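
I'm not currently checking the return value of that subscribe call. Assuming subscribe(with:resolution:) returns a ZoomVideoSDKError like other pipe calls (I still need to double-check this against the header), the check would look like:

    if let pipe = other.getVideoPipe() {
        // Assumption: subscribe(with:resolution:) returns a ZoomVideoSDKError.
        let result = pipe.subscribe(with: self, resolution: ._720)
        print("subscribe returned \(result)")   // hoping for Errors_Success
    }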

I have also implemented ZoomVideoSDKRawDataPipeDelegate as follows:

    extension ZoomViewController: ZoomVideoSDKRawDataPipeDelegate {
        func onPixelBuffer(_ pixelBuffer: CVPixelBuffer?, rotation: ZoomVideoSDKVideoRawDataRotation) {
            print("Frame received")
        }

        func onRawDataFrameReceived(_ rawData: ZoomVideoSDKVideoRawData?) {
            print("Received raw frame")
        }
    }

The problem: everything on the sending side succeeds up to the "Frame sent" log, but nothing happens on the receiving side. No frames are received, and neither of the log messages in the delegate above is ever printed.

Which iOS Video SDK version?
1.8.10

Any solution for this? I'm facing the same issue; I tried the sample video too and got nothing.
