Hi All,
I managed to implement the raw recording feature and get access to the audio and video data buffers. I understand that audio provides a mixed stream, which can be saved directly and converted to other formats. We have implemented IZoomSDKRendererDelegate, which gives us access to the YUVRawDataI420 data structure. The process of combining these videos is not clear; there is a set of methods in the YUVRawDataI420 class and we are not sure how to use them. Is there a way to get access to a mixed video stream the same way we do with audio raw data? If not, what are the steps to combine them into a proper template? I can see there are helper classes such as IYUVRawDataI420Converter. Can I use them to combine these streams? How do we map a video source ID to a user ID? It would be great if someone could point me in the right direction.
Can someone help me identify the cause of this issue? I think it is related to padding or stride, but I don't have enough information about the format. It would be great if someone could point me to a document with those details.
Our video raw data uses YUV420p format. The Wikipedia article should get you pointed in the right direction for deciphering the encoding of each frame.
How do we map a video source ID to a user ID?
When you subscribe to the video data of a user, you provide their user ID in the subscribe method. You can use this to map the raw data to the user.
I managed to extract YUV420p images and create videos from them, but there are some issues related to the binary format. If you check the images above, you can see black stripes in the U plane and the V plane. I see this behavior in remote videos but not in the self video.
This appears to be an issue with how you are rendering the frames, rather than something wrong with the SDK itself. I would recommend checking your renderer against other implementations to find what is going wrong.
I am using the method below to access frames from both the self-view and remote views.
onRawDataFrameReceived(YUVRawDataI420* data_)
What I can see is that, every time, the self video is perfect while all the remote videos have a color shift like the first image. I can provide frame binaries for both the self-views and the remote views. I am posting this after validating it multiple times. Any help would be highly appreciated.
Can you please send one frame of the raw video data you are receiving so that we can investigate? We would need one frame of what is working correctly and one frame of what is not working. You can send the data through a ticket on the developer support site. Be sure to mention this thread and my name so that the ticket is routed to me.
You should be able to access the support site after signing in with your Zoom account. If you are interested in the premier developer support plans, you can find more information by contacting our sales team through here.