Render recordings from Video raw data

Hi All,
I managed to implement the raw recording feature and get access to the audio and video data buffers. I understand that audio provides a mixed stream which can be saved directly and converted to other formats. We have implemented IZoomSDKRendererDelegate, which gives us access to the YUVRawDataI420 data structure. However, the process of combining these videos is not clear to me; the YUVRawDataI420 class has a set of methods, and I am not sure how to use them. Is there a way to get access to a mixed video stream, the same way we do with audio raw data? If not, what are the steps to combine the individual streams into a proper layout? I can see there are helper classes such as IYUVRawDataI420Converter. Can I use them to combine these streams? How do we map the video source ID to a user ID? It would be great if someone could point me in the right direction.

Thank you !!!

I converted the YUV420P buffer to MP4 using the command below.

ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 640x360 -framerate 25 -i 16778240_16778241.yuv -f mp4 16778240_16778241.mp4

In the converted video, it looks like the YUV planes are not aligned properly.

Then I tried to extract the planes separately and found that there is a black stripe between each plane.

ffplay -vf extractplanes=y -f rawvideo -pixel_format yuv420p -video_size 640x360 -i 16778240_16778241.yuv

(Screenshots attached: uplane.PNG and vplane.PNG, showing the extracted U and V planes.)

Can someone help me identify the cause of this issue? I think it is related to padding or stride, but I don't have enough information about the format. It would be great if someone could point me to documentation with those details.

Thank You!

Hi @sukitha.jayasinghe,

Our video raw data uses YUV420p format. The Wikipedia article should get you pointed in the right direction for deciphering the encoding of each frame. :slightly_smiling_face:
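
To expand on that a bit, here is a minimal sketch of how a tightly packed I420 (YUV420p) frame is laid out. The sizes below come straight from the frame dimensions and assume no padding between rows or planes:

#include <cstddef>

// Expected layout of one tightly packed I420 (YUV420p) frame.
// width/height are the luma (Y) dimensions; U and V are subsampled 2x2.
struct I420Layout {
    std::size_t ySize;      // width * height
    std::size_t uSize;      // (width / 2) * (height / 2)
    std::size_t vSize;      // same as uSize
    std::size_t frameSize;  // width * height * 3 / 2
};

I420Layout ComputeI420Layout(std::size_t width, std::size_t height) {
    I420Layout l;
    l.ySize = width * height;
    l.uSize = (width / 2) * (height / 2);
    l.vSize = l.uSize;
    l.frameSize = l.ySize + l.uSize + l.vSize;
    return l;
}

// For 640x360: Y = 230400 bytes, U = V = 57600 bytes, frame = 345600 bytes.

If a frame buffer is larger than width * height * 3 / 2 bytes, it carries per-row or per-plane padding (stride), which is exactly the kind of thing that produces stripes like the ones in your screenshots.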

How do we map the video source ID to a user ID?

When you subscribe to the video data of a user, you provide their user ID in the subscribe method. You can use this to map the raw data to the user.
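
As a rough illustration only (createRenderer and subscribe are taken from the C++ raw-data renderer header as I recall it; please verify the exact signatures against your SDK version), keeping one delegate per subscribed user is a simple way to preserve that mapping:

#include <cstdint>

// Sketch: one renderer + delegate per subscribed user, so every frame
// can be attributed to the user ID it was created for.
// (Requires the SDK raw-data headers.)
class UserVideoSink : public IZoomSDKRendererDelegate {
public:
    explicit UserVideoSink(uint32_t userId) : userId_(userId) {}

    // Every frame for the subscribed user arrives here, so userId_
    // gives you the mapping you asked about.
    void onRawDataFrameReceived(YUVRawDataI420* data_) override {
        // write data_ to a per-user file / feed it to your compositor
    }

    // ...implement the remaining IZoomSDKRendererDelegate callbacks...

private:
    uint32_t userId_;
};

// Usage, per user you want to record (names assumed, see note above):
//   IZoomSDKRenderer* renderer = nullptr;
//   createRenderer(&renderer, new UserVideoSink(userId));
//   renderer->subscribe(userId, RAW_DATA_TYPE_VIDEO);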

Thanks!

Hi Jon,

I managed to extract YUV420p images and create videos out of them, but there are some issues with the binary format. If you check the images above, you can see black stripes in the U plane and the V plane. I see this behavior in the remote videos but not in the self video.

Thank You!!!

Hi @sukitha.jayasinghe,

This appears to be an issue with how you are rendering the frames, rather than something wrong with the SDK itself. I would recommend checking your renderer against other implementations to find what is going wrong.

Thanks!

Hi Jon,

I am using the method below to access frames from both the self-view and the remote views.

onRawDataFrameReceived(YUVRawDataI420* data_)

What I can see is that the self video is always perfect, while all the remote videos have a color shift like in the first image. I can provide the frame binaries for both the self-view and the remote views. I am posting this after validating it multiple times. It would be highly appreciated if you could help with this.
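
For reference, one way to confirm the padding theory is to compare the delivered buffer length to the size of a tightly packed I420 frame. This is only a sketch: GetStreamWidth, GetStreamHeight, and GetBufferLen are assumed accessor names on YUVRawDataI420, so please check the SDK header.

#include <cstdio>

// Diagnostic sketch: call this from onRawDataFrameReceived to see whether
// remote frames carry row/plane padding that self-view frames do not.
void CheckFrameSize(YUVRawDataI420* data_) {
    const unsigned int w = data_->GetStreamWidth();
    const unsigned int h = data_->GetStreamHeight();
    const unsigned int expected = w * h * 3 / 2;   // packed I420: Y + U + V
    const unsigned int actual = data_->GetBufferLen();

    if (actual != expected) {
        // Extra bytes mean padding/stride. Dumping GetBuffer() verbatim
        // would write that padding into the .yuv file and produce the
        // stripes and color shift described above.
        std::printf("frame %ux%u: expected %u bytes, got %u\n", w, h, expected, actual);
    }
}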

Thank You!!

Hi Jon,

I can confirm that the Screen-share is also free from the above issue.

Thank You !!

Hi @sukitha.jayasinghe,

Can you please send one frame of the raw video data you are receiving so that we can investigate? We would need one frame of what is working correctly and one frame of what is not working. You can send the data through a ticket on the developer support site. Be sure to mention this thread and my name so that the ticket is routed to me.

Thanks!

Hi Jon,

Can you guide me on how to get access to the support channel? Should I buy a premium developer support package?

Thank You!

Hi @sukitha.jayasinghe,

You should be able to access the support site after signing in with your Zoom account. If you are interested in the premier developer support plans, you can find more information by contacting our sales team through here. :slightly_smiling_face:

Thanks!

Hi @jon.zoom,

The ticket has been submitted. Please find the ticket URL below.

https://support.zoom.us/hc/en-us/requests/13666324

Thank you!!!

Hi @sukitha.jayasinghe,

Thanks for providing that link. Confirming that your ticket has been received. I will let you know as soon as we have any updates.

Thanks!

Will you please share the resolution?

I am having the same issue @sukitha.jayasinghe @jon.zoom.

I stumbled on a workaround.

When my output looked like @sukitha.jayasinghe’s screenshots, I was using:

// data is YUVRawDataI420;
stream.Write(data.GetBuffer());

If I write this instead, it works as expected.

stream.Write(data.GetYBuffer());
stream.Write(data.GetUBuffer());
stream.Write(data.GetVBuffer());
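
For anyone working against the C++ SDK directly, the equivalent fix is to write the three planes one after another. This is only a sketch: the plane sizes assume a tightly packed I420 frame, and GetStreamWidth/GetStreamHeight are assumed accessor names on YUVRawDataI420, so please verify them against your SDK header.

#include <cstdio>

// Write one frame plane by plane instead of dumping GetBuffer() in one go.
void WriteFrame(FILE* out, YUVRawDataI420* data_) {
    const unsigned int w = data_->GetStreamWidth();
    const unsigned int h = data_->GetStreamHeight();

    fwrite(data_->GetYBuffer(), 1, w * h, out);              // Y plane
    fwrite(data_->GetUBuffer(), 1, (w / 2) * (h / 2), out);  // U plane
    fwrite(data_->GetVBuffer(), 1, (w / 2) * (h / 2), out);  // V plane
}

The resulting .yuv file can then be wrapped into an MP4 with the same ffmpeg command shown earlier in the thread.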

@chump

This workaround works for me as well. I didn’t get any official response to the ticket. I hope this will be fixed in future releases.

@jon.zoom

Any update?

Thank You!!


Hi Chump,

Same issue here! Thanks a lot for sharing!
