Windows Meeting SDK: Video RawData recording frames saved to disk, but file is not playable

I am able to receive raw frames and save them to disk successfully, but I am not able to play the saved file. Running the FFmpeg command below produces the error shown, yet FFmpeg still encodes the whole recorded video, and that encoded video is playable but looks very odd. Any suggestion, direction, or pointer to what I am implementing wrong here would be appreciated.

ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 640x360 -framerate 25 -i c234e4b1-4803-42cf-a41e-010417414492-video.yuv -f mp4 output.mp4

virtual void onRawDataFrameReceived(YUVRawDataI420* data) {
	// Append each incoming I420 frame to the raw file.
	videoFile = fopen(videoFilePath.c_str(), "a+b");

	fwrite(data->GetBuffer(), data->GetBufferLen(), 1, videoFile);

	// Equivalent plane-by-plane write: full-size Y, quarter-size U and V.
	/*fwrite(data->GetYBuffer(), data->GetStreamHeight() * data->GetStreamWidth(), 1, videoFile);
	fwrite(data->GetUBuffer(), data->GetStreamHeight() * data->GetStreamWidth() / 4, 1, videoFile);
	fwrite(data->GetVBuffer(), data->GetStreamHeight() * data->GetStreamWidth() / 4, 1, videoFile);*/

	fclose(videoFile);
}
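For reference, a raw I420 dump with a constant frame size can be previewed directly with ffplay (assuming 640x360 at 25 fps; adjust to match the recording):

ffplay -f rawvideo -pixel_format yuv420p -video_size 640x360 -framerate 25 c234e4b1-4803-42cf-a41e-010417414492-video.yuv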

Here is a video screenshot:

Welcome!

Thank you for posting @freelancer.nak! It looks like the issue is related to your FFmpeg command and not Zoom. First, can you share the following details:

  1. Which Windows Meeting SDK version?

    Knowing the version can help us to identify your issue faster.

  2. What have you tried to resolve this behavior you are seeing with FFmpeg? (e.g., a Stack Overflow thread)

  3. Are you only seeing this FFmpeg error when encoding a Zoom meeting recording?

@donte.zoom Thanks for the quick reply.

Below are the details:

1. zoom-sdk-windows-5.11.1.6653
2. I have done a lot of research on this and have already tried many answers, including the Stack Overflow thread shared above.
3. My problem is playing back the video saved from onRawDataFrameReceived.

As I already asked: any suggestion on how we can save the raw data to disk? And if there is a fix for my logic, please share that as well.

@donte.zoom
ffmpeg -video_size 1920x1080 -r 25 -pixel_format yuv420p -i 4d2cde4f-2d1f-4939-aac1-13687b2cb4f5-video.yuv -vf yadif stockholm_deInt.yuv

Thanks for the screenshot @freelancer.nak! In your testing, have you been able to execute the FFmpeg command without receiving that Invalid argument error? The first thing I’d do is resolve it; that will help isolate whether the playback behavior you are seeing comes from the video saved in onRawDataFrameReceived or from the FFmpeg command.

@donte.zoom The FFmpeg command works fine when all the saved raw frames have the same width and height. The Invalid argument error only happens when frames of different dimensions end up in the same saved file; please review the logs below to get an idea.

2022-08-05 21:35:12 => MeetingId: 89533431781 Connecting…
2022-08-05 21:35:18 => MeetingId: 89533431781 InMeeting…
2022-08-05 21:35:18 => Recording: Recording Started…
2022-08-05 21:35:19 => Audio Raw Meta: 32000-1
2022-08-05 21:35:19 => RawData ON…
2022-08-05 21:35:19 => Raw Video Resolution: 640x360
2022-08-05 21:35:21 => Raw Video Resolution is Changed from: 640x360 to: 480x270
2022-08-05 21:35:26 => Raw Video Resolution is Changed from: 480x270 to: 640x360
2022-08-05 21:35:29 => Raw Video Resolution is Changed from: 640x360 to: 480x270
2022-08-05 21:35:34 => Raw Video Resolution is Changed from: 480x270 to: 640x360
2022-08-05 21:36:44 => RawData OFF…
2022-08-05 21:36:44 => MeetingId: 89533431781 DisConnecting…
2022-08-05 21:36:45 => MeetingId: 89533431781 MeetingEnded…
2022-08-05 21:36:45 => Recorded Video Resolution: 640x360
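For context on the error: raw I420 has no per-frame header, so FFmpeg's rawvideo demuxer simply reads a fixed number of bytes per frame. A quick sketch of the arithmetic for the two resolutions in the log:

#include <cstddef>

// An I420 frame is a full-resolution Y plane plus quarter-resolution
// U and V planes: width * height * 3 / 2 bytes in total.
constexpr size_t i420FrameSize(int width, int height)
{
	return (size_t)width * height * 3 / 2;
}

// i420FrameSize(640, 360) == 345600 bytes per frame
// i420FrameSize(480, 270) == 194400 bytes per frame
// Mixing both sizes in one file desynchronizes any fixed-size reader.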

@donte.zoom Hi, I finally resolved the issue myself. Sharing the solution in case it saves others some time.

FFmpeg helper header file:

extern "C"
{
#include <libswscale/swscale.h>
#include <libavutil/avutil.h>
#include <libavutil/imgutils.h>
#include <libavutil/opt.h>
#include <libavutil/frame.h>
#include <libavutil/mem.h>
#include <libavcodec/avcodec.h>
#pragma comment(lib, "avcodec.lib")
#pragma comment(lib, "avutil.lib")
#pragma comment(lib, "swscale.lib")
}

class FFmpegHelper
{
public:
AVFrame* allocPicture(enum AVPixelFormat pix_fmt, int width, int height);
AVFrame* frameConversion(char* y, char* u, char* v, int srcWidth, int srcHeight);
};

FFmpeg helper source file:

#include "ffmpeg-helper.h"

AVFrame* FFmpegHelper::allocPicture(enum AVPixelFormat pix_fmt, int width, int height)
{
	// Allocate a frame
	AVFrame* frame = av_frame_alloc();

	if (frame == NULL)
	{
		fprintf(stderr, "av_frame_alloc failed\n");
		return NULL;
	}

	// Allocate the image buffer that the frame's data pointers reference.
	if (av_image_alloc(frame->data, frame->linesize, width, height, pix_fmt, 1) < 0)
	{
		fprintf(stderr, "av_image_alloc failed\n");
		av_frame_free(&frame);
		return NULL;
	}

	frame->width = width;
	frame->height = height;
	frame->format = pix_fmt;

	return frame;
}

AVFrame* FFmpegHelper::frameConversion(char* y, char* u, char* v, int srcWidth, int srcHeight)
{
	// Scale every incoming frame to a fixed 640x360 so the raw file
	// keeps a single, constant frame size.
	int width = 640;
	int height = 360;

	// Wrap the SDK-provided planes in a source frame (no copy). The
	// linesizes assume tightly packed I420 planes.
	AVFrame* src = av_frame_alloc();
	src->width = srcWidth;
	src->height = srcHeight;
	src->format = AV_PIX_FMT_YUV420P;
	src->data[0] = (uint8_t*)y;
	src->data[1] = (uint8_t*)u;
	src->data[2] = (uint8_t*)v;
	src->linesize[0] = srcWidth;
	src->linesize[1] = srcWidth / 2;
	src->linesize[2] = srcWidth / 2;

	AVFrame* dst = allocPicture(AV_PIX_FMT_YUV420P, width, height);

	SwsContext* resize = sws_getContext(
		srcWidth,
		srcHeight,
		AV_PIX_FMT_YUV420P,
		width,
		height,
		AV_PIX_FMT_YUV420P,
		SWS_LANCZOS | SWS_ACCURATE_RND,
		NULL,
		NULL,
		NULL);

	// The slice height passed to sws_scale must be the *source* height.
	sws_scale(
		resize,
		src->data,
		src->linesize,
		0,
		srcHeight,
		dst->data,
		dst->linesize);

	sws_freeContext(resize);
	av_frame_free(&src);

	return dst;
}

Here is the recording-saving code:

videoFile = fopen(videoFilePath.c_str(), "ab");

auto height = data->GetStreamHeight();
auto width = data->GetStreamWidth();

// Scale the incoming frame to the fixed 640x360 output size.
AVFrame* yuvScaled = ffmpegHelper.frameConversion(data->GetYBuffer(), data->GetUBuffer(), data->GetVBuffer(), width, height);

// I420 plane sizes at 640x360: full-resolution Y, quarter-resolution U and V.
auto yPlanar = 640 * 360;
auto uPlanar = yPlanar / 4;
auto vPlanar = yPlanar / 4;

fwrite(yuvScaled->data[0], yPlanar, 1, videoFile);
fwrite(yuvScaled->data[1], uPlanar, 1, videoFile);
fwrite(yuvScaled->data[2], vPlanar, 1, videoFile);

// Release the scaled frame; its buffer was allocated by av_image_alloc.
av_freep(&yuvScaled->data[0]);
av_frame_free(&yuvScaled);

fclose(videoFile);
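With every frame scaled to a constant 640x360, the original encode command works as intended:

ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 640x360 -framerate 25 -i c234e4b1-4803-42cf-a41e-010417414492-video.yuv -f mp4 output.mp4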

Awesome, @freelancer.nak! Glad you were able to figure out the solution. Can you share more context on how you arrived at the solution you’ve shared above? This will help anyone else who comes across this later with the same issue.

I figured it out from the ‘Invalid argument’ FFmpeg error. When the raw video file contains frames of only one width and height, the FFmpeg command works fine. I then realized I could use sws_scale to scale all frames to the same width and height, and the code shared above now works fine.


Does the current Windows Native SDK (zoom-sdk-windows-5.11.1.6653) support ScreenShare RawData recording?

@donte.zoom Please reply → Here

@freelancer.nak Yes, ScreenShare RawData recording is supported.

I have subscribed to RawData with

rawRender->subscribe(participant, ZoomSDKRawDataType::RAW_DATA_TYPE_SHARE);

but I am unable to receive frames. Any suggestions or documentation?

Thanks.

@freelancer.nak ,

Here is our documentation for Raw Data. Please double-check local recording permission for the current user.

@donte.zoom I fixed that. I was subscribing to the ScreenShare raw stream the wrong way. I now subscribe inside onSharingStatus(status, userid) and things are working fine.
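For anyone else hitting this, here is a minimal sketch of that approach, assuming the SDK's IMeetingShareCtrlEvent callback and its Sharing_Other_Share_Begin status value (adapt the subscribe call to your own rawRender setup):

// Subscribe to the share raw stream only once sharing has actually started.
virtual void onSharingStatus(SharingStatus status, unsigned int userId)
{
	if (status == Sharing_Other_Share_Begin)
	{
		rawRender->subscribe(userId, ZoomSDKRawDataType::RAW_DATA_TYPE_SHARE);
	}
}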

Is there any way to merge multiple users’ video streams into a single video, or is there any layout we can subscribe to inside the SDK?

Thanks.
