If I trigger 'onInitialize' and then call 'sendVideoFrame', will I see YUV frames on the screen?

Description

Hello, I’m Yoonha. I want to trigger the ‘onInitialize’ callback.

I have already called

IZoomInstantSDKSession* pSession = m_pInstantSDK->joinSession(*m_pSessionContext);

and

returnVal_Subscribe = m_pRenderer->Subscribe(m_pUser, RAW_DATA_TYPE_VIDEO, 0);

I’m able to receive YUV frames via the ‘onRawDataFrameReceived’ callback.


So, I want to pass those frames to the Zoom encoder via the ‘sendVideoFrame’ method.

At this link (Passing Raw Video/Audio to Zoom Encoder on Windows Fully Customizable SDK - #2 by jon.lieblich), there is this advice: ‘Raw video data can be sent through the sendVideoFrame method. To access this method, obtain an instance of the IZoomInstantSDKVideoSender by providing an instance of IZoomInstantSDKVideoSource in your ZoomInstantSDKSessionContext and listening for the onInitialize callback. The sender parameter is what you are looking for.’

Following that advice, I declared an instance of IZoomInstantSDKVideoSource and assigned it to ‘m_pSessionContext->externalVideoSource’.

And I succeeded in triggering the ‘onInitialize’ callback.

Now, if I call ‘sendVideoFrame’ with the YUV frames from ‘onRawDataFrameReceived’, will the SDK display those YUV frames on the screen (i.e., show the video of a user who joined the session)?

Thank you.
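To make my setup concrete, here is a rough sketch of what I have so far. The interface and method names (IZoomInstantSDKVideoSource, onInitialize, sendVideoFrame) come from the SDK, but the declarations below are simplified stand-ins I wrote for illustration, not the real SDK headers:

```cpp
#include <cassert>
#include <cstdint>

// Simplified stand-ins for the SDK interfaces; the real headers differ.
struct IZoomInstantSDKVideoSender {
    virtual ~IZoomInstantSDKVideoSender() = default;
    // Hypothetical signature: the real sendVideoFrame takes more parameters.
    virtual void sendVideoFrame(const uint8_t* yuvData, int width, int height) = 0;
};

struct IZoomInstantSDKVideoSource {
    virtual ~IZoomInstantSDKVideoSource() = default;
    virtual void onInitialize(IZoomInstantSDKVideoSender* sender) = 0;
};

// My external video source: it stores the sender handed to onInitialize,
// so frames received later in onRawDataFrameReceived can be forwarded.
class MyVideoSource : public IZoomInstantSDKVideoSource {
public:
    void onInitialize(IZoomInstantSDKVideoSender* sender) override {
        m_sender = sender;  // keep the sender for later use
    }
    // Called from my onRawDataFrameReceived handler with the raw YUV buffer.
    void forwardFrame(const uint8_t* yuv, int w, int h) {
        if (m_sender) m_sender->sendVideoFrame(yuv, w, h);  // drop frames until initialized
    }
private:
    IZoomInstantSDKVideoSender* m_sender = nullptr;
};
```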


Additional context

The ‘onRawDataFrameReceived’ callback and the ‘sendVideoFrame’ method do not work together; each one works separately. What is the reason, and how can I solve it? Should ‘IZoomInstantSDKDelegate’, ‘IZoomInstantSDKRawDataPipeDelegate’, and ‘IZoomInstantSDKVideoSource’ be inherited in only one class?

Hey @tlol91,

Thanks for using the dev forum!

On the Windows and macOS platforms of the Video SDK, the video will not be rendered for you. You will have to take the frames and render them yourself for them to be shown on the screen.

You could have one class inherit from all of these if you would like; however, it is not necessary. onRawDataFrameReceived is for receiving frames from the session, while sendVideoFrame is for sending frames into the session. So if you had two users, User A would call sendVideoFrame, and User B would then receive that frame through the onRawDataFrameReceived callback.
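To illustrate the inheritance question, here is a rough sketch. The interface definitions below are simplified stand-ins I wrote for this example (the real SDK headers declare many more methods and different signatures); the point is only that one class can implement all three, or you can split them, since the receive path and the send path are independent:

```cpp
#include <cassert>
#include <cstdint>

// Simplified stand-ins for the three SDK interfaces mentioned above;
// the real headers differ. Written only to show the class layout.
struct IZoomInstantSDKDelegate {
    virtual ~IZoomInstantSDKDelegate() = default;
    virtual void onSessionJoin() {}
};
struct IZoomInstantSDKRawDataPipeDelegate {
    virtual ~IZoomInstantSDKRawDataPipeDelegate() = default;
    virtual void onRawDataFrameReceived(const uint8_t* yuv, int w, int h) = 0;
};
struct IZoomInstantSDKVideoSource {
    virtual ~IZoomInstantSDKVideoSource() = default;
    virtual void onInitialize() {}
};

// One class may implement all three interfaces, but splitting them into
// separate classes works just as well: receiving (onRawDataFrameReceived)
// and sending (onInitialize / sendVideoFrame) do not depend on each other.
class SessionHandler : public IZoomInstantSDKDelegate,
                       public IZoomInstantSDKRawDataPipeDelegate,
                       public IZoomInstantSDKVideoSource {
public:
    int framesReceived = 0;
    void onRawDataFrameReceived(const uint8_t*, int, int) override {
        ++framesReceived;  // e.g. hand the frame to your renderer here
    }
};
```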

Thanks!
Michael

Would you show me some example code? I desperately need example code for rendering after receiving frames in ‘onRawDataFrameReceived’.

Thank you.

If you need my code to base the example on, please let me know your email address.

Can I use Electron with the Video SDK? Or is Electron only available with the Client SDK? If it is available, is there any guide for an Electron Video SDK?

Hey @tlol91,

For rendering the frames, I would recommend using a rendering library. In my test application I use SDL. Normally we do not provide code examples for this, since the code depends on which rendering library you choose.
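If your renderer needs the pixels in RGB, a limited-range BT.601 I420-to-RGB conversion looks roughly like the sketch below (plain C++, no SDK types; the function name and tightly-packed-plane assumption are mine). Note that if you go with SDL, you can usually skip this step entirely: SDL_UpdateYUVTexture with a SDL_PIXELFORMAT_IYUV texture consumes the Y/U/V planes directly.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Convert one I420 (YUV420 planar) frame to packed RGB24 using
// limited-range (studio-swing) BT.601 integer math.
// Assumes tightly packed planes (stride == width) and even dimensions.
std::vector<uint8_t> i420ToRgb24(const uint8_t* y, const uint8_t* u,
                                 const uint8_t* v, int width, int height) {
    std::vector<uint8_t> rgb(static_cast<size_t>(width) * height * 3);
    auto clamp8 = [](int x) {
        return static_cast<uint8_t>(std::min(std::max(x, 0), 255));
    };
    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; ++col) {
            // U and V are subsampled 2x2, so one chroma sample covers 4 pixels.
            int c = y[row * width + col] - 16;
            int d = u[(row / 2) * (width / 2) + col / 2] - 128;
            int e = v[(row / 2) * (width / 2) + col / 2] - 128;
            size_t i = (static_cast<size_t>(row) * width + col) * 3;
            rgb[i + 0] = clamp8((298 * c + 409 * e + 128) >> 8);           // R
            rgb[i + 1] = clamp8((298 * c - 100 * d - 208 * e + 128) >> 8); // G
            rgb[i + 2] = clamp8((298 * c + 516 * d + 128) >> 8);           // B
        }
    }
    return rgb;
}
```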

Yes, you can use Electron with the Video SDK, but you would have to configure it yourself. We do not have an Electron Video SDK, but if you use both the Mac and Windows native SDKs, you could create an Electron project out of them.

Thanks!
Michael

What is the difference between ‘Electron demo’ and ‘ElectronVideoSDK’?

How about DirectShow? Is it appropriate as a rendering library?

Hey @tlol91,

The Electron Demo is a demo application that uses the macOS Video SDK and the Windows Video SDK. There is not currently an Electron Video SDK.

I haven’t used it personally, but I believe that would work 🙂

Thanks!
Michael