Custom video source injection in Zoom Web Video SDK

Hello,
I am working with the Zoom Web Video SDK inside an Angular web application.

My requirement is to publish custom video content as a meeting video stream.

Specifically, I want to render a dynamic webpage region (document viewer / UI component / canvas output) and send it as a live video track to meeting participants.
I am not trying to share the entire screen, browser tab, or application window.
My requirement is to stream only a specific region of the webpage as video content.

Technically this would be similar to:

  • Using canvas.captureStream() or a virtual rendering pipeline

  • Replacing the camera video source with a custom MediaStreamTrack

  • Publishing frame-buffer output into the SDK media pipeline
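For context, the first two approaches above can be sketched with standard Web APIs alone, independent of any Zoom SDK. The sketch below captures a region of a source canvas as a live MediaStreamTrack; the region-clamping helper, frame rate, and redraw loop are illustrative assumptions, not part of any SDK:

```javascript
// Capture a specific region of a source canvas by repeatedly drawing it
// to an offscreen canvas, then expose that canvas as a live video track.
// Browser-only APIs (document, captureStream) are used inside the
// function but never at top level.

// Pure helper: clamp a requested crop rectangle to the source bounds.
function clampRegion(region, width, height) {
  const x = Math.max(0, Math.min(region.x, width));
  const y = Math.max(0, Math.min(region.y, height));
  return {
    x,
    y,
    w: Math.min(region.w, width - x),
    h: Math.min(region.h, height - y),
  };
}

function captureRegionTrack(sourceCanvas, region, fps = 15) {
  const out = document.createElement('canvas');
  const r = clampRegion(region, sourceCanvas.width, sourceCanvas.height);
  out.width = r.w;
  out.height = r.h;
  const ctx = out.getContext('2d');

  // Redraw only the selected region at the chosen frame rate.
  setInterval(() => {
    ctx.drawImage(sourceCanvas, r.x, r.y, r.w, r.h, 0, 0, r.w, r.h);
  }, 1000 / fps);

  // captureStream() yields a live MediaStream backed by the canvas;
  // its video track can then be handed to whatever consumes tracks.
  return out.captureStream(fps).getVideoTracks()[0];
}
```

The open question is whether the resulting track can be handed to the Zoom Web Video SDK in place of the camera.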

I understand that browser security policies require user permission for screen capture.

However, I am trying to determine whether there is:

  1. Any supported API in the Zoom Web Video SDK to inject a custom video track or stream

  2. Any internal or experimental mechanism (even undocumented) that allows publishing custom frames

  3. Any recommended architectural approach to document/component sharing without invoking the browser screen-share picker

I also noticed that the native Zoom SDKs (e.g. the Windows SDK) provide APIs for frame-based sharing, such as StartShareFrame().

Is there any equivalent capability in Zoom Web Video SDK?

Or is publishing a custom video source simply not possible in the Web SDK architecture?

Any official clarification would be very helpful.

Hey @Mary02

Thanks for your question.

The Web Video SDK does have a raw data feature similar to the native SDKs. You can refer to our documentation for guidance on how to handle video processing on the video sender side.

And the GitHub sample app:
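For readers wanting the underlying browser mechanism that sender-side video processing builds on, here is a minimal insertable-streams sketch. It is independent of the Zoom SDK itself (the exact SDK processor API is documented separately); MediaStreamTrackProcessor, MediaStreamTrackGenerator, and OffscreenCanvas are Chromium-only Web APIs, and the letterboxing helper is my own illustrative assumption:

```javascript
// Pure helper: fit a source rectangle into a target resolution while
// preserving aspect ratio (letterboxing).
function fitWithin(srcW, srcH, dstW, dstH) {
  const scale = Math.min(dstW / srcW, dstH / srcH);
  const w = Math.round(srcW * scale);
  const h = Math.round(srcH * scale);
  return { w, h, x: Math.round((dstW - w) / 2), y: Math.round((dstH - h) / 2) };
}

// Replace each outgoing camera frame with the current contents of a
// canvas, keeping the camera track's resolution and timestamps.
function buildProcessedTrack(cameraTrack, sourceCanvas) {
  const processor = new MediaStreamTrackProcessor({ track: cameraTrack });
  const generator = new MediaStreamTrackGenerator({ kind: 'video' });

  const transform = new TransformStream({
    transform(frame, controller) {
      const dw = frame.displayWidth;
      const dh = frame.displayHeight;
      const ts = frame.timestamp;
      frame.close(); // release the original camera frame

      // Draw the custom content, letterboxed into the camera resolution.
      const canvas = new OffscreenCanvas(dw, dh);
      const ctx = canvas.getContext('2d');
      const fit = fitWithin(sourceCanvas.width, sourceCanvas.height, dw, dh);
      ctx.drawImage(sourceCanvas, fit.x, fit.y, fit.w, fit.h);

      controller.enqueue(new VideoFrame(canvas, { timestamp: ts }));
    },
  });

  processor.readable.pipeThrough(transform).pipeTo(generator.writable);
  return generator; // behaves as a MediaStreamTrack of custom frames
}
```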

Thanks
Vic