Custom camera and rendering

Hi dear Zoom,
I downloaded your SDK and found it great.

I have 2 questions:

  1. How can I use my camera implementation instead of what you deliver in the SDK?
  2. How can I add my custom rendering? I need to process and draw every local and remote frame.

Guys, thank you so much for your great SDK.

Looking forward to your replies.

– Gleb

Hi @gleb.prischepa, thanks for using the dev forum.

Unfortunately we do not yet provide the ability to customize the camera input feed through the Android Client SDK. Our recently released Fully Customizable SDK provides raw audio/video data access out of the box and may be worth considering. If you have any questions regarding that SDK, please post over in #fully-customizable-mobile-sdk:android and we’ll be happy to assist. :slightly_smiling_face:


Thank you so much for your quick response.
Just to clarify, the answers are:
1 - not right now
2 - it is possible for both local and remote inputs.
Am I correct?

Wed, Feb 24, 2021 at 9:02 PM, Jon via Zoom Developer Forum <>:

Hi @gleb.prischepa,

Apologies for the confusion, but the answer to both of the questions in your original post would be the same. This is not yet supported by the Client SDK, but would absolutely be possible with the Fully Customizable SDK.



1. How can I use my camera implementation instead of what you deliver in the SDK?

The SDK has the interface ZoomSDKVideoSourceHelper#setExternalVideoSource for plugging in an external video source, so you can capture video and send it yourself.
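The exact signature of ZoomSDKVideoSourceHelper#setExternalVideoSource is not shown in this thread, so here is a minimal, self-contained sketch of the idea with a hypothetical `VideoFrameSink` interface and `CustomVideoSource` class standing in for the SDK's real types (the NV21 buffer layout is also an assumption):

```java
import java.nio.ByteBuffer;

// Hypothetical stand-in for whatever sink the SDK registers when you call
// setExternalVideoSource; the real callback names and types may differ.
interface VideoFrameSink {
    void onFrame(ByteBuffer data, int width, int height);
}

/** A custom "camera" that pushes frames into the registered sink.
 *  In a real app the bytes would come from your own capture pipeline. */
class CustomVideoSource {
    private final VideoFrameSink sink;

    CustomVideoSource(VideoFrameSink sink) {
        this.sink = sink;
    }

    void captureOneFrame(int width, int height) {
        // NV21: full-resolution Y plane plus interleaved VU at half the size.
        ByteBuffer frame = ByteBuffer.allocateDirect(width * height * 3 / 2);
        sink.onFrame(frame, width, height);
    }
}

public class ExternalSourceDemo {
    public static void main(String[] args) {
        CustomVideoSource source = new CustomVideoSource(
            (data, w, h) -> System.out.println(data.capacity())); // prints 460800
        source.captureOneFrame(640, 480);
    }
}
```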

2. How can I add my custom rendering? I need to process and draw every local and remote frame.

ZoomSDKRenderer lets you subscribe to the raw data and render it yourself.

Sample code: us.zoom.sdksample.inmeetingfunction.customizedmeetingui.RawDataMeetingActivity

Before using raw data, make sure you have a raw data license: ZoomSDK#hasRawDataLicense
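The ZoomSDKRenderer callback itself is not shown in this thread, so the following is only an illustrative, self-contained sketch of the "process each raw frame before drawing it yourself" step. The I420 layout and the `processI420` helper are assumptions, not SDK API; a trivial luma inversion stands in for real processing:

```java
import java.nio.ByteBuffer;

// Hypothetical frame-processing helper; a real app would call something
// like this from whatever raw-data callback ZoomSDKRenderer provides.
public class RawDataSketch {
    /** Invert the Y (luma) plane of an I420 frame in place: a trivial
     *  stand-in for applying your own effect before rendering. */
    static void processI420(ByteBuffer frame, int width, int height) {
        int ySize = width * height;            // Y plane comes first in I420
        for (int i = 0; i < ySize; i++) {
            int y = frame.get(i) & 0xFF;
            frame.put(i, (byte) (255 - y));    // invert luma only
        }
        // The U and V planes (ySize / 4 bytes each) follow; left untouched.
    }

    public static void main(String[] args) {
        int w = 4, h = 2;
        ByteBuffer frame = ByteBuffer.allocate(w * h * 3 / 2); // zero-filled
        processI420(frame, w, h);
        System.out.println(frame.get(0) & 0xFF); // prints 255 (0 inverted)
    }
}
```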


Hi Fred,
thank you so much for your explanation.

Could you please explain how to use my custom camera implementation with the “Fully Customizable SDK”?

Thank you in advance

Hi @gleb.prischepa,

The Fully Customizable SDK can do this through sending raw data. Please see our documentation here and if you have any additional questions regarding that SDK feel free to create a new post in #fully-customizable-mobile-sdk:android .


Hi Zoom support,
I have read all the docs you recommended and found answers to half of my questions.
Thank you so much for your suggestions.

Guys, I would be very glad to hear a few more answers to my questions.

  1. Camera
    I have not found much information regarding camera and its customizations.
  • Is your camera module based on …? If not, please let me know what is used.
  • I found that ZoomInstantSDKVideoSender can be used to process and send local camera frames but I have not found a way to process and render local frames from the camera. Let’s say I want to apply the AR effect and show it to the user while using the camera. How can it be achieved using your API?
  • How can I send texture instead of buffers in void sendVideoFrame(java.nio.ByteBuffer frameBuffer, …)?
  • What is default FPS and how can it be adjusted?
  2. Custom UI design
    I have found this doc and would like to clarify it with you.
    Does that mean that, using MobileRtcVideoView, I can implement any UI design I want, i.e. place my custom buttons (with custom icons) anywhere on the screen and add custom click handlers/listeners?

  3. Functional SDK/API
    Does your SDK provide an API without UI components and a camera?
    For instance, Agora provides an SDK that lets you create a video session, send and receive video frames, and a bit more, but with no camera or UI components.
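On the FPS question above: the thread never states the SDK's default frame rate, but one common way to control the rate you feed into something like sendVideoFrame is to drop frames with a throttle on the capture side. Everything below is an illustrative sketch under that assumption, not SDK API:

```java
/** Drops frames so that at most targetFps frames per second are forwarded.
 *  A sketch of rate-capping a capture feed before handing frames to an
 *  SDK sender; the SDK's actual default FPS is not documented here. */
public class FpsThrottle {
    private final long minIntervalNanos;
    private long lastForwardedNanos = -1_000_000_000L; // lets the first frame pass

    public FpsThrottle(int targetFps) {
        this.minIntervalNanos = 1_000_000_000L / targetFps;
    }

    /** Returns true if a frame arriving at nowNanos should be forwarded. */
    public boolean shouldForward(long nowNanos) {
        if (nowNanos - lastForwardedNanos < minIntervalNanos) {
            return false; // too soon since the last forwarded frame: drop it
        }
        lastForwardedNanos = nowNanos;
        return true;
    }

    public static void main(String[] args) {
        FpsThrottle throttle = new FpsThrottle(15); // cap a 30fps feed at 15fps
        int forwarded = 0;
        for (int i = 0; i < 30; i++) {              // simulate 1s of 30fps input
            long t = i * 33_333_333L;               // frames ~33ms apart
            if (throttle.shouldForward(t)) forwarded++;
        }
        System.out.println(forwarded); // prints 15: every other frame dropped
    }
}
```

In a real pipeline the timestamp would come from `System.nanoTime()` at the moment each camera frame arrives.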

Looking forward to your replies.

Thank you in advance
– Gleb

Hi @gleb.prischepa,

As mentioned in my previous reply, please post over in #fully-customizable-mobile-sdk:android so that we can assist. This category is only meant for the client SDK. :slightly_smiling_face:


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.