Unfortunately we do not yet provide the ability to customize the camera input feed through the Android Client SDK. Our recently released Fully Customizable SDK provides raw audio/video data access out of the box and may be worth considering. If you have any questions regarding that SDK, please post over in #fully-customizable-mobile-sdk:android and we’ll be happy to assist.
Apologies for the confusion, but the answer to both of the questions in your original post would be the same. This is not yet supported by the Client SDK, but would absolutely be possible with the Fully Customizable SDK.
1. How can I use my own camera implementation instead of the one delivered with the SDK?
The SDK provides the interface ZoomSDKVideoSourceHelper#setExternalVideoSource for supplying an external video source, so you can capture video and send the frames yourself.
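As a rough sketch of that external-video-source pattern (the `ZoomSDKVideoSourceHelper#setExternalVideoSource` name comes from the reply above; the `VideoFrameSink` interface and the NV21 helper below are hypothetical stand-ins, not actual SDK types): your capturer produces raw frames and pushes them through whatever sink the SDK hands you.

```java
import java.nio.ByteBuffer;

// Hypothetical sink standing in for the callback an external video
// source would receive from the SDK; NOT an actual SDK interface.
interface VideoFrameSink {
    void onFrame(ByteBuffer frame, int width, int height, int rotation);
}

// A custom capturer that produces NV21 frames and pushes them to the sink.
public class CustomVideoSource {
    private final VideoFrameSink sink;

    public CustomVideoSource(VideoFrameSink sink) {
        this.sink = sink;
    }

    // NV21 layout: a full-resolution Y plane followed by interleaved VU
    // at quarter resolution, i.e. width * height * 3 / 2 bytes total.
    static int nv21Size(int width, int height) {
        return width * height * 3 / 2;
    }

    // Push one synthetic mid-gray frame; a real capturer would copy
    // pixels from Camera2 / CameraX output here instead.
    public void pushFrame(int width, int height) {
        ByteBuffer frame = ByteBuffer.allocate(nv21Size(width, height));
        for (int i = 0; i < width * height; i++) frame.put((byte) 0x80); // Y plane
        while (frame.hasRemaining()) frame.put((byte) 0x80);             // VU plane
        frame.rewind();
        sink.onFrame(frame, width, height, 0);
    }
}
```

The point of the pattern is that the SDK only sees the buffers you hand it, so any camera pipeline (or even a non-camera source) can sit behind the sink.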
2. How can I add my own custom rendering? I need to process and draw every local and remote frame.
You can subscribe to raw data via ZoomSDKRenderer and render the frames yourself.
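To illustrate the subscribe-process-render flow (the `ZoomSDKRenderer` name is from the reply above; the frame-processing helper below is a hypothetical example, not SDK code): the raw-data callback hands you a YUV buffer, you transform it in place, then draw the result with your own renderer.

```java
import java.nio.ByteBuffer;

// Hypothetical per-frame processor applied between a raw-data
// subscription callback and your own renderer; NOT an SDK class.
public class FrameEffect {
    // Brighten the luma (Y) plane of an NV21 frame in place.
    // NV21: the first width*height bytes are Y; the rest is interleaved VU.
    public static void brighten(ByteBuffer nv21, int width, int height, int delta) {
        for (int i = 0; i < width * height; i++) {
            int y = nv21.get(i) & 0xFF;                  // unsigned luma value
            nv21.put(i, (byte) Math.min(255, y + delta)); // clamp to 255
        }
        // The chroma (VU) bytes are left untouched, so colors are preserved.
    }
}
```

A heavier effect (AR overlays, filters) would follow the same shape: mutate or copy the buffer inside the callback before it reaches the drawing surface.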
The Fully Customizable SDK can do this through sending raw data. Please see our documentation here, and if you have any additional questions regarding that SDK, feel free to create a new post in #fully-customizable-mobile-sdk:android .
I found that ZoomInstantSDKVideoSender can be used to process and send local camera frames, but I have not found a way to process and render local frames from the camera. Let’s say I want to apply an AR effect and show it to the user while the camera is in use. How can this be achieved using your API?
How can I send a texture instead of buffers in void sendVideoFrame(java.nio.ByteBuffer frameBuffer, …)?
Functional SDK/API
Does your SDK provide an API without UI components and camera?
For instance, Agora provides an SDK that allows creating a video session, sending and receiving video frames, and a bit more, but includes no camera or UI components.
As mentioned in my previous reply, please post over in #fully-customizable-mobile-sdk:android so that we can assist. This category is only meant for the client SDK.