I’m trying to send a static image as my bot’s video after it joins a meeting.
I created a YUV file from a single image.
Here is my implementation that sends the frames:
void PlayVideoFileToVirtualCamera(IZoomSDKVideoSender* video_sender, const std::string& video_source) {
    const int frame_rate = 25;
    const int frame_duration_ms = 1000 / frame_rate;

    std::ifstream file(video_source, std::ios::binary);
    if (!file.is_open()) {
        std::cout << "Failed to open file: " << video_source << std::endl;
        return;
    }

    // Calculate the frame size for I420 format (width/height are set elsewhere)
    std::cout << "width: " << width << " and height: " << height << " of the sending video" << std::endl;
    const int frameSize = width * height * 3 / 2;

    // Read the single frame once; std::vector frees itself, so no delete[] is needed
    std::vector<char> frameBuffer(frameSize);
    if (!file.read(frameBuffer.data(), frameSize)) {
        std::cout << "Failed to read a full frame from: " << video_source << std::endl;
        return;
    }

    std::cout << "Sending the image" << std::endl;
    // Resend the same frame at ~25 fps. Note: this loop never exits, so any
    // cleanup code placed after it is unreachable.
    while (true) {
        video_sender->sendVideoFrame(frameBuffer.data(), width, height, frameSize, 0);
        std::this_thread::sleep_for(std::chrono::milliseconds(frame_duration_ms));
    }
}
It works intermittently, seemingly at random.
When I join the meeting from an iPad, the image shows.
When I use the Mac Zoom client, the image does not appear.
After some further debugging, this call returns no errors:
When I’m on the Mac Zoom client, the support_cap_list length is 0.
When I’m on the iPad Zoom client, “onStartSend” is called and the thread is started (sending the frames doesn’t always work, but that’s a separate question). However, the “turnOn” command is never used. What is it?
Could you specify which Linux SDK sample you’re using? Since there are two different ones available, it would be helpful to know the specific SDK in order to better understand what might be happening.
Understood. As I showed in the previous comment, I created a CheckAndStartRawSendingBot method, called on the main thread, but unmuteVideo returns error code 11. Why is that?