How can I integrate the live captioning feature into my client app?

I found that we can enable the closed captioning feature and the live transcription service. After that, when I host a meeting with the Zoom client app, I can start live transcription and real-time captions are shown over the video.
Now I want to integrate this feature into my own app using the Zoom SDK integration (iOS, Android, and Web). Is it possible? How can I do it?

Hi @reamer,

Unfortunately, closed captions and live transcripts are not currently available through the Mobile SDKs.

The only API feature that relates to this is the recording-transcript-completed webhook, which is sent as soon as the transcript recording has been saved. You can use this hook to download the final transcript; however, it will only be available after the recording has completed, so it probably won't be of much use to you.
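If you do go down that route, a rough Swift sketch of pulling the transcript download link out of that webhook body might look like this. Treat it purely as an illustration: the field names (event, payload, object, recording_files, file_type, download_url) are my assumptions based on the general shape of Zoom's recording webhooks, so verify them against the webhook reference before relying on them.

import Foundation

// Hypothetical shape of the recording-transcript-completed payload.
// Field names are assumptions; confirm them against Zoom's webhook reference.
struct TranscriptWebhook: Decodable {
    struct RecordingFile: Decodable {
        let file_type: String      // expected to be "TRANSCRIPT" for transcript files
        let download_url: String
    }
    struct Object: Decodable {
        let recording_files: [RecordingFile]
    }
    struct Payload: Decodable {
        let object: Object
    }
    let event: String
    let payload: Payload
}

// Given the raw webhook request body, return the transcript download URL (if any).
func transcriptURL(from body: Data) throws -> URL? {
    let hook = try JSONDecoder().decode(TranscriptWebhook.self, from: body)
    let file = hook.payload.object.recording_files.first { $0.file_type == "TRANSCRIPT" }
    return file.flatMap { URL(string: $0.download_url) }
}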

I have done a little research to see whether any of the closed caption APIs allow you to send events to webhooks, and one that does is https://webcaptioner.com. It may be worth taking a look at them to see if you can integrate their API into your Zoom account, and then into your mobile app.

Sorry I couldn't be of more help!
Alex

Hey @reamer,

Thanks for using the dev forum!

On iOS you can show/hide the closed captions with closeCaptionHidden in MobileRTCMeetingSettings, and you can receive a closed caption message in MobileRTCMeetingServiceDelegate using the callback:
- (void)onClosedCaptionReceived:(NSString * _Nonnull)message;
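For reference, a minimal Swift sketch of wiring those two pieces together is below. It assumes the usual MobileRTC singleton accessors (MobileRTC.shared(), getMeetingSettings(), getMeetingService()) and that this object is registered as the meeting service delegate before the meeting starts; treat it as a sketch to adapt, not a drop-in implementation.

import MobileRTC

// Sketch only; adjust names to your SDK version.
class CaptionHandler: NSObject, MobileRTCMeetingServiceDelegate {

    // Make sure closed captions are not hidden in the meeting UI.
    func showCaptions() {
        MobileRTC.shared().getMeetingSettings()?.closeCaptionHidden = false
    }

    // Register this object so the closed caption callback is delivered here.
    func attach() {
        MobileRTC.shared().getMeetingService()?.delegate = self
    }

    // Swift counterpart of -onClosedCaptionReceived:, called once per caption message.
    func onClosedCaptionReceived(_ message: String) {
        print("Caption: \(message)")
        // Forward the text to your own caption label/view here.
    }
}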

Using the WebSDK you can only toggle closed captions on and off through isSupportCC.

Closed captioning is not currently supported on Android.

@alexmayo Thank you so much for your help :slight_smile:

Thanks!
Michael

Thank you very much for your input, @Michael_Condon!
Also, is it possible for the host user to start or stop closed captions with the SDK now?

Hi @Michael_Condon, I added -onClosedCaptionReceived, but it seems like it only receives messages when the closed caption was typed manually; captions from Live Auto Transcription were not received.

Hey @reamer

Not on iOS, unfortunately.

Thanks!
Michael

Hey @azanton

Nice to see you again :slight_smile:

That is correct, the auto transcriptions are not supported by the iOS SDK in this function.

Thanks!
Michael

@Michael_Condon, does Zoom have a schedule for supporting auto transcription in this callback, and on Android as well?

Hey @reamer,

It is on the roadmap for Android to match iOS in closed captioning features, and I will let the team know that iOS should support closed captioning similarly to the Windows SDK.

Unfortunately, I do not have a timeline for this.

Thanks!
Michael

Hey @Michael_Condon, any update or guidance on implementing the live caption feature in the Android SDK custom UI?

Hi @ashutosh.singh,

No updates on this yet. Be sure to keep an eye on our release notes. :slightly_smiling_face:

Thanks!

@jon.zoom Is there a timeline for live transcription in the Android SDK?

Hi @meir698, thanks for using the dev forum.

I assume you are asking about support for live transcription in custom UI mode? If so, we should be adding support for this in the next SDK release. We don’t have an exact timeline for this release yet, so be sure to keep an eye on the release notes linked in my last reply for the most up-to-date information. :slightly_smiling_face:

Thanks!

Hello @jon.zoom @Michael_Condon,

It seems live transcription is already supported in the Android SDK v5.9.0?
Is there any guide or reference on how we should integrate it, especially in custom UI mode?

Thank you

Hi @kevinxiao,

You are correct that live transcription support for custom UI mode was added in v5.9.0. Unfortunately, we do not yet have documentation on how to implement this feature, aside from the list of newly added methods in the SDK. We are working on adding this in the future.

Thanks!