I have been trying to fetch the participant QoS information for multiple meetings, but I keep running into the API rate threshold. In a meeting with one of our Zoom contacts, I was asked to use the API endpoint (https://api.zoom.us/v2/metrics/meetings/123456/participants/qos?), which should return a summary of the participant QoS so that I can avoid the threshold issue. However, this endpoint returns participant QoS with per-minute timestamps, and I am still facing the API threshold issue.
I am currently working on your issue and will respond to you as soon as I have any updates.
Hi,
Can you please let me know which endpoints you are calling and how often you are calling them?
This will help us understand your requirements so that we can give you specific recommendations.
Ojus
- If I am not wrong, you should be able to find the endpoint which I am trying to consume:
https://api.zoom.us/v2/metrics/meetings/123456/participants/qos/…?
- I am running a synchronous loop to fetch the participant QoS info for ~2K meetings; each meeting typically has 7-10 participants.
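For illustration only, a synchronous loop like the one described above might look roughly like the sketch below (the token, meeting IDs, and endpoint path are placeholders based on the URL quoted earlier, not the poster's actual code):

```python
import requests

# Placeholder credentials and IDs -- not real values.
ZOOM_TOKEN = "YOUR_OAUTH_TOKEN"
MEETING_IDS = ["123456", "234567"]  # ~2K meeting IDs in the real workload

def fetch_participant_qos(meeting_id):
    # Endpoint taken from the URL quoted above; it returns per-minute QoS
    # samples for every participant, so payloads and call counts add up fast.
    resp = requests.get(
        f"https://api.zoom.us/v2/metrics/meetings/{meeting_id}/participants/qos",
        headers={"Authorization": f"Bearer {ZOOM_TOKEN}"},
    )
    resp.raise_for_status()  # a 429 here is the rate-limit threshold being hit
    return resp.json()

# Synchronous loop with no pacing: ~2K back-to-back requests quickly
# exhaust the request budget for this Dashboard endpoint.
for meeting_id in MEETING_IDS:
    qos = fetch_participant_qos(meeting_id)
```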
Apologies for the delayed response.
We will have webhooks available by Q3, which should resolve the QoS threshold issue.
I will update this thread once the feature is available.
Please let me know if you have any questions.
Thanks,
Ojus
The only way you can get the summary data for EACH participant in the meeting is to use the Meeting ID from the meeting.started webhook event (docs link) and then long-poll the List Meeting Participants QoS API.
To prevent hitting the rate limit, I would not recommend more than one sample per meeting every 5 minutes, so for a 30-minute meeting you would have 6 API requests total.
Let’s use 100 meetings occurring simultaneously as an example (assuming they all begin at the exact same moment). The rate limit of 16 requests per second to this API means that if you have a controller out front managing the long-polling of these requests, you could make just one API request per second and cover all 100 meetings in 1 minute and 40 seconds. But remember, you can make 16 requests/second, so you should be able to handle roughly 10X the number of meetings, i.e. 1,000 simultaneous meetings. And that still doesn’t fill the 5-minute long-poll frequency I recommended earlier, so in theory you could handle 2-3 times that again, i.e. 2K-3K simultaneous meetings, while retrieving very useful samples for ALL meeting participants over the entire course of each meeting. You just have to push that data off to be aggregated and analyzed as you wish.
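A minimal sketch of such a controller is below. It assumes active meeting IDs arrive from your meeting.started webhook handler and that one sample per meeting every 5 minutes is enough; the token, endpoint path, query parameter, and pacing values are illustrative assumptions, not a definitive implementation:

```python
import time
import requests

ZOOM_TOKEN = "YOUR_OAUTH_TOKEN"   # placeholder
POLL_INTERVAL = 5 * 60            # one sample per meeting every 5 minutes
REQUEST_SPACING = 1.0             # ~1 request/second, far below the 16/second limit

# meeting_id -> timestamp of the last QoS sample we pulled.
# In practice this dict is filled by your meeting.started webhook handler
# and pruned when meeting.ended arrives.
active_meetings = {"123456": 0.0}

def poll_qos(meeting_id):
    """Pull one QoS sample covering every participant in a live meeting."""
    resp = requests.get(
        f"https://api.zoom.us/v2/metrics/meetings/{meeting_id}/participants/qos",
        headers={"Authorization": f"Bearer {ZOOM_TOKEN}"},
        params={"type": "live"},  # assumed query parameter for live meetings
    )
    resp.raise_for_status()
    return resp.json()

while True:
    now = time.time()
    for meeting_id, last_polled in list(active_meetings.items()):
        if now - last_polled < POLL_INTERVAL:
            continue                     # this meeting was sampled recently
        sample = poll_qos(meeting_id)
        # Hand `sample` off to whatever aggregates/analyzes your QoS data.
        active_meetings[meeting_id] = time.time()
        time.sleep(REQUEST_SPACING)      # spread requests to respect the rate limit
    time.sleep(1)
```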
If you don’t want to long-poll this system, you can also have a process running that receives the meeting.ended events, then makes a request to the List Meeting Participants API and, for each of those participants, retrieves that single meeting participant’s QoS data. Of course, this means you won’t be able to make as many API requests, so you have to rethink your strategy a bit.
Employing this second strategy means you have to keep count of each meeting.ended event, and of the number of participants per meeting, yourself. Why…?
(1 API request to get the list of participants + 1 API request for EACH participant to GET that participant’s QoS data) X the total # of meetings.
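As a rough worked example, using the ~2K meetings with 7-10 participants mentioned earlier in this thread (the exact numbers here are just illustrative assumptions):

```python
# Rough request budget for the meeting.ended strategy.
meetings = 2000
participants_per_meeting = 8                          # assumed from "7-10 participants"

requests_per_meeting = 1 + participants_per_meeting   # list call + one QoS call each
total_requests = requests_per_meeting * meetings
print(total_requests)                                 # 18000 requests to pace out
```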
Every time you get a meeting.ended event, you’ll need to make a request to get the LIST of meeting participants, and place those participants into another service’s queue. All that other service does is GET the individual participant’s QoS data, but it has to know how to properly THROTTLE itself based on the number of requests you have available at any given minute. This API throttle service must always know exactly how many Zoom API requests you’ve made, so it can send more or fewer requests to fill your rate limit up to a reasonable threshold (making sure to leave yourself a little wiggle room).
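A minimal sketch of that queue-plus-throttle arrangement, assuming a simple in-process queue and a fixed per-second budget; the endpoint paths, query parameters, token, field names, and budget value are assumptions, and in production you would likely use a real queue/broker and read the remaining budget from Zoom’s rate-limit response headers:

```python
import time
from collections import deque
import requests

ZOOM_TOKEN = "YOUR_OAUTH_TOKEN"   # placeholder
REQUESTS_PER_SECOND = 12          # stay under the 16/second limit, leaving wiggle room

participant_queue = deque()       # (meeting_id, participant_id) pairs awaiting QoS fetch

def zoom_get(path, **params):
    resp = requests.get(
        f"https://api.zoom.us/v2{path}",
        headers={"Authorization": f"Bearer {ZOOM_TOKEN}"},
        params=params,
    )
    resp.raise_for_status()
    return resp.json()

def on_meeting_ended(meeting_id):
    """Webhook handler: 1 request to list participants, then enqueue each one."""
    data = zoom_get(f"/metrics/meetings/{meeting_id}/participants", type="past")
    for participant in data.get("participants", []):
        participant_queue.append((meeting_id, participant["id"]))

def throttle_worker():
    """Drain the queue at a fixed pace so we never exceed the request budget."""
    spacing = 1.0 / REQUESTS_PER_SECOND
    while True:
        if participant_queue:
            meeting_id, participant_id = participant_queue.popleft()
            qos = zoom_get(
                f"/metrics/meetings/{meeting_id}/participants/{participant_id}/qos",
                type="past",
            )
            # Hand `qos` off to your aggregation/analysis pipeline here.
            time.sleep(spacing)
        else:
            time.sleep(0.5)   # idle until the next meeting.ended arrives
```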
Remember, the API Rate Limits at Zoom are currently calculated at the account level, NOT at the App+User level as you may expect. So you cannot create multiple apps that can run in parallel to circumvent this restriction. This may change in the future, but that’s the way it is for now.