We are looking to re-create the Zoom interface on Q-SYS touch panels via an API so that we are not bound to a native Zoom touch panel. This would let us build a far better-integrated user experience rather than being limited to injecting limited functionality through Zoom's Room Controls JSON profiles. Does anyone know where to begin with this idea?
You're on the right track. Zoom doesn't officially expose its full UI through an open API, but you can build a custom integration by combining the Zoom Rooms Control System API with Q-SYS's scripting capabilities. Start with the Zoom REST API for core meeting functions (join, leave, volume, etc.), and build the custom interface in Q-SYS using Lua or Block Controller logic. For deeper, real-time control, look into zCommand over SSH or ZoomOSC as workarounds. You won't get a 1:1 UI clone, but you can craft a clean, functional replica tailored to your environment.
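If you explore the ZoomOSC route, the Q-SYS side is essentially a scripting component sending OSC messages over UDP to the PC running ZoomOSC. Below is a minimal, untested sketch under some assumptions: ZoomOSC is running at a placeholder address `192.168.1.50` on UDP port 9090 (check the actual receive port in your ZoomOSC settings), the script lives in a Q-SYS scripting component with button controls named `MuteSelf` and `LeaveMeeting`, and the OSC address paths shown are examples that need to be verified against the command list for your ZoomOSC version.

```lua
-- Minimal sketch: sending ZoomOSC commands from a Q-SYS scripting component.
-- IP, port, control names, and OSC paths are assumptions; verify them for your setup.

local ZOOM_PC_IP   = "192.168.1.50"  -- placeholder: PC running ZoomOSC
local ZOOM_PC_PORT = 9090            -- placeholder: ZoomOSC receive port

local udp = UdpSocket.New()
udp:Open()  -- bind to any local interface/port

-- Build a no-argument OSC message: address string plus "," type-tag string,
-- each null-terminated and padded to a 4-byte boundary per the OSC spec.
local function OscMessage(address)
  local function pad(s)
    return s .. string.rep("\0", 4 - (#s % 4))
  end
  return pad(address) .. pad(",")
end

local function SendZoomOsc(address)
  udp:Send(ZOOM_PC_IP, ZOOM_PC_PORT, OscMessage(address))
end

-- Button handlers; the paths below are example ZoomOSC commands.
Controls.MuteSelf.EventHandler = function(ctl)
  if ctl.Boolean then
    SendZoomOsc("/zoom/me/mute")
  else
    SendZoomOsc("/zoom/me/unMute")
  end
end

Controls.LeaveMeeting.EventHandler = function()
  SendZoomOsc("/zoom/leaveMeeting")
end
```

The design keeps all meeting state on the PC running ZoomOSC; the Q-SYS script stays a thin command sender, and you would subscribe to ZoomOSC's outbound messages if you want feedback (mute state, participant lists) to drive the touch panel UI.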
Can you share any documentation on the Zoom Rooms Control System API and on implementing it with Q-SYS? We are looking to control a meeting running in a browser on a room PC, not on an actual Zoom Rooms compute device; just whatever Zoom meeting is in progress on the PC. Otherwise, is there a lightweight application that could be written to bridge control between an active meeting in a WebRTC browser client and a third-party control system?