Stream depth video

Is it possible to stream 16-bit depth data over Zoom while also streaming RGB?
I'm looking into solutions for streaming RGB-D data to another PC and then processing that data on the other end.

Hi @simon.finnie, thanks for using the dev forum.

Can you provide some more context about your use case? Based on your wording it sounds like you may be asking about Zoom client functionality (in which case, the Zoom Community would be a great resource), but I want to make sure I’m understanding correctly. :slightly_smiling_face:

Thanks!

Hey @jon.zoom,
The goal is to build an application that streams RGB and 16-bit depth data simultaneously from a Kinect v2 from one computer to another in real time (possibly using Zoom), with the application on the second computer reading and processing that data.
I believe the Video SDK is the correct place to ask about this, as it is the only product with raw video access.
Thanks for the help

Hi @simon.finnie,

Thanks for clarifying! In that case, yes I would say that the Video SDK would be the correct solution for this approach. I was unsure since you mentioned wanting to send the data “over zoom”, but this description clears things up. :slightly_smiling_face:

It sounds like your implementation would be primarily based around the SDK’s raw data feature set. The biggest caveat to be aware of is that all video data sent through our SDK is in YUV420p format, so a minor transformation would be required to get it into RGB.
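For reference, that transformation is a standard colour-space conversion. A minimal sketch in Python/NumPy, assuming BT.601 full-range coefficients and a planar YUV420p buffer (the actual frame layout and accessors exposed by the SDK may differ):

```python
import numpy as np

def yuv420p_to_rgb(frame: bytes, width: int, height: int) -> np.ndarray:
    """Convert a planar YUV420p buffer to an HxWx3 RGB array (BT.601, full range)."""
    y_size = width * height
    uv_size = y_size // 4
    y = np.frombuffer(frame, np.uint8, y_size).reshape(height, width).astype(np.float32)
    u = np.frombuffer(frame, np.uint8, uv_size, y_size).reshape(height // 2, width // 2)
    v = np.frombuffer(frame, np.uint8, uv_size, y_size + uv_size).reshape(height // 2, width // 2)
    # Upsample the quarter-resolution chroma planes to full resolution (nearest neighbour),
    # then centre them around zero.
    u = u.repeat(2, axis=0).repeat(2, axis=1).astype(np.float32) - 128.0
    v = v.repeat(2, axis=0).repeat(2, axis=1).astype(np.float32) - 128.0
    r = y + 1.402 * v
    g = y - 0.344136 * u - 0.714136 * v
    b = y + 1.772 * u
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)
```

Nearest-neighbour chroma upsampling is the simplest choice; bilinear interpolation gives slightly smoother colour edges if that matters for your processing.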

I’m not intimately familiar with how depth data is expressed by a Kinect device, but the command channel could be a great candidate for sending this between SDK instances.

If you have any specific questions about this approach, I’ll be happy to help clarify further.

Thanks!

Hey @jon.zoom ,

Thank you for getting back to me.
Just want to make sure I understand this correctly.
The format of the depth video isn’t directly important, since it can be converted, but the native data is a 640x480 frame of 16-bit monochrome data instead of the 8-bit, three-channel data in colour images.
It’s produced concurrently with the RGB at the same frame rate. Will the command channel work for this much data?
The depth data also doesn’t fill the entire image (i.e. some of the pixels in the depth image are blank), if that makes a difference to how the data could be optimized.

One more question: does Zoom require the video data to be sent at any particular frame rate? For example, if I wanted to send data at 30, 10, 1, or even 0.1 fps, would that be possible?

Thanks!

Hi @simon.finnie,

Whether or not the command channel will work depends on how much data you need to send, and over what period of time. Do you have a rough idea of the frequency or size?

One more question: does Zoom require the video data to be sent at any particular frame rate? For example, if I wanted to send data at 30, 10, 1, or even 0.1 fps, would that be possible?

If you are sending raw data through the SDK, I don’t see why this wouldn’t be possible. There will likely be a rate limit if you try to send frames too often, but I don’t believe there is a minimum required frame rate.

Thanks!

Hey @jon.zoom

We are talking 16 bits per pixel and 307,200 pixels (640x480), so just under 5 megabits per frame. The frequency would be at most 30 Hz, but more likely 10 Hz max. Possibly less if necessary.

Thanks!

Hi @simon.finnie,

Unfortunately that is going to be far too much data for the command channel to process.

Thanks!

Hey @jon.zoom

Yeah I figured as much. Are there any alternatives?
They are 16-bit images, so in theory each frame could be expressed as two channels of a colour image (e.g. the red and green channels), but I’d be concerned about how the image compression techniques you are using might impact the result.
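The byte-split itself is straightforward; the risk sits entirely in what the pipeline does afterwards. A NumPy sketch of the two-channel packing described above (lossless only if both channels survive untouched; the YUV420p conversion, chroma subsampling, and any lossy codec would all perturb the recovered values, hitting the low byte hardest):

```python
import numpy as np

def pack_depth(depth: np.ndarray) -> np.ndarray:
    """Split a uint16 depth frame into two uint8 planes: high byte, low byte."""
    high = (depth >> 8).astype(np.uint8)
    low = (depth & 0xFF).astype(np.uint8)
    return np.stack([high, low], axis=-1)

def unpack_depth(packed: np.ndarray) -> np.ndarray:
    """Recombine the two uint8 planes into a uint16 depth frame."""
    return (packed[..., 0].astype(np.uint16) << 8) | packed[..., 1]
```

Note that a single corrupted bit in the high channel shifts the recovered depth by at least 256 units, which is why naive byte-splitting tends to fare poorly through lossy video paths.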

Thanks!

Hi @simon.finnie,

I don’t think that there would be an alternative available. Depending on circumstances, the resolution of video that is sent can drop down as low as 90p. That would be a massive amount of data loss if your original resolution was anywhere in the HD range.

The biggest challenge for this, which may make it insurmountable on our end, is that it sounds like you’re looking to send an amount of data in excess of what our back end allows. We don’t have a ton of leeway on these restrictions, as easing them could directly result in major increases in the amount of bandwidth consumed per-client. If you have any ideas for a new SDK feature that would be possible without requiring more bandwidth, we can definitely pursue alternatives and/or submit a feature request. Just let me know :slightly_smiling_face:

Thanks!
