drawImage in Camera Mode returning an error

Following the Drawing images in Camera Mode documentation for Zoom Apps, I am trying to draw an image while in camera mode.

Steps to reproduce:

1. Start the rendering context in camera mode:

await zoomSdk.runRenderingContext({
    view: "camera"
})
2. Attempt to draw the image, e.g.:
const canvas = document.getElementById("canvas") as HTMLCanvasElement;
const ctx: CanvasRenderingContext2D = canvas.getContext("2d") as CanvasRenderingContext2D;
const img = new Image();

function draw(img: HTMLImageElement) {
    canvas.width = 1280;
    canvas.height = 720;

    // draw image
    ctx.drawImage(img, 0, 0);

    // get image data from canvas
    const imageData: ImageData = ctx.getImageData(0, 0, canvas.width, canvas.height);

    zoomSdk
        .drawImage({
            imageData,
            x: 0,
            y: 0,
            zIndex: 10
        })
        .then((ctx) => {
            console.log("drawImage returned", ctx);
        })
        .catch((e) => {
            console.log(e);
        });
}

img.onload = () => {
    draw(img);
};

img.src =
    "https://9bae041b1da7.jp.ngrok.io/assets/1@2x.jpg?0.0.14?w=164&h=164&fit=crop&auto=format";

This results in the following console error:

Is something missing from the documentation, or is there a bug with this feature?

Thanks!

Are you making sure that zoomSdk.runRenderingContext({ view: "camera" }) is called before you call await zoomSdk.drawImage({})?

drawImage needs the camera rendering context in order to draw an image.
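
If it helps, here is a minimal sketch of a sanity check you can run right before drawing. It assumes getRunningContext is included in your zoomSdk.config capabilities; the exact shape of its return value is my reading of the SDK docs, so treat it as an assumption:

import zoomSdk from "@zoom/appssdk";

// Sketch only: log the current running context just before calling drawImage,
// to confirm the camera rendering context is actually active at that point.
async function debugDraw(imageData: ImageData) {
    const runningContext = await zoomSdk.getRunningContext();
    console.log("running context before drawImage:", runningContext);

    return zoomSdk.drawImage({ imageData, x: 0, y: 0, zIndex: 1 });
}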

Yes, I'm sure zoomSdk.runRenderingContext({ view: "camera" }) is called before I call await zoomSdk.drawImage({}). When I call zoomSdk.runRenderingContext({ view: "camera" }), I get the error below.

Can you help me find out the reason? Thank you!

Client version: 5.11.3 (8937)

My code is as follows:

const ctx = await zoomSdk.runRenderingContext({
    view: "camera"
});

console.log("runRenderingContext returned", ctx);

zoomSdk
    .drawImage({
        imageData: new ImageData(100, 100),
        x: 0,
        y: 0,
        zIndex: 1
    })
    .then((ctx) => {
        console.log("drawImage returned", ctx);
    })
    .catch((e) => {
        console.log(e);
    });

Then I got the following error:

Looking forward to your help, thanks!

Can you try this approach: wait for the inCamera app to load first, then postMessage to the sidebar, then drawImage. You might also have to call drawWebView to see the result.
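
Roughly, the ordering I have in mind looks like the sketch below. The { cameraReady: true } payload is made up purely for illustration, and it assumes postMessage and onMessage are in the capability list of both app instances:

import zoomSdk from "@zoom/appssdk";

// In the app instance that loads inside the camera rendering context:
// once zoomSdk.config has resolved there, tell the sidebar instance it is ready.
await zoomSdk.postMessage({ cameraReady: true });

// In the sidebar instance (the one that called runRenderingContext):
zoomSdk.onMessage((event) => {
    // event.payload carries whatever the other instance posted
    if (event?.payload?.cameraReady) {
        // The camera app has loaded; now it should be safe to call
        // zoomSdk.drawImage(...) and, if needed, zoomSdk.drawWebView(...).
    }
});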

Hi Narmada, I have tried this approach, but I still get an error, as below.

If you can give me an example, I will be very grateful! :handshake:

Hi, this example should work:

function getMyImageData(width, height) {
    const canvas = document.createElement("canvas");
    canvas.width = width;
    canvas.height = height;

    const img = new Image(100, 200);
    img.src = zoomlogo; // insert any image here

    const ctx = canvas.getContext("2d");
    ctx.drawImage(img, 0, 0, width, height);
    return ctx.getImageData(0, 0, width, height);
}

async function testDrawImage() {
    const imageHeight = 100;
    const imageWidth = 500;
    const imageData = getMyImageData(imageWidth, imageHeight);
    await zoomSdk
        .drawImage({
            imageData: imageData,
            x: 0,
            y: 720 - imageHeight,
            zIndex: 2,
        })
        .then((result) => {
            console.log(result);
        })
        .catch((error) => console.error("Unable to call drawImage ", error));
}

output:

Hi Narmada Ravali, thanks for your help. I tried your example but got the same error again. Is it because I missed some other configuration?

Below is the full code:

import React from "react";
import Button from "@mui/material/Button";
import zoomSdk from "@zoom/appssdk";
import { IMAGES } from "./constants";

const MY_IMAGE = "https://9bae041b1da7.jp.ngrok.io/assets/1@2x.jpg";

const CameraMode = () => {
    const Start = async () => {
        const configRes = await zoomSdk.config({
            size: { width: 480, height: 360 },
            capabilities: [
                // The following are needed for Layers API.
                // Include any other capabilities your app needs here, too.
                `getRunningContext`,
                `runRenderingContext`,
                `closeRenderingContext`,
                `drawParticipant`,
                `clearParticipant`,
                `drawImage`,
                `clearImage`,
                `drawWebView`,
                `clearWebView`,
                `postMessage`,
                `sendAppInvitationToAllParticipants`,
                `onMessage`,
                `onMyMediaChange`
            ]
        });

        console.log("configRes", configRes);

        const renderingCtx = await zoomSdk.runRenderingContext({ view: "camera" });

        console.log("runRenderingContext returned", renderingCtx);

        function getMyImageData(width, height) {
            const canvas = document.createElement("canvas");
            canvas.width = width;
            canvas.height = height;

            const img = new Image(100, 200);
            img.src = MY_IMAGE; // insert any image here

            const ctx = canvas.getContext("2d");

            ctx.drawImage(img, 0, 0, width, height);
            return ctx.getImageData(0, 0, width, height);
        }

        async function testDrawImage() {
            const imageHeight = 100;
            const imageWidth = 500;
            const imageData = getMyImageData(imageWidth, imageHeight);
            await zoomSdk
                .drawImage({
                    imageData: imageData,
                    x: 0,
                    y: 720 - imageHeight,
                    zIndex: 2
                })
                .then((result) => {
                    console.log(result);
                })
                .catch((error) => console.error("Unable to call drawImage ", error));
        }

        testDrawImage();
    };

    return (
        <div>
            <Button
                onClick={() => {
                    Start();
                }}
            >
                Start
            </Button>
            <canvas id="canvas" />
            <img src={MY_IMAGE} style={{ display: "none" }} />
        </div>
    );
};

export default CameraMode;

Are you using a free ngrok account? Also, are you on a Mac?

I use a paid ngrok account, on my Mac. Immersive mode works for me, and the other APIs also work, except drawImage in camera mode.

It works for me now, thanks everyone :grinning:

Hi Wei, what changes did you make? Glad you are unblocked.

Hi Narmada Ravali, I call the zoomSdk.config and zoomSdk.runRenderingContext methods in a useEffect hook, then draw an image when the user selects one, and it draws successfully. Some of the code is below:

// App.tsx
useEffect(() => {
    const initial = async () => {
        if (!isZoomApp) {
            return;
        }

        const configRes = await zoomSdk.config({
            size: { width: 480, height: 360 },
            capabilities: [
                // The following are needed for Layers API.
                // Include any other capabilities your app needs here, too.
                `getRunningContext`,
                `runRenderingContext`,
                `closeRenderingContext`,
                `drawParticipant`,
                `clearParticipant`,
                `drawImage`,
                `clearImage`,
                `drawWebView`,
                `clearWebView`,
                `postMessage`,
                `sendAppInvitationToAllParticipants`,
                `onMessage`,
                `onMyMediaChange`,
                `onReaction`
            ]
        });

        await zoomSdk.runRenderingContext({ view: "camera" });
    };

    initial();
}, []);

useEffect(() => {
    let src = currImg[size];
    if (showOverlays && src) {
        drawImg(src).then(({ imageId }) => {
            drawImgRef.current = imageId;
        });
    } else {
        zoomSdk.clearImage({
            imageId: drawImgRef.current
        });
    }
}, [showOverlays, currImg, size]);

// utils/index.ts
const CANVAS_WIDTH = 1280;
const CANVAS_HEIGHT = 720;

const drawImg = (
    src: string
): Promise<{
    imageId: string;
}> => {
    const canvas = document.createElement("canvas");
    const ctx: CanvasRenderingContext2D = canvas.getContext("2d") as CanvasRenderingContext2D;
    canvas.width = CANVAS_WIDTH;
    canvas.height = CANVAS_HEIGHT;

    // coordinate reference adjustment
    let scale = -1;
    ctx.translate(canvas.width, 0);
    ctx.scale(scale, 1);

    const img = new Image();
    img.src = src;
    ctx.drawImage(img, 0, 0, img.width, img.height, canvas.width - img.width, 0, img.width, img.height);

    // coordinate reference restoration
    ctx.setTransform(1, 0, 0, 1, 0, 0);

    const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);

    return zoomSdk.drawImage({
        imageData,
        x: 0,
        y: 0,
        zIndex: 2
    });
};
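
One caveat about drawImg above: it reads the canvas right after setting img.src, so it relies on the image already being available (e.g., cached or preloaded). If the image comes from a remote URL, a variant that explicitly waits for the load might look like this (same mirroring and zoomSdk.drawImage call as above, only the loading step is new):

// Variant of drawImg that waits for the image to finish loading before reading pixels.
// CANVAS_WIDTH and CANVAS_HEIGHT are the constants defined above.
const drawImgWhenLoaded = async (src: string): Promise<{ imageId: string }> => {
    const img = new Image();
    await new Promise<void>((resolve, reject) => {
        img.onload = () => resolve();
        img.onerror = () => reject(new Error(`Image failed to load: ${src}`));
        img.src = src;
    });

    const canvas = document.createElement("canvas");
    const ctx = canvas.getContext("2d") as CanvasRenderingContext2D;
    canvas.width = CANVAS_WIDTH;
    canvas.height = CANVAS_HEIGHT;

    // coordinate reference adjustment (mirror horizontally), as in drawImg above
    ctx.translate(canvas.width, 0);
    ctx.scale(-1, 1);
    ctx.drawImage(img, 0, 0, img.width, img.height, canvas.width - img.width, 0, img.width, img.height);

    // coordinate reference restoration
    ctx.setTransform(1, 0, 0, 1, 0, 0);

    return zoomSdk.drawImage({
        imageData: ctx.getImageData(0, 0, canvas.width, canvas.height),
        x: 0,
        y: 0,
        zIndex: 2
    });
};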

I am facing the same issue. Can you please share the file where you made these changes or, if possible, send the GitHub link for this application?

It seems like there might be an issue with the image source or the rendering context in camera mode. Double-check the image URL, and ensure that the rendering context is properly initialized.
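
For example, a quick way to make a rendering-context problem visible is to wrap the call and log any failure (this reuses the same runRenderingContext call from earlier in the thread, just with explicit error handling):

import zoomSdk from "@zoom/appssdk";

// Sketch: surface runRenderingContext failures before attempting any draw calls.
try {
    const res = await zoomSdk.runRenderingContext({ view: "camera" });
    console.log("runRenderingContext succeeded:", res);
} catch (e) {
    console.error("runRenderingContext failed; drawImage will not work until this succeeds:", e);
}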