Switch the WebGPU Canvas Interop to be WebGPU first #53
Hi! This proposal is interesting, but it does look like a mirror version of the existing one. Regarding the issues you are bringing up:
Note that in Chromium's current implementation of …
This is an issue in both proposals. If we transfer from Canvas2D to WebGPU, we need to configure WebGPU to use the Canvas2D texture format. If we transfer from WebGPU to Canvas2D, we need to make sure the WebGPU texture was created with the right format in the first place, or else Canvas2D won't be able to use it. Because Canvas2D is the more restrictive of the two, it makes sense for the restrictive Canvas2D to create the texture and the more flexible WebGPU to accept it (Postel's law: "Be liberal in what you accept, and conservative in what you send").
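As a concrete illustration of the format constraint, here is a minimal sketch using only standard WebGPU calls (the transfer methods themselves are deliberately left out, and canvas is assumed to be an existing canvas element in scope):

```js
const adapter = await navigator.gpu.requestAdapter();
const device = await adapter.requestDevice();

// Whichever side creates the texture has to pick a format the 2D canvas can
// actually display, not just any format WebGPU happens to support.
const texture = device.createTexture({
  size: [canvas.width, canvas.height],
  format: navigator.gpu.getPreferredCanvasFormat(), // e.g. 'bgra8unorm', canvas-friendly
  // format: 'rgba16float',                         // fine for WebGPU rendering, but not
                                                    // something a plain 2D canvas is
                                                    // guaranteed to accept
  usage: GPUTextureUsage.RENDER_ATTACHMENT | GPUTextureUsage.TEXTURE_BINDING,
});
```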
I'm curious to hear more about this. Why would the canvas format change on each transfer? Wouldn't that make it even more of a headache if Canvas2D wasn't the one creating the texture? You'd need WebGPU to create textures with a different format on every transfer. Could you clarify?
Can you expand on what you find strange here? Changing the size of the canvas resets the canvas to its default state, as if newly created. It was the de facto and only way to reset a canvas before the …
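For reference, the idiom being referred to is plain, long-standing Canvas2D behavior:

```js
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');
ctx.fillStyle = 'green';
ctx.fillRect(0, 0, 50, 50);

// Re-assigning the width (even to the same value) clears the bitmap and resets
// the 2D context state (transform, styles, clip, ...) back to its defaults.
canvas.width = canvas.width;
```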
IIUC, there is a proposal to allow canvas to transferToWebGPU.
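For context, that Canvas2D-first flow looks roughly like this (a sketch only: transferToWebGPU is the name used above, but its exact signature, the device argument, and the transferBackFromWebGPU counterpart are assumptions, not the actual proposal text):

```js
const ctx = canvas.getContext('2d');
ctx.fillRect(0, 0, 100, 100);                   // start by drawing with Canvas2D

const texture = ctx.transferToWebGPU(device);   // hand the canvas's backing texture to WebGPU
// ... encode WebGPU render passes that draw into `texture` ...
ctx.transferBackFromWebGPU(texture);            // assumed reverse call so the canvas can display it
```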
This seems like it has several issues.
Instead of starting with Canvas2D and transferring to WebGPU, would it be better to start with WebGPU and transfer to Canvas2D?
Advantages:
- You can render 2D content into any texture, not just the canvas texture behind webgpuContext.getCurrentTexture().
- If you want a texture displayed in a canvas, you just transfer it to the context.
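Something along these lines; every non-standard name below (getCanvas2DContextForTexture, transferTextureToCanvas) is invented purely to illustrate the shape of the idea, not a proposed spelling:

```js
const adapter = await navigator.gpu.requestAdapter();
const device = await adapter.requestDevice();

// Start in WebGPU: create whatever texture you want, canvas-backed or not.
const texture = device.createTexture({
  size: [512, 512],
  format: navigator.gpu.getPreferredCanvasFormat(),
  usage: GPUTextureUsage.RENDER_ATTACHMENT | GPUTextureUsage.TEXTURE_BINDING,
});

// Invented: get a 2D drawing interface that targets that texture directly.
const ctx2d = device.getCanvas2DContextForTexture(texture);
ctx2d.fillStyle = '#f00';
ctx2d.fillRect(0, 0, 256, 256);

// Invented: only if you want the result on screen, hand the texture to a
// canvas's WebGPU context for presentation.
const presentCtx = document.querySelector('canvas').getContext('webgpu');
presentCtx.transferTextureToCanvas(texture);
```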
I'm not saying the above API is perfect. I'm just trying to throw out the idea that it might be more useful to start at WebGPU instead of starting at Canvas2D. In particular, being able to render to any texture, not just a "canvas texture", seems useful. It also seems like a more interesting model: a Canvas2DRenderingContext is just an API that draws to a texture, any texture, not just canvas-blessed textures.