readRenderTargetPixels provides semantically incorrect behavior #21934

Open
jceipek opened this issue Jun 1, 2021 · 2 comments


jceipek commented Jun 1, 2021

Describe the bug

TLDR: the implementation and interface of readRenderTargetPixels assume that a WebGL context's readPixels function accepts the format and type of the render target's texture. This is not true per the spec or in practice, and it causes errors when reading pixels from a render target whose texture format and type do not match gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT) and gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE).


Per the spec (https://www.khronos.org/registry/webgl/specs/latest/1.0/#5.14.12), readPixels reads from the framebuffer and accepts two possible combinations for format and type (the second is sketched below):

  1. Format RGBA and type UNSIGNED_BYTE
  2. An implementation-defined format and type
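
The second combination can only be discovered at runtime. A minimal sketch of the query in plain WebGL, not three.js (assumes a framebuffer with the attachment to be read is already bound):

```js
// Query the implementation-defined read format/type for the currently
// bound framebuffer. These values depend on the framebuffer's attachment.
const canvas = document.createElement( 'canvas' );
const gl = canvas.getContext( 'webgl2' );

// ... create and bind the framebuffer whose attachment will be read ...

const readFormat = gl.getParameter( gl.IMPLEMENTATION_COLOR_READ_FORMAT );
const readType = gl.getParameter( gl.IMPLEMENTATION_COLOR_READ_TYPE );

// readPixels is only guaranteed to accept RGBA/UNSIGNED_BYTE or the
// ( readFormat, readType ) pair reported above.
```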

In case 2, there is no guarantee that the format and type will line up with those of the texture attached to the bound framebuffer. However, per the source:

this.readRenderTargetPixels = function ( renderTarget, x, y, width, height, buffer, activeCubeFaceIndex ) {

readRenderTargetPixels passes the render target texture's type and format straight to the underlying WebGL context's readPixels call. If they don't match the implementation-defined pair, readRenderTargetPixels logs an error to the console and returns without filling the passed buffer.

For a practical example of where this is a problem, consider a texture with format RGBA_INTEGER, internal format RGBA16UI, and type UNSIGNED_SHORT. Some implementations choose RGBA_INTEGER and UNSIGNED_SHORT as the implementation-defined color read format and type, in which case everything works. Others choose RGBA_INTEGER and UNSIGNED_INT (for example, Firefox 89.0b15 on my 2019 16-inch MacBook Pro with MacOS 11.2.3), which produces the error 'THREE.WebGLRenderer.readRenderTargetPixels: renderTarget is not in UnsignedByteType or implementation defined type.'
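
A hedged repro sketch (the setup is illustrative rather than copied from an existing example; assumes a WebGL2-capable renderer, plus a scene whose material writes unsigned-integer output, and a camera):

```js
// Integer render target whose texture format/type does not necessarily
// match the implementation-defined read format/type.
const renderTarget = new THREE.WebGLRenderTarget( 4, 4 );
renderTarget.texture.format = THREE.RGBAIntegerFormat;
renderTarget.texture.internalFormat = 'RGBA16UI';
renderTarget.texture.type = THREE.UnsignedShortType;

renderer.setRenderTarget( renderTarget );
renderer.render( scene, camera );

// readRenderTargetPixels forwards the texture's format/type straight to
// gl.readPixels. On implementations that report UNSIGNED_INT as the read
// type (e.g. Firefox on macOS), it logs the error quoted above and the
// buffer is never filled.
const buffer = new Uint16Array( 4 * 4 * 4 );
renderer.readRenderTargetPixels( renderTarget, 0, 0, 4, 4, buffer );
```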

Expected behavior

I'd like some help determining what the ideal form of readRenderTargetPixels should be to properly support implementation-defined reading and would be happy to open a PR if someone could help me come up with a good interface for it.

There are a few things that make this tricky to get right:

  1. Because readPixels allows an implementation-defined format and type, browsers are free to do whatever they want. Firefox on MacOS and Linux appears particularly prone to choosing read formats/types that differ from other browsers. I haven't yet discovered a case where the format differs based on GPU, but that is a distinct possibility.
  2. The buffer that readPixels accepts must match the type parameter. For example, gl.UNSIGNED_INT requires a Uint32Array while gl.UNSIGNED_SHORT requires a Uint16Array.
  3. The implementation-defined read format and type vary based on the texture attached to the currently bound framebuffer.

From an API user's perspective, the only way to use readPixels correctly with an implementation-defined read format and type is to query them and use the result to decide what size and kind of buffer to allocate. In practice, I think the implementation will always choose a format that is sufficiently large to hold the requested information, but if the implementation picks a wider format or type than the one specified in the texture you're reading from, extracting the data you actually care about is tricky. For example, some implementations only support reading from a texture with format RED_INTEGER, internal format R32UI, and type UNSIGNED_INT using a read format of RGBA_INTEGER (vs RED_INTEGER in most implementations) and type UNSIGNED_INT. The user of readPixels needs to know to supply a Uint32Array sufficiently large to hold the results and needs to know how to interpret the data.
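
In other words, the only portable pattern today looks roughly like this (plain WebGL sketch; gl, framebuffer, width and height are assumed to already exist):

```js
// The read format/type must be queried after binding the framebuffer,
// because they depend on the attached texture.
gl.bindFramebuffer( gl.FRAMEBUFFER, framebuffer );

const format = gl.getParameter( gl.IMPLEMENTATION_COLOR_READ_FORMAT );
const type = gl.getParameter( gl.IMPLEMENTATION_COLOR_READ_TYPE );

// Simplified channel count; a real implementation would also handle RG/RGB.
const channels = ( format === gl.RED || format === gl.RED_INTEGER ) ? 1 : 4;
const length = width * height * channels;

// The typed array must match the reported type, not the texture's type.
let buffer;
if ( type === gl.UNSIGNED_BYTE ) buffer = new Uint8Array( length );
else if ( type === gl.UNSIGNED_SHORT ) buffer = new Uint16Array( length );
else if ( type === gl.UNSIGNED_INT ) buffer = new Uint32Array( length );
else if ( type === gl.FLOAT ) buffer = new Float32Array( length );

gl.readPixels( 0, 0, width, height, format, type, buffer );

// The caller still has to map the (possibly wider) read format/type back
// onto the channels the texture actually contains.
```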

Platform:

  • Device: All devices
  • OS: All OSes
  • Browser: All browsers
  • Three.js version: [dev, r129]
Mugen87 added the Design label Jun 1, 2021

mrdoob commented Jun 16, 2021

Any chance you can put together a PR with a solution for this?


jceipek commented Jun 21, 2021

Any chance you can put together a PR with a solution for this?

@mrdoob I'd be happy to, but as I mentioned in the issue:

I'd like some help determining what the ideal form of readRenderTargetPixels should be to properly support implementation-defined reading and would be happy to open a PR if someone could help me come up with a good interface for it.

I don't know what a good API for this would look like, and I'm not sure what to do with the existing interface, which is fundamentally incompatible with the way readPixels actually works.
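
For the sake of discussion only, here is one hypothetical shape (the method name and return value are made up; nothing like this exists in three.js today): the renderer would do the query itself and report back what it actually read, so the caller can allocate and interpret the buffer correctly.

```js
// Hypothetical API sketch, not an existing or proposed three.js method.
const result = renderer.readRenderTargetPixelsAuto( renderTarget, 0, 0, width, height );
// result.buffer -> typed array sized and typed for the reported format/type
// result.format -> the format actually passed to gl.readPixels
// result.type   -> the type actually passed to gl.readPixels
```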
