readRenderTargetPixels provides semantically incorrect behavior #21934
Describe the bug
TLDR: the implementation and interface of `readRenderTargetPixels` assume that a WebGLContext's `readPixels` function accepts the format and type of the render target's texture. This is not true per the spec or in practice, and it causes errors when trying to read pixels from a render target's texture when `gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT)` and `gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE)` do not match the render target texture's `format` and `type`.

Per the spec here https://www.khronos.org/registry/webgl/specs/latest/1.0/#5.14.12, `readPixels` reads from the frame buffer and accepts only two possible combinations for `format` and `type`:

1. format `RGBA` and type `UNSIGNED_BYTE`
2. the implementation-defined format and type reported by `gl.IMPLEMENTATION_COLOR_READ_FORMAT` and `gl.IMPLEMENTATION_COLOR_READ_TYPE`
In case 2, there is no guarantee that the format and type will line up with the format and type of the texture attached to the bound frame buffer.
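For reference, here is a minimal sketch of what case 2 looks like against a raw WebGL context (assuming a context `gl` with the render target's framebuffer already bound, and `width` / `height` placeholders of my own):

```js
// Minimal sketch of "case 2": ask the implementation what it is willing to
// return for the currently bound framebuffer, then read with exactly that.
// `gl`, `width`, and `height` are assumed to already be in scope.
const readFormat = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT);
const readType = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);

// The typed array's element type must match `readType`; this line assumes the
// implementation reported gl.UNSIGNED_SHORT and a four-channel read format.
const buffer = new Uint16Array(width * height * 4);
gl.readPixels(0, 0, width, height, readFormat, readType, buffer);
```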
However, per the source code here (three.js/src/renderers/WebGLRenderer.js, line 1962 at 370504b), `readRenderTargetPixels` uses the texture type and format of the render target's texture as the values it passes to the underlying WebGLContext's `readPixels` function. If they don't line up, `readRenderTargetPixels` logs an error to the console and returns without filling the passed buffer.

For a practical example of where this is a problem, consider a texture with format `RGBA_INTEGER`, internal format `RGBA16UI`, and type `UNSIGNED_SHORT`. Some implementations choose `RGBA_INTEGER` and `UNSIGNED_SHORT` as the implementation-defined color read format and type, in which case everything works. Others choose `RGBA_INTEGER` and `UNSIGNED_INT` (for example, Firefox 89.0b15 on my 2019 16-inch MacBook Pro with macOS 11.2.3), which produces the error 'THREE.WebGLRenderer.readRenderTargetPixels: renderTarget is not in UnsignedByteType or implementation defined type.'
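A rough sketch of a setup that hits this case (illustrative only; the render target options, the `internalFormat` string, and the use of `THREE.RGBAIntegerFormat` / `THREE.UnsignedShortType` reflect my scenario and assume a WebGL2-capable renderer):

```js
import * as THREE from 'three';

// Sketch reproducing the mismatch; `width` and `height` are placeholders.
const width = 4, height = 4;
const renderer = new THREE.WebGLRenderer();

const renderTarget = new THREE.WebGLRenderTarget(width, height, {
  format: THREE.RGBAIntegerFormat, // RGBA_INTEGER
  type: THREE.UnsignedShortType,   // UNSIGNED_SHORT
});
renderTarget.texture.internalFormat = 'RGBA16UI';

// ...render something into renderTarget here...

const buffer = new Uint16Array(width * height * 4);
// On implementations whose IMPLEMENTATION_COLOR_READ_TYPE is UNSIGNED_INT
// (e.g. Firefox 89.0b15 on macOS), this logs the error quoted above and
// returns without filling `buffer`, because the texture's UNSIGNED_SHORT type
// does not match the implementation-defined read type.
renderer.readRenderTargetPixels(renderTarget, 0, 0, width, height, buffer);
```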
Expected behavior
I'd like some help determining what the ideal form of `readRenderTargetPixels` should be to properly support implementation-defined reading, and I'd be happy to open a PR if someone could help me come up with a good interface for it. There are a few things that make this tricky to get right:

- The fact that `readPixels` only allows implementation-defined formats (beyond `RGBA` / `UNSIGNED_BYTE`) means that browsers are free to do whatever they want. Firefox on macOS and Linux appears particularly prone to defining read formats/types that differ from other browsers. I haven't yet discovered a case where the format differs based on GPU, but that is a distinct possibility.
- The kind of typed array that `readPixels` accepts must match the `type` parameter (see the helper sketched after this list). For example, `gl.UNSIGNED_INT` requires a `Uint32Array` while `gl.UNSIGNED_SHORT` requires a `Uint16Array`.
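To make the second point concrete, here is a hypothetical helper (the name `typedArrayForReadType` is mine, not an existing three.js or WebGL API) mapping a read type to the typed-array constructor that `readPixels` expects:

```js
// Hypothetical helper: choose the typed-array constructor that gl.readPixels
// requires for a given implementation-defined read type.
function typedArrayForReadType(gl, type) {
  switch (type) {
    case gl.UNSIGNED_BYTE: return Uint8Array;
    case gl.UNSIGNED_SHORT: return Uint16Array;
    case gl.UNSIGNED_INT: return Uint32Array;
    case gl.BYTE: return Int8Array;
    case gl.SHORT: return Int16Array;
    case gl.INT: return Int32Array;
    case gl.FLOAT: return Float32Array;
    default: throw new Error('Unsupported read type: ' + type);
  }
}
```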
From an API user perspective, the only way to use `readPixels` correctly with an implementation-defined read format and type is to query the format and type and use the result to decide what size buffer to use. In practice, I think the implementation will always choose a format that is sufficiently large to obtain the requested information, but extracting the data you actually care about when the implementation defines a wider format or type than that specified in the texture you're reading from is tricky. For example, some implementations only support reading from a texture with format `RED_INTEGER`, internal format `R32UI`, and type `UNSIGNED_INT` using a read format of `RGBA_INTEGER` (vs `RED_INTEGER` in most implementations) and type `UNSIGNED_INT`. The user of the `readPixels` function needs to know to supply a `Uint32Array` sufficiently large to hold the results and needs to know how to interpret the data.
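Here is a sketch of what a caller currently has to do by hand for that `R32UI` case against a raw WebGL context (all variable names are placeholders; the render target's framebuffer is assumed to be bound already):

```js
// Sketch: reading a RED_INTEGER / R32UI attachment on an implementation whose
// color read format is RGBA_INTEGER rather than RED_INTEGER.
const readFormat = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT); // e.g. RGBA_INTEGER
const readType = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);     // e.g. UNSIGNED_INT

// Four channels must be allocated even though only the red channel holds data.
const channels = readFormat === gl.RGBA_INTEGER ? 4 : 1;
const raw = new Uint32Array(width * height * channels); // assumes readType === gl.UNSIGNED_INT
gl.readPixels(0, 0, width, height, readFormat, readType, raw);

// Extract the red channel into the buffer the caller actually wanted.
const red = new Uint32Array(width * height);
for (let i = 0; i < red.length; i++) red[i] = raw[i * channels];
```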
Platform:

- Device: 2019 16-inch MacBook Pro
- OS: macOS 11.2.3
- Browser: Firefox 89.0b15