Add a texture coordinates test #82
It might be better to have the coordinates stretch only partway (for example, from (0,0) to (0.3, 0.3)), such that the min/max values make it clear which section is being asked to render.
Then, maybe, something like this, including some test cases. (Edit: And maybe dedicated images, only ranging from (0,0) to (0.25, 0.25)?) (I'd also include a few words about that in the "glTF Overview for glTF 2.0", but only if it can be wrapped into a few clear, unambiguous statements.)
This issue might eventually be covered by #89, but maybe I'll add a dedicated, minimalistic texture coordinates test nevertheless. (Note: In the screenshot that @pjcozzi posted, the texture coordinates that are displayed in the texture are once more flipped compared to the latest state of the discussion here. And I think that at least one of the original Sketchfab glTF test models suffered from flipped textures in glTF 1.0, which they somehow corrected with a custom workaround.)
@javagl Does my new sample file help straighten this out? In particular, in the screenshots above, you've got the wrong coordinates. The glTF (1.0 and 2.0) UV origin is in the upper left, not the lower left. If you were to create a model where the XY positions matched the UV positions exactly, then yes, the image would appear upside down on the polygon.
This, indeed, was the crucial point in the initial question. So the texture will appear to be flipped. This was already the case in glTF 1.0, but the spec updates might have caused some confusion here (KhronosGroup/glTF#1021), and I wondered whether this behavior was supposed to change in 2.0. The sample file should certainly help, but admittedly, I did not yet have the chance to try it out. And by the way, if I understood this correctly, then one of the key messages of the spec update was that WebGL-based implementations will have to flip the image data when loading textures.
WebGL implementations of glTF set a flag that flips incoming images vertically. Desktop OpenGL implementations must manually "flip", either by vertically flipping each incoming texture image, or by applying a corresponding transform to the texture coordinates.
That depends on how an application loads image data. Let's say that the app uses OpenGL and the PNG decoder outputs the image with an upper-left origin (most likely). In that case, the app could flip the pixels manually or tweak the shaders. Otherwise, if the PNG decoder outputs the image with a lower-left origin (maybe it has a flag for such a case), no additional action is required. Btw, the same logic should apply to OpenGL ES.
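A minimal sketch of the manual flip mentioned above (the helper name `flipRows` and the in-place approach are my own illustration, not from any glTF tooling): it reverses the row order of a decoded pixel buffer, so that, e.g., a top-to-bottom decoder output can be handed to an API that expects bottom-to-top rows.

```javascript
// Vertically flip a decoded image in place by swapping pixel rows.
// Illustrative only: a real loader may instead configure its decoder
// or adjust the shader, as discussed above.
function flipRows(pixels, width, height, channels = 4) {
  const rowSize = width * channels;
  const tmp = new Uint8Array(rowSize);
  for (let y = 0; y < Math.floor(height / 2); y++) {
    const top = y * rowSize;
    const bottom = (height - 1 - y) * rowSize;
    tmp.set(pixels.subarray(top, top + rowSize));
    pixels.copyWithin(top, bottom, bottom + rowSize);
    pixels.set(tmp, bottom);
  }
  return pixels;
}

// A 2x2 single-channel image: rows [1, 2] and [3, 4]
const img = new Uint8Array([1, 2, 3, 4]);
flipRows(img, 2, 2, 1);
console.log(Array.from(img)); // [3, 4, 1, 2]: row order reversed
```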
So this is how I would integrate this into the Overview:
(I'd say that the implementations should flip the image after loading it. Flipping the texture coordinates in the shader would probably raise questions of its own.)
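For comparison, the shader-side alternative amounts to sampling at (u, 1 - v). A CPU-side sketch of the same transform (purely illustrative, not from any glTF tooling):

```javascript
// Mirror the v coordinate, turning an upper-left UV origin into a
// lower-left one (or vice versa); u is unaffected.
function flipV([u, v]) {
  return [u, 1.0 - v];
}

console.log(flipV([0.25, 0.0])); // [0.25, 1]
```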
Please remove the "Desktop-" part, since "OpenGL or OpenGL ES" is quite clear (maybe we should also add small-print trademark info at the bottom of the overview).
Great to see those nice examples!
Sorry, don't want to confuse everyone, but... I currently think that, usually, the OpenGL renderer still won't have to flip anything, and the same holds for the WebGL renderer. As also pointed out by @lexaknyazev (KhronosGroup/glTF#1021 (comment)):
OpenGL assumes that the order of the rows in the texture data is "bottom-to-top". See the documentation of glTexImage2D:

> The first element corresponds to the lower left corner of the texture image. Subsequent elements progress left-to-right through the remaining texels in the lowest row of the texture image, and then in successively higher rows of the texture image. The final element corresponds to the upper right corner of the texture image.
Here's an example to illustrate what I mean: Is this drawn correctly? In this case, we just have to make sure to load the first texel from the upper-left part of the input image, that's all. But maybe I'm also wrong; I didn't try this with real code. I'm just assuming that, in both APIs, the default orientation of the texture data and the default orientation of the UVs will be consistent. In this case, as can be seen from the example, we are safe in any case and need no flip.

Compare this with my initial example of the COLLADA Duck: This is a COLLADA asset, with the texcoord origin in the lower-left corner (as shown in the screenshot). Here, the OpenGL-based renderer must flip, because the image origin is upper-left, but the OpenGL texcoord origin is lower-left, meaning: the UVs of the model do not match the orientation of the image data.

So, the bottom line is, as @lexaknyazev pointed out, that the UV origin must correspond to the first texel, and if an image is loaded, one has to make sure that this is really the case. One could change the wording accordingly, leaving out the statement that an OpenGL-based renderer would always have to flip, and instead just saying that the UV origin must correspond to the first texel of the texture data.
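The argument can be sketched in a few lines (names and conventions here are my own illustration): map a v coordinate to a row of the original top-to-bottom image, once for bottom-to-top texel rows with a lower-left UV origin (OpenGL-style), and once for top-to-bottom rows with an upper-left origin (WebGL-style). When row order and UV direction are consistent, both conventions address the same image content, so no flip is needed.

```javascript
// Which row of the original (top-to-bottom) image ends up at a given v?
// rowsBottomToTop describes the order of the uploaded texel rows.
function imageRow(v, height, rowsBottomToTop) {
  const dataRow = Math.min(height - 1, Math.floor(v * height));
  return rowsBottomToTop ? height - 1 - dataRow : dataRow;
}

// The v coordinate of the image's top edge differs per convention:
// OpenGL-style: lower-left origin, bottom-to-top rows -> top edge is v ~ 1
console.log(imageRow(0.99, 4, true)); // 0: the image's top row
// WebGL-style: upper-left origin, top-to-bottom rows -> top edge is v = 0
console.log(imageRow(0.0, 4, false)); // 0: the image's top row
```

Both conventions put the image's top row at the top of the rendered quad, which is the "no flip needed" case described above.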
Does that make sense?
These two bullet points sound good to me for the spec. As for the discussion: isn't setting the WebGL flip flag technically the same as flipping when OpenGL is used and the image decoder outputs the upper-left pixel first?
Not if my drawing above is correct and the direction of the UV coords is matching: if both the order of the texel rows and the direction of the UV coords are consistent, nothing needs to be flipped. Does someone know if the drawing is correct? Meaning: Is the orientation of the UV coords in both cases (OpenGL and WebGL) consistent with the order of the texel rows, and are the UV and texture data orientations shown in the picture correct?
This bottom line is (still) too blurry (for me... sorry about that). What exactly does this mean? That is, how does one "make sure" that the UV origin corresponds to the first texel? When loading an image, one can assume to receive some representation of pixels, usually some sort of raw pixel array.

(Side note: I'll probably remove this part from the overview. No statement here is better than a statement that is confusing (or even wrong).)
At the risk of muddying things further: typical "raw WebGL" users might not be aware of which texel is the "first" texel. In the reference implementation repo, we ask the browser to load the texture image, and we set that image as the sampler. There's no handling of individual texels, and no indication of which texel might be "first" or "last". For WebGL users, I'd like to avoid the "first texel" language and just say that the top left of the image is the origin. Even for non-WebGL apps, glTF images are encoded as JPG or PNG, not raw texels. A desktop image decoder could decode those formats and yield pixels in whatever order, but the image formats intrinsically convey a sense of which part of the image is the top, and that's where glTF declares its origin.
No, this is the point; see my previous comment quoting from the API documentation of glTexImage2D. For OpenGL, it's from bottom to top, matching the first texel with the UV origin in the lower-left corner.
@mlimper Your example seems to be right, as long as the image decoder output matches the API's expectations.
Trying to break it down differently, there are three aspects:

1. the order of the rows in which the image decoder delivers the pixel data,
2. the order of the rows that the graphics API assumes when the data is uploaded, and
3. the direction of the UV coordinates (i.e., where the UV origin is).
As long as 1) is top-to-bottom, and 2) and 3) are consistent with each other (both top-to-bottom or both bottom-to-top), no flip is necessary (that's what I wanted to illustrate). The confusion seems to originate from the fact that, on the one hand, with OpenGL, the separation between aspects 1) and 2) is very obvious and clear, while WebGL allows passing higher-level image objects directly into the API function, thereby handling 1) and 2) in a single step.
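A tiny decision sketch condensing this (purely illustrative, with hypothetical names): assuming the decoder delivers rows top-to-bottom (aspect 1), a flip is needed exactly when the API's row order (aspect 2) and the direction of the UV coordinates (aspect 3) disagree with each other.

```javascript
// "ttb" = top-to-bottom, "btt" = bottom-to-top.
// Assumes aspect 1 (decoder output) is top-to-bottom, the usual case.
function needsFlip(apiRowOrder, uvDirection) {
  return apiRowOrder !== uvDirection;
}

console.log(needsFlip("btt", "btt")); // false: OpenGL rows + lower-left UV origin
console.log(needsFlip("ttb", "ttb")); // false: WebGL-style rows + upper-left origin
console.log(needsFlip("btt", "ttb")); // true: mismatched, a flip is required
```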
Agreed
I think we're still too much in the weeds. Pixel order is device and platform-specific. PNG and JPEG clearly define which way is right-side-up in their encoded images. The top-left of the image is the origin. That's it. Don't mention the pixel order or the API. |
This may indeed be the point. What I said referred to the fact that the data that you receive from an image loader will usually be "in reading order". When blindly setting these values as pixel values on the screen, the image will be displayed properly. And as you pointed out in your last comment (correctly, as far as I can tell from the spec and other comments): OpenGL assumes that the first row of the texture data is the bottom row of the image.

Edit: After creating a minimal example that just renders the textured unit square with desktop OpenGL, I am (personally) convinced that the texture has to be flipped.
OK with that.
Yes, this is correct. I'm just saying that if you already ship the UV coords with your glTF asset, you should not flip the texture in an OpenGL application, as everything will be alright in that case (both will be "upside down", the texture data and the UV coords, so the result is that everything looks correct again). Sorry for the cumbersome discussion. So, let's just not mention any API or flipping, as this would be misleading.
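The "both upside down" cancellation can be sketched as follows (my own illustration, under the assumption that a top-to-bottom decoded image is uploaded to OpenGL without flipping, and the glTF v coordinate is used directly as the GL t coordinate): uploaded row r then holds image row r counted from the image's top, and GL addresses the first uploaded row at t = 0, so v = 0 still hits the image's top row, just as glTF's upper-left origin intends.

```javascript
// Row of the original image sampled at a given glTF v coordinate, when
// top-to-bottom data is uploaded without flipping and v is used as-is.
function sampledImageRow(v, height) {
  // GL's t = 0 addresses the first uploaded row; here the first uploaded
  // row is the image's top row, so the two indices coincide.
  return Math.min(height - 1, Math.floor(v * height));
}

console.log(sampledImageRow(0.0, 4)); // 0: v = 0 shows the image top
console.log(sampledImageRow(0.9, 4)); // 3: v near 1 shows the image bottom
```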
In other words, aligning the image decoder output with the API that is used is out of glTF's scope, right? If so, I agree: let's mention only the origin's location.
Similar to the TextureSettingsTest, there should be a "texture coordinates test".
Of course, in some way, this will be covered by all other models that involve textures, but I think that it might be nice to have a dedicated, minimalistic test case for this distressingly common issue.
In order to avoid any further confusion: can anybody confirm that

- putting this texture on a unit square,
- using the texture coordinates matching the vertex positions, as shown in the image itself (!), and
- rendering the quad with the default camera

will result in the quad appearing exactly like the texture?

(Particularly: Is it right that the texture will not appear to be flipped?)
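To make the question concrete, here is a hypothetical check (my own sketch, not part of any test asset): if the UVs equal the XY positions, the quad's top edge (y = 1) carries v = 1, and with glTF's upper-left UV origin that addresses the bottom of the image, so the texture is expected to appear vertically flipped, matching the answers earlier in this thread.

```javascript
// Which image row (counted from the image's top) appears at the quad's
// top edge when the UVs equal the XY positions?
function imageRowAtQuadTop(height) {
  const v = 1.0; // top edge of the unit quad has y = 1, hence v = 1
  return Math.min(height - 1, Math.floor(v * height));
}

console.log(imageRowAtQuadTop(4)); // 3: the image's bottom row, i.e. flipped
```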