Add support for viewdirection space in hardware shading #2036
Conversation
@edobrowo This looks really promising, and to resolve the EasyCLA issue, I'd recommend following up through the support link that the Linux Foundation provides: https://jira.linuxfoundation.org/servicedesk/customer/portal/4

I've made the ticket, thank you! I'm not entirely sure why the view direction is not properly transformed in the
This looks like a good start, though I think it's a larger set of changes than you need, and I'd recommend the following simplifications:
- Instead of replacing the computation of the world-space view direction with new vertex shader logic, I'd leave it as a pixel-shader subtraction of the view position from the vertex position, as this should provide the most accurate per-pixel values.
- Then, for your new object-space view direction, you can simply apply the `T_WORLD_INVERSE_TRANSPOSE_MATRIX` transform to the world-space view direction that you compute in the pixel shader, just before the vector is normalized.
- Finally, you can normalize your world/object view direction, making it a unit-length vector that can be returned as the result.
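The math behind these three steps can be sketched in plain Python. All function and variable names below are illustrative only, not MaterialX identifiers, and the snippet assumes a 3x3 world matrix rather than the shader's 4x4 form:

```python
# Sketch of the recommended pipeline: subtract positions in world space,
# apply the world inverse-transpose, then normalize last.
# Pure-Python 3x3 matrices; names are illustrative, not MaterialX code.

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return [x / length for x in v]

def transpose(m):
    return [list(row) for row in zip(*m)]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def inverse3(m):
    # Adjugate-based inverse of a 3x3 matrix.
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [
        [e * i - f * h, c * h - b * i, b * f - c * e],
        [f * g - d * i, a * i - c * g, c * d - a * f],
        [d * h - e * g, b * g - a * h, a * e - b * d],
    ]
    return [[x / det for x in row] for row in adj]

def object_view_direction(vertex_pos, view_pos, world_matrix):
    # Step 1: per-pixel world-space view direction (vertex minus view position).
    world_dir = sub(vertex_pos, view_pos)
    # Step 2: apply the world inverse-transpose before normalization.
    obj_dir = mat_vec(transpose(inverse3(world_matrix)), world_dir)
    # Step 3: normalize last, so the returned vector is unit length.
    return normalize(obj_dir)
```

Note that with an identity world matrix, step 2 is a no-op and the result is simply the normalized world-space view direction.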
Thank you for the feedback! I made the requested changes, though the issue still persists. It seems that the transformation isn't being applied; what could be causing this? Thank you!
@edobrowo This looks like a good step forward, and here are three suggestions for debugging the math of the new logic:
@jstone-lucasfilm Thanks for the tips. I've been mainly testing in the graph editor and the viewer. I realized that the world transform, inverse transpose, etc. are all identity matrices, since presumably object space is equivalent to world space when viewing a single object. This also explains why the output to

I tested by hardcoding arbitrary mat4 transformations into the object-space path, and confirmed that when the node input is set to object or model space, the transformation is in fact applied.

What would be an effective way to create a test case where the object is offset from world space? Thank you again.
@edobrowo Good catch, and that makes sense. We don't yet have an effective way to test non-identity world-space transforms in the viewer and graph editor, and this would likely be a good candidate for a future improvement to MaterialX! Given that context, this proposed change looks good to me, and I'll go ahead and run a few validations before merging. |
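The masking effect described above can be reproduced numerically. This self-contained sketch (all names are illustrative, not MaterialX identifiers) shows why an identity world matrix hides the new transform, while a hardcoded non-uniform scale exposes it, using the fact that the inverse transpose of a diagonal matrix diag(sx, sy, sz) is diag(1/sx, 1/sy, 1/sz):

```python
# Why an identity world matrix masks the object-space transform.
# Illustrative names only; not MaterialX code.

def normalize(v):
    n = sum(x * x for x in v) ** 0.5
    return [x / n for x in v]

def apply_diag_inverse_transpose(scale, v):
    # Inverse transpose of diag(sx, sy, sz) is diag(1/sx, 1/sy, 1/sz).
    return [vi / si for si, vi in zip(scale, v)]

world_dir = [1.0, 1.0, 0.0]

# Identity world matrix: the transform is a no-op, so the world- and
# object-space results coincide and a missing transform is invisible.
same = normalize(apply_diag_inverse_transpose([1.0, 1.0, 1.0], world_dir))

# Non-uniform scale: the object-space direction now differs from the
# world-space one, which is what hardcoding an arbitrary matrix exposed.
different = normalize(apply_diag_inverse_transpose([2.0, 1.0, 1.0], world_dir))
```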
Great work on this project, thanks @edobrowo!
Merged commit 860b8d4 into AcademySoftwareFoundation:main
This addresses issue #1656.

Previously, the `space` attribute of the `viewdirection_vector3` node was unused by the GLSL and MSL implementations. This change correctly transforms the view direction from world space to the specified space.