[WIP] GLTFLoader: Use node materials. #14149
Conversation
The first thing I'd check is the encoding injection code.

// MeshStandardMaterial
vec4 mapTexelToLinear( vec4 value ) { return sRGBToLinear( value ); }
vec4 envMapTexelToLinear( vec4 value ) { return LinearToLinear( value ); }
vec4 emissiveMapTexelToLinear( vec4 value ) { return sRGBToLinear( value ); }
vec4 linearToOutputTexel( vec4 value ) { return LinearToGamma( value, float( GAMMA_FACTOR ) ); }

// StandardNodeMaterial
vec4 mapTexelToLinear( vec4 value ) { return LinearToLinear( value ); }
vec4 envMapTexelToLinear( vec4 value ) { return LinearToLinear( value ); }
vec4 emissiveMapTexelToLinear( vec4 value ) { return LinearToLinear( value ); }
vec4 linearToOutputTexel( vec4 value ) { return LinearToGamma( value, float( GAMMA_FACTOR ) ); }
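A hedged sketch of where that difference likely comes from, assuming WebGLPrograms derives the texel-decoding functions from material.map.encoding and falls back to LinearToLinear when it finds no .map; the texture loading and file name below are hypothetical:

var map = new THREE.TextureLoader().load( 'baseColor.png' ); // hypothetical texture
map.encoding = THREE.sRGBEncoding;

var standard = new THREE.MeshStandardMaterial( { map: map } );
// the renderer sees standard.map.encoding, so mapTexelToLinear becomes sRGBToLinear

var nodeMaterial = new THREE.StandardNodeMaterial();
nodeMaterial.color = new THREE.TextureNode( map );
// the renderer finds no .map on the node material, so it falls back to LinearToLinear, as in the dump above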
That was exactly right — WebGLProgram doesn't know about node materials and can't detect the colorspace. I can't think of a clean way around that — StandardNodeMaterial could add a fake ….

Normals are inverted again, because StandardNodeMaterial accepts only a scalar normalScale.
Not to derail this thread, but please have a look at #14069, so we are all aware of these annoying technical difficulties.
I did? I thought I supported the flexibility of a bivariate normalScale.
Oops, you're right. That is exactly where the previous thread ended up. We could update StandardNodeMaterial to support vec2 normalScale if this is a direction we want to pursue, then.
It should. I expect that was an oversight. /ping @sunag
Only confusing because it's so convenient, thanks! 😅 Demos are getting close now; the remaining mismatch looks like #13501 has not been applied here.
With this PR would it be possible to use …?
No, but I'd consider this PR just R&D at this point. It should not be merged without more of a long-term plan.
Would it be ok to discuss some of those R&D results? I'm trying to understand the benefits, but can't get over the fact that …. The exercise here is to see how to handle the …. The benefit should come when extending it into spec/gloss?
- child.material.envMap = envMap;
+ child.material.environment = new THREE.CubeTextureNode( envMap );
+ child.material.build();
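For context, a hedged sketch of how the diff above might be applied across a loaded glTF scene; the gltf and envMap variables and the instanceof check are assumptions, while environment, CubeTextureNode, and build() come straight from the diff:

gltf.scene.traverse( function ( child ) {

  if ( child.isMesh && child.material instanceof THREE.StandardNodeMaterial ) {

    child.material.environment = new THREE.CubeTextureNode( envMap );
    child.material.build(); // node materials need a rebuild after the graph changes

  }

} );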
A visual node editor that would generate the GLSL would expose "inputs". Rather than having node( type input ), after compiling it should just be type input, if that makes sense? It feels weird to have to rebuild it, but this is basically in lieu of needsUpdate?
Sorry, I don't think I quite follow your question, but yes, .build() is analogous to .needsUpdate here.
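A hedged sketch of that distinction, under the assumption that node values map to uniforms (so changing a value needs no rebuild) while changing the graph structure does; material here stands for a StandardNodeMaterial instance, and the FloatNode/OperatorNode usage mirrors examples later in this thread:

var roughness = new THREE.FloatNode( 0.5 );
material.roughness = roughness;
material.build();      // compile once, analogous to setting .needsUpdate = true

roughness.value = 0.8; // assumed to be a plain uniform update, no rebuild needed

material.roughness = new THREE.OperatorNode( roughness, new THREE.FloatNode( 2.0 ), THREE.OperatorNode.MUL );
material.build();      // the graph structure changed, so rebuild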
One benefit is that it transfers much of the shader creation logic out of the renderer codebase. This could provide greater flexibility for extending default materials without patching shaders or modifying the renderer. For example, GLTFLoader would benefit from being able to assign different UVs to different map slots (#12608), for which we don't currently have a good solution. Unity has been going in a similar direction with Shader Graph.
This change — by itself — has no particular value to GLTFLoader yet. But I'd like to explore ways node materials might increase the flexibility of the material system. Maybe that means they move into core eventually, in which case ….
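A hedged sketch of the per-map UV idea (#12608), using the TextureNode/UVNode pattern that appears later in this thread; the texture variables and the ao slot assignment are illustrative assumptions:

var material = new THREE.StandardNodeMaterial();
material.color = new THREE.TextureNode( baseColorMap, new THREE.UVNode( 0 ) ); // TEXCOORD_0
material.ao = new THREE.TextureNode( aoMap, new THREE.UVNode( 1 ) );           // TEXCOORD_1 (assumed slot)
material.build();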
@donmccurdy Just be aware that shader graphs reduce the ability to transfer shaders between toolsets. Basically they are too flexible to reliably convert between tools, because each shader graph framework implements its nodes differently. This has been a huge issue in the VFX world for the last decade, so there is a distinct tradeoff between portability and flexibility. Also, very complex shaders of the type you want in glTF cannot be implemented via a node-based approach, because it is inherently single-pass and local: SSS, refraction, global illumination, etc. That said, I support node-based shaders because of their flexibility, but you will never be able to export their full flexibility into glTF. BTW, here is a material graph transfer format that has recently been developed in the VFX space: http://www.materialx.org/
Was this even the goal here? The flexibility should benefit three.js, not glTF?
I have no expectation that arbitrary node-based shaders be exported to the current glTF spec, nor that the glTF spec should grow to support that — I'm interested in particular features (per-texture transforms, per-texture UV sets, etc...) that would benefit three.js generally, and also give us feature-parity with the glTF specification.
These are helpful points, thank you. Given the drawbacks you mention, under what conditions do you support node-based materials? As a hypothetical strawman proposal, suppose that we…
…would that lose too much portability? Would it make SSS, GI, and refraction harder to implement?
What is a shader graph? Is this the same as saying "assemble particular shaders"? My imagination conjured up a scenario where three.js loads some vertex and fragment shaders, not really caring if they were hand-written or built with a node tool. The node stuff seems to be twice as big as the shader templates. After all, at some point, any shader has to yield some valid GLSL, in the form of a string.
If I put on my programmer's hat: I like to write GLSL. I don't actually understand how to work with NodeMaterial programmatically; I still see a lot of GLSL and includes here: …. This is much harder for me to read and understand at first glance. This, on the other hand: …, is fairly obvious. With a slight hint that these includes refer to ShaderChunk, I don't need to know how they are handled, and can read this as a GLSL program. At a glance, Node seems like a mix of templates and an unfamiliar API that's much more JavaScript than GLSL.

If I put on my tech artist hat: I could create crazy effects, new shaders, and extend other shaders, all without having to code at all. A visual editor is very intuitive for this workflow. In the end though, it would make more sense to save a JSON object with the vert, frag and uniforms, which three.js could load via ….

The rendering library ends up consuming shaders. The visual shader editor ends up outputting shaders. GLSL is a common enough interface to make these work together, if more effort is put into three.js to better consume these shaders. I'd explore removing the WebGLRenderer coupling with various materials. Perhaps some generic solution could be made that turns ….
I think these can be addressed either in the core, or with something like …. Could you link these …?
I have always strongly supported shader graphs...
#7522
Let's move to them and deal with the consequences. :)
Best regards,
Ben Houston
http://Clara.io Online 3d modeling and rendering
To be clear: I think that after the next three.js release we should promote NodeMaterial and demote the others, and then figure out what breaks and fix it. I think we just need to go for it.
Best regards,
Ben Houston
http://Clara.io Online 3d modeling and rendering
Like Blender's Cycles Nodes, Maya's Hypershade materials, Unreal's Material Expression Graphs, or Unity's Shader Graph. A particular and popular way of assembling a shader.
In my opinion the node materials do this rather well, and give a degree of flexibility that would be very hard to match in a fixed interface. #8278 has been open for a long time, stalled (if I were to guess) because choosing the right API is quite difficult. But other ideas are welcome, and in any case I am not suggesting we remove the ….
We would certainly want to have a visual editor, yes. That could be in the three.js editor or outside the project entirely.
Probably this could improve if the code were in …:

const MeshStandardMaterial = function ( ... ) {

  this.node = new StandardNodeMaterial();

};

Object.defineProperties( MeshStandardMaterial.prototype, {
  map: {
    set: function ( texture ) { this.node.color = new TextureNode( texture ); },
    get: function () { return this.node.color.value; }
  },
  // ...
} );
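A short usage sketch of the wrapper above, illustrative only; it assumes the defineProperties wiring just shown and a texture variable that already exists:

const material = new MeshStandardMaterial();
material.map = texture;        // routed into a TextureNode on the wrapped node material
console.log( material.map );   // the getter hands back the original texture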
Sure! There are certainly alternatives; your approach in #14166 is good. My own opinion is that nodes will be more flexible and easier to work with, while not preventing hot-patching or use of ShaderMaterial for those who want it.
I just mustered up a quick example here: …. Dunno if this addresses this, but it could: ….
Here it is with the same transform interface: …. It's a bit hacky, but it's a lot less code than an entire framework :/
This unfortunately doesn't ring true to me; historically, various shader injection APIs were rejected in favor of ….

I salute the spirit of yolo-ing this, but urge caution: if it's been waiting for three years, perhaps it could wait a tiny bit more and not block other PRs :)

I think the code in the example looks a bit verbose, but conceptually it's very few simple steps.
The last step is the verbose part, but I think it would be less verbose with …. If getters and setters are not super expensive, I see no reason not to decouple WebGLRenderer from "built-in" materials and keep the familiar interface and standard (GLSL). I'd like to see the three.js core get smaller.
This bit makes me sad. I think I might be on a different page when it comes to ideas of how a process like this should work. I don't have much experience with this, so I might have different expectations 😄

I see three.js as a helpful tool to build your own framework to render specific things. It should be flexible enough to allow one to choose for themselves what's the "right" API. If something stalls like #8278, that seems to yield no API; even not having the right API but having some API is better than that. The tooling for this also may stall, as seen in #13198. This really demotivated me to do any work, because I kinda disliked …. It's a shame that ….

@donmccurdy:

class MeshStandardMaterial extends THREE.ShaderMaterial {
  // ...
}

Object.defineProperties( MeshStandardMaterial.prototype, {
  map: {
    //set: function ( texture ) { this.node.color = new TextureNode( texture ); },
    set: function ( texture ) { this.uniforms.color.value = texture; },
    //get: function () { return this.node.color.value; }
    get: function () { return this.uniforms.color.value; }
  },
  // ...
} );
@pailhead We have been maintaining our own Three.JS fork since 2016 because we needed #8278. I do believe that we can implement MeshStandardMaterial/MeshPhysicalMaterial on top of NodeMaterial, thus providing backwards compatibility but a unified approach going forward. I believe that bringing in NodeMaterial as a first-class citizen is probably a better idea than getting stuck on #8278; it is a better long-term solution. Whenever there is a major change to Three.JS you need to do it at the beginning of a release cycle. That is what happened when we replaced the animation system, for example: #6934. There is always a bunch of things that break, and you need a few weeks so that everyone notices what broke and fixes it before a release -- otherwise it isn't fair to the Three.JS community.
I think I'm still missing something. Why did you have to maintain a fork, and what would change with your fork when …? Would #8278 just work out of the box, or would it be easier for you to add some 3rd-party code that allowed for this to work? What is the change between now and some point in the future that makes life easier?

Why couldn't your fork have turned into something like #14174? Is it easy to switch to your fork for this feature? Is it the only feature in the fork? Why couldn't it be not a fork, but a file/class that one imports and then it just works?

It seems that this PR is a perfect example of this. You have a fork and you had a PR and you had it working, but the GLTFLoader completely stalled. I think it's unfair to gun for ….

I appreciate your input. I'm really giving it my best shot to be open-minded and am still convinced that there is something I'm missing. But without a good argument (or pointing out where my confusion is) I can't take this at face value :( If #14174 gets refactored to be its own file ….

I'm also a lot on Slack if anyone feels like discussing this offline (or over a beer in SF).
I am not suggesting we deprecate ….

var material = new StandardNodeMaterial();
material.color = new THREE.TextureNode( texture, new THREE.UVNode( 1 ) );

Having #14174 as an example would be fine, and it's good to see that, but it does not solve the per-map UV issue fully. For example, I'd be hesitant to add it to GLTFLoader — it produces materials that cannot be easily inspected or modified by the user. Suppose the user wants to add instancing to their glTF model — do they need recursive …?
Personally I'd still consider shader injection APIs an escape hatch for special-case features, and would want something more idiomatic for important features, even if we weren't considering NodeMaterial. But in any case, it would be reassuring to confirm that shader injection still works with node materials — after ….
Also, to put my vote in writing explicitly: …
I don't really like that part about #13198 though; it could be made into something that can easily be combined with the rest of the stuff. This would be an incremental change, compared to …. If the onBeforeRender removal gets merged, I think the next thing could be to try to refactor and combine various things with #13198.
I think I have an issue with this:

var material = new StandardNodeMaterial();
material.color = new THREE.TextureNode( texture, new THREE.UVNode( 1 ) );

It shouldn't be compared to the code in …. It should be compared to this, I think, if it makes sense:

myMaterial.specularMap             // Texture
myMaterial.specularMapOffset       // Vector2
myMaterial.specularMapRepeat       // Vector2
myMaterial.specularMapRotate       // Number
myMaterial.specularMapUpdateMatrix // function

I think the pattern of instantiating a lot of specific …:

var offset = new THREE.FloatNode( 0 );
var scale = new THREE.FloatNode( 1 );
var uv = new THREE.UVNode();
var uvOffset = new THREE.OperatorNode(
  offset,
  uv,
  THREE.OperatorNode.ADD
);
var uvScale = new THREE.OperatorNode(
  uvOffset,
  scale,
  THREE.OperatorNode.MUL
);
var mask = new THREE.TextureNode( decalDiffuse, uvScale );
I think I'm wrapping my head around

var material = new StandardNodeMaterial();
material.color = new THREE.TextureNode( texture, new THREE.UVNode( 1 ) );

In a visual editor, I think this part overlaps with the "hooks" mentioned with shader injection (both #11475 and #13198). If there is a way to easily insert a string containing GLSL whose only contract is to output something to some variable with some name, that would mimic the entire ….

It took me a minute to figure out that normal maps exist here (but I'm still not sure how it works): …. While, taking one look at this, it's immediately obvious what I need to do to change normal maps: ….

This code might make more sense:

var offset = new THREE.FloatNode( 0 );
var scale = new THREE.FloatNode( 1 );
var uv = new THREE.OperatorNode(
  new THREE.OperatorNode(
    offset,
    new THREE.UVNode(),
    THREE.OperatorNode.ADD
  ),
  scale,
  THREE.OperatorNode.MUL
);
var mask = new THREE.TextureNode( decalDiffuse, uv );

Object.defineProperty( material, 'textureOffset', {
  get: () => offset.value,
  set: ( v ) => offset.value = v
} );

How you do this as a dev is up to you, but the interface ends up being ….
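For instance, a hedged usage sketch of the property wired up just above; purely illustrative:

material.textureOffset = 0.25;          // writes through to the FloatNode's .value
console.log( material.textureOffset );  // 0.25, read back through the getter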
I think I missed my main point with the hooks up there. If one could do this:

var myGLSLChunk = new THREE.TextureNode( texture, new THREE.UVNode( 1 ) ).build();

three.js could still use includes and just GLSL, and not care how it's generated?
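A hedged sketch of how such a chunk might then be consumed, assuming (as hypothesized above) that build() returned a plain GLSL string; onBeforeCompile and the #include <map_fragment> marker are real three.js hooks, everything else is illustrative:

material.onBeforeCompile = function ( shader ) {

  // splice the generated chunk in place of the stock map chunk; assumes the chunk
  // honors the "output to some variable with some name" contract discussed above
  shader.fragmentShader = shader.fragmentShader.replace( '#include <map_fragment>', myGLSLChunk );

};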
class GenericMaterial extends THREE.ShaderMaterial {

  constructor( params ) {

    super( params );

    // get/set wired up for each uniform based on its type
    Object.values( params.uniforms ).forEach( ( uniform ) => this._wireUniformToParam( uniform ) );

  }

}

const OLD_LIB = require( 'three-oldschool-shader-templates' );
const NODE_LIB = require( '/examples/NodeMaterial' );

class StandardMaterial extends GenericMaterial {

  static TEMPLATE = null; // assigned via setTemplate below
  static setTemplate( template ) { StandardMaterial.TEMPLATE = template; }

  constructor( params ) {

    const template = StandardMaterial.TEMPLATE;

    super( {
      uniforms: template.uniforms,
      vertexShader: template.vs,
      fragmentShader: template.fs
    } );

    this._setParams( params );

  }

}

// templates for old conservatives
StandardMaterial.setTemplate( OLD_LIB.StandardMaterial );

// templates for young hip people :)
StandardMaterial.setTemplate( NODE_LIB.StandardNode.build() );

const myMaterial = new StandardMaterial();
myMaterial.roughness = 1;
myMaterial.myCustomRoughnessSpatialNoiseSeed = 12345; // no .value etc.

The point is …. This, I think, is an interesting point: does your …? Ideally this would be ….

Both are ….
@donmccurdy
This is a bit ambiguous. Someone who doesn't know GLSL and such can play with the example and just do this:

var mesh = new THREE.Mesh( new THREE.BufferGeometry(), new THREE.MeshStandardMaterial() );
decorateMaterialWithSpecGloss( mesh.material );
decorateMaterialWithPerMapTransforms( mesh.material );
decorateMaterialWithSimpleInstancing( mesh.material );
mesh.customDepthMaterial = decorateMaterialWithSimpleInstancing(
  new THREE.MeshDepthMaterial( { depthPacking: THREE.RGBADepthPacking } )
);

If the user wants to add their own custom effect, they would consult this: ….
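A hedged sketch of what one of these hypothetical decorators could look like, using only the real onBeforeCompile hook and the existing common/begin_vertex chunk markers; the instanceOffset attribute is an assumption and would have to be provided by an InstancedBufferGeometry:

function decorateMaterialWithSimpleInstancing( material ) {

  material.onBeforeCompile = function ( shader ) {

    shader.vertexShader = shader.vertexShader
      .replace( '#include <common>', '#include <common>\nattribute vec3 instanceOffset;' )
      .replace( '#include <begin_vertex>', '#include <begin_vertex>\ntransformed += instanceOffset;' );

  };

  return material;

}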
@donmccurdy See the above comment; now is the time to promote NodeMaterial to core. /cc @sunag
@bhouston what makes now specifically better than any moment between now and 2015?
@pailhead, that there appears to be critical mass in shifting towards it. I think everything has its time, and I think that now would be a good time for NodeMaterials to move towards core.
With all due respect, I think that is a subjective opinion and not an objective one :) The critical mass seems to be forming by …. I'm confused why such a powerful feature like …. You have insight into what's involved in keeping this in core vs a fork. Do you have a thread like this where some findings are documented?
So it feels more like a small, elite circle than real critical mass :) In the end, I'm not trying to derail this. The important thing for me is to understand why this is happening, and currently I don't have a good understanding. Why are other proposals blocked because of this, and is it really changing the …? In the end I learned a bit of regular expressions, so all is not lost :)
Replied to #7522; will close this PR as it shouldn't be merged so long as NodeMaterial requires individual file inclusion this way. |
With various threads about onBeforeCompile vs onBeforeRender for custom materials, per-map UV sets, and per-map texture transforms, I thought I'd check on the status of node materials. The API is looking quite nice, and converting the GLTFLoader's mesh/standard material was straightforward.

Demo: MeshStandardMaterial | StandardNodeMaterial

Either I haven't configured it properly or the PBR model isn't quite matching up. I've included a small hack to fix the colorspace, and normals are inverted. I'm in no rush to merge something like this (wouldn't expect users to include all of these nodes individually, for starters) but wanted to start testing node materials more and iron out any issues.