
[WIP] GLTFLoader: Use node materials. #14149

Closed
wants to merge 3 commits

Conversation

donmccurdy
Collaborator

@donmccurdy commented May 26, 2018

EDIT: This is an exploratory PR, testing out node materials. It is not meant to be merged right now.

With various threads open about onBeforeCompile vs. onBeforeRender for custom materials, per-map UV sets, and per-map texture transforms, I thought I'd check on the status of node materials. The API is looking quite nice, and converting GLTFLoader's mesh/standard material was straightforward.

[Material preview screenshots, May 26, 2018: MeshStandardMaterial vs. StandardNodeMaterial]

Demo: MeshStandardMaterial | StandardNodeMaterial

Either I haven't configured it properly, or the PBR model isn't quite matching up. I've included a small hack to fix the colorspace, and normals are inverted. I'm in no rush to merge something like this (I wouldn't expect users to include all of these nodes individually, for starters) but wanted to start testing node materials more and iron out any issues.

@WestLangley
Collaborator

The first thing I'd check is the encoding injection code. Note that StandardNodeMaterial decodes map and emissiveMap as linear (LinearToLinear) where MeshStandardMaterial applies sRGBToLinear:

// MeshStandardMaterial
vec4 mapTexelToLinear( vec4 value ) { return sRGBToLinear( value ); }
vec4 envMapTexelToLinear( vec4 value ) { return LinearToLinear( value ); }
vec4 emissiveMapTexelToLinear( vec4 value ) { return sRGBToLinear( value ); }
vec4 linearToOutputTexel( vec4 value ) { return LinearToGamma( value, float( GAMMA_FACTOR ) ); }
// StandardNodeMaterial
vec4 mapTexelToLinear( vec4 value ) { return LinearToLinear( value ); }
vec4 envMapTexelToLinear( vec4 value ) { return LinearToLinear( value ); }
vec4 emissiveMapTexelToLinear( vec4 value ) { return LinearToLinear( value ); }
vec4 linearToOutputTexel( vec4 value ) { return LinearToGamma( value, float( GAMMA_FACTOR ) ); }

@donmccurdy
Collaborator Author

donmccurdy commented May 26, 2018

That was exactly right: WebGLProgram doesn't know about node materials and can't detect the colorspace. I can't think of a clean way around that. StandardNodeMaterial could add a fake material.map.encoding property to force the right mapTexelToLinear function? I've added a hacky workaround for now, assuming map and emissive will always be sRGB (which is true for glTF, anyway).
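For reference, a minimal sketch of that fake-encoding idea (hypothetical, not the workaround in this PR; it assumes WebGLProgram only inspects material.map.encoding when choosing the decode function):

// Hypothetical shim: expose a stand-in `map` whose encoding WebGLProgram can
// read, so it injects sRGBToLinear() instead of the linear passthrough.
var material = new THREE.StandardNodeMaterial();
material.map = { isTexture: true, encoding: THREE.sRGBEncoding };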

Normals are inverted again, because StandardNodeMaterial accepts only a scalar normalScale, so we can't invert normalScale.y. But as you've mentioned earlier, that's probably best, and we should find another way to deal with other tangent-space coordinate conventions.

@WestLangley
Collaborator

WebGLProgram doesn't know about node materials and can't detect the colorspace. I can't think of a clean way around that

Not to derail this thread, but please have a look at #14069, so we are all aware of these annoying technical difficulties.

@WestLangley
Collaborator

Normals are inverted again, because StandardNodeMaterial accepts only a scalar normalScale and we can't invert normalScale.y. But as you've mentioned earlier that's probably best, and we should find another way to deal with other tangent space coordinate conventions.

I did? I thought I supported the flexibility of a bivariate normalScale and argued that the glTF univariate convention was too limiting.

@donmccurdy
Collaborator Author

donmccurdy commented May 26, 2018

I did? I thought I supported the flexibility of a bivariate normalScale ...

Oops, you're right. That is exactly where the previous thread ended up. We could update StandardNodeMaterial to support vec2 normalScale if this is a direction we want to pursue, then.

@WestLangley
Collaborator

We could update StandardNodeMaterial to support vec2 normalScale

It should. I expect that was an oversight.

/ping @sunag

@sunag
Collaborator

sunag commented May 27, 2018

normalScale is already vec2. What may be confusing is that floats and other formats are converted automatically to the input format, in this case vec2. You can use a FloatNode as well as a Vector2Node.

var normalScale = this.normalScale && this.normal ? this.normalScale.buildCode( builder, 'v2' ) : undefined;
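For example (a sketch assuming the Vector2Node( x, y ) constructor):

// Both forms work; the builder converts the input to vec2.
material.normalScale = new THREE.FloatNode( 1 );        // scalar, widened to vec2
material.normalScale = new THREE.Vector2Node( 1, - 1 ); // explicit vec2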

@donmccurdy
Collaborator Author

... what may be confusing is that floats and other formats are converted automatically to the input format, in this case vec2. You can use a FloatNode as well as a Vector2Node.

Only confusing because it's so convenient, thanks! 😅

Demos are getting close now; the remaining mismatch looks like #13501 has not been applied here.

@donmccurdy force-pushed the feat-gltf-node-materials branch from 014653f to 506b9cb on May 27, 2018, 06:43
@pailhead
Contributor

pailhead commented May 28, 2018

With this PR would it be possible to use GLTFLoader without /nodes?

@donmccurdy
Collaborator Author

With this PR would it be possible to use GLTFLoader without /nodes?

No, but I'd consider this PR just R&D at this point. It should not be merged without more of a long-term plan.

@pailhead
Contributor

pailhead commented May 28, 2018

Would it be ok to discuss some of those R&D results?

I'm trying to understand the benefits, but can't get over the fact that NodeMaterial seems like its own framework. All of the visual editors I've seen generate the actual shader code, one giant string, before it gets compiled.

The exercise here is to see how to handle the GLTFLoader-generated materials via something that is available, but not in the core? It doesn't seem to affect much code-wise, just changes the interface a bit?

The benefit should come when extending it into spec/gloss?

child.material.envMap = envMap;
child.material.environment = new THREE.CubeTextureNode( envMap );

child.material.build();
Contributor

A visual node editor that generates the GLSL would expose "inputs". Rather than having node( type input ) after compiling, it should just be type input, if that makes sense? It feels weird to have to rebuild it, but this is basically in lieu of needsUpdate?

Collaborator Author

Sorry, I don't think I quite follow your question, but yes, .build() is analogous to .needsUpdate here.

@donmccurdy
Collaborator Author

I'm trying to understand the benefits but can't get over the fact that NodeMaterial seems like it's own framework.

One benefit is that it transfers much of the shader creation logic out of the renderer codebase. This could provide greater flexibility for extending default materials without patching shaders or modifying the renderer. For example, GLTFLoader would benefit from being able to assign different UVs to different map slots (#12608), for which we don't currently have a good solution. Unity has been going in a similar direction with Shader Graph.
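As a concrete sketch of the per-map UV case (diffuseTexture and emissiveTexture are placeholders, and this assumes TextureNode accepts a UV node as its second argument, as in the nodes examples):

// Diffuse sampled from UV set 0, emissive from UV set 1.
var material = new THREE.StandardNodeMaterial();
material.color = new THREE.TextureNode( diffuseTexture, new THREE.UVNode( 0 ) );
material.emissive = new THREE.TextureNode( emissiveTexture, new THREE.UVNode( 1 ) );
material.build();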

The exercise here is to see how to handle the GLTFLoader generated materials, via something that is available, but not in the core? It doesn't seem to affect much code wise, just changes the interface a bit?

This change — by itself — has no particular value to GLTFLoader yet. But I'd like to explore ways node materials might increase the flexibility of the material system. Maybe that means they move into core eventually, in which case MeshFooMaterials become helpers that just construct an appropriate node graph? That's a complex and large change, so I certainly don't think it should block changes like #14099, but it feels promising enough to spend more time on.

@bhouston
Contributor

@donmccurdy Just be aware that shader graphs reduce the ability to transfer shaders between toolsets. Basically, they are too flexible to convert reliably between tools, because each shader graph framework implements its nodes differently. This has been a huge issue in the VFX world for the last decade.

So there is a distinct tradeoff between portability and flexibility.

Also, very complex shaders of the type you want in glTF cannot be implemented via a node-based approach, because it is inherently single-pass and local: SSS, refraction, global illumination, etc.

That said, I support node-based shaders because of their flexibility, but you will never be able to export their full flexibility into glTF.

BTW, here is a material graph transfer format recently developed in the VFX space: http://www.materialx.org/

@pailhead
Contributor

That said I support node-based shaders because of their flexibility but you will never be able to export their full flexibility into glTF.

Was this even the goal here? The flexibility should benefit three.js, not glTF?

@donmccurdy
Collaborator Author

donmccurdy commented May 29, 2018

That said I support node-based shaders because of their flexibility but you will never be able to export their full flexibility into glTF.

Was this even the goal here? The flexibility should benefit three.js not glTF?

I have no expectation that arbitrary node-based shaders be exported to the current glTF spec, nor that the glTF spec should grow to support that — I'm interested in particular features (per-texture transforms, per-texture UV sets, etc...) that would benefit three.js generally, and also give us feature-parity with the glTF specification.

Just be aware that shader graphs reduce the ability to transfer shaders between toolsets. Basically they are too flexible to reliable convert between tools because each shader graph framework implements different nodes things differently. This has been a huge issue in the VFX world for the last decade. So there is a distinct tradeoff between portability and flexibility. ... That said I support node-based shaders ...

These are helpful points, thank you. Given these drawbacks, under what conditions do you support node-based materials? As a hypothetical strawman proposal, suppose that we...

  1. Bring node materials into three.js src/*
  2. Move Material classes to examples that assemble particular shader graphs

... would that lose too much portability? Make SSS, GI, and refraction harder to implement?

@pailhead
Contributor

pailhead commented May 29, 2018

2. Move Material classes to examples that assemble particular shader graphs

What is a shader graph in this context? Is this the same as saying "assemble particular shaders"?

My imagination conjured up a scenario where three.js loads some vertex and fragment shaders, not really caring whether they were hand-written or built with a node tool.

The node stuff seems to be twice as big as the shader templates. After all, at some point any shader has to yield valid GLSL, in the form of a string.

onBeforeCompile vs onBeforeRender

onBeforeRender should probably have no place in this discussion, but what are the benefits of nodes over onBeforeCompile?


If I put on my programmer's hat, I like to write GLSL. I don't actually understand how to work with NodeMaterial programmatically; I still see a lot of GLSL and includes here:
https://github.com/mrdoob/three.js/blob/dev/examples/js/nodes/materials/StandardNode.js

This is much harder for me to read and understand at first glance. This, on the other hand:
https://github.com/mrdoob/three.js/blob/dev/src/renderers/shaders/ShaderLib/meshphysical_vert.glsl

Is fairly obvious. With a slight hint that these includes refer to ShaderChunk, I don't need to know how they are handled, and can read this as a GLSL program. At a glance, Node seems like a mix of templates and an unfamiliar API that's much more JavaScript than GLSL. build() seems radically different from how it currently works, which is to just provide a string and be done with it.

If I put on my tech-artist hat, I could create crazy effects, new shaders, and extend other shaders, all without having to code at all. A visual editor is very intuitive for this workflow. In the end, though, it would make more sense to save a JSON object with the vertex shader, fragment shader, and uniforms, which three.js could load via NodeMaterialLoader? One could then run something like https://github.com/aras-p/glsl-optimizer on the shader.
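A minimal sketch of that workflow, assuming the examples' NodeMaterialLoader and a JSON file exported by the (hypothetical) visual editor:

// Load a serialized node graph and assign the resulting material.
var loader = new THREE.NodeMaterialLoader();
loader.load( 'editor-output.json', function ( material ) {
	mesh.material = material;
} );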

The rendering library ends up consuming shaders; the visual shader editor ends up outputting shaders. GLSL is a common enough interface to make these work together, and more effort could be put into three.js to better consume those shaders.

I'd explore removing the WebGLRenderer coupling with the various materials. Perhaps some generic solution could be made that turns ShaderMaterial into the various materials with a friendlier interface (defining a bunch of getters and setters for uniforms and such)? It would then be a matter of convention: either the editor outputs the same variables that three.js uses, or it maps them somehow.

(per-texture transforms, per-texture UV sets, etc...)

I think these can be addressed either in the core or with something like onBeforeCompile.

#12788

@donmccurdy

Could you link these GLTFLoader issues that are mentioned here, so that alternate explorations to NodeMaterial could be done?

@bhouston
Contributor

bhouston commented May 30, 2018 via email

@bhouston
Contributor

bhouston commented May 30, 2018 via email

@donmccurdy
Collaborator Author

What is a shader graph in this context? Is this the same as saying "assemble particular shaders"?

Like Blender's Cycles nodes, Maya's Hypershade materials, Unreal's Material Expression graphs, or Unity's Shader Graph: a particular and popular way of assembling a shader.

I'd explore removing the WebGLRenderer coupling with various materials. Perhaps some generic solution could be made that turns ShaderMaterial into various materials with a friendlier interface. (defines a bunch of getters and setters to uniforms and such)?

In my opinion the node materials do this rather well, and give a degree of flexibility that would be very hard to match in a fixed interface. #8278 has been open for a long time, stalled (if I were to guess) because choosing the right API is quite difficult. But other ideas are welcome, and in any case I am not suggesting we remove the ShaderMaterial class. TangramJS's declarative shaders are a nice example of a different approach, although they're specifically for map visualization.

If I put my tech artist hat, i could create crazy effects, new shaders, extend other shaders, all while not having to code at all. A visual editor is very intuitive for this workflow.

We would certainly want to have a visual editor, yes. That could be in the three.js editor or outside the project entirely.

If I put on my programmers hat, i like to write GLSL. I don't actually understand how to work with nodeMaterial programatically, i still see a lot of glsl and includes here:

Probably this could improve if the code were in src/ and able to use our rollup GLSL plugin, but it is a fair point. I think 'forking' a material becomes easier — nothing is hidden in the renderer classes — but using existing materials directly is somewhat harder. That can be avoided with backward-compatible shortcuts, perhaps:

const MeshStandardMaterial = function ( ... ) {
  this.node = new StandardNodeMaterial();
};

// Proxy the legacy properties onto the underlying node graph.
Object.defineProperties( MeshStandardMaterial.prototype, {
  map: {
    set: function ( texture ) { this.node.color = new TextureNode( texture ); },
    get: function () { return this.node.color.value; }
  },
  // ...
} );
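Usage would then stay source-compatible (hypothetical, per the sketch above):

var material = new MeshStandardMaterial();
material.map = texture;      // internally: this.node.color = new TextureNode( texture )
console.log( material.map ); // the getter returns the original texture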

Could you link these GLTFLoader issues that are mentioned here, so that alternate explorations to NodeMaterial could be done?

Sure! There are certainly alternatives; your approach in #14166 is good. My own opinion is that nodes will be more flexible and easier to work with, while not preventing hotpatching or use of ShaderMaterial for those who want it.

@pailhead
Contributor

pailhead commented May 30, 2018

@donmccurdy

I just put together a quick example here:
http://dusanbosnjak.com/test/webGL/three-materials-extended/webgl_materials_extended_multiple_uvs.html

Dunno if this addresses it, but it could:

No per-map texture transforms

@pailhead
Contributor

pailhead commented May 30, 2018

Here it is with the same transform interface:
http://dusanbosnjak.com/test/webGL/three-materials-extended/webgl_materials_extended_multiple_uvs_props_transform.html

It's a bit hacky, but it's a lot less code than an entire framework :/

while not preventing hotpatching or use of ShaderMaterial for those who want it.

This unfortunately doesn't ring true to me; historically, various shader injection APIs were rejected in favor of NodeMaterial. It feels more like alternatives are being blocked on account of it :(

I salute the spirit of yolo-ing this, but urge caution: if it's been waiting for three years, perhaps it could wait a tiny bit more and not block other PRs :)


I think the code in the example looks a bit verbose, but conceptually it's very few simple steps:

  1. create the material
  2. declare some list of map names you want to extend
  3. compensate for onBeforeCompile not being named onBeforeParse (no GLSL is available in either the vert or the frag)
  4. make a utility function to replace the GLSL statements and wire the uniforms

The last step is the verbose part, but I think it would be less verbose with require() and ES6, and especially with a different API.

If getters and setters are not super expensive, I see no reason not to decouple WebGLRenderer from the "built-in" materials and keep the familiar interface and standard (GLSL). I'd like to see the three.js core get smaller.

@pailhead
Contributor

pailhead commented May 30, 2018

#8278 has been open for a long time, stalled (if I were to guess) because choosing the right API is quite difficult.

This bit makes me sad. I think I might be on a different page when it comes to ideas of how a process like this should work. I don't have much experience with this, so I might have different expectations 😄

I see three.js as a helpful tool for building your own framework to render specific things. It should be flexible enough to let one choose for themselves what the "right" API is. If something stalls like #8278 has, that yields no API at all. Even having some API, if not the right one, is better than that.

The tooling for this may also stall, as seen in #13198. This really demotivated me from doing any work, because I kind of disliked onBeforeCompile and didn't understand why it had to be treated as a constant. But at least I learned some regular expressions coming up with one of these examples 😄, and now I kind of like it more.

It's a shame that onBeforeCompile is being deprecated before it even had a chance to shine. At the least, it might be useful to link some third-party examples to the issues listed (I've done that) to explain how these can be solved currently, even going back a few versions. If a lot of things break, as @bhouston mentioned, users could use something in the meantime?

@donmccurdy
Re: decoupling materials from renderer

class MeshStandardMaterial extends THREE.ShaderMaterial {}

Object.defineProperties( MeshStandardMaterial.prototype, {
  map: {
    // set: function ( texture ) { this.node.color = new TextureNode( texture ); },
    set: function ( texture ) { this.uniforms.color.value = texture; },
    // get: function () { return this.node.color.value; }
    get: function () { return this.uniforms.color.value; }
  },
  // ...
} );

@bhouston
Contributor

@pailhead We have been maintaining our own three.js fork since 2016 because we needed #8278.

I do believe that we can implement MeshStandardMaterial/MeshPhysicalMaterial on top of NodeMaterial, thus providing backwards compatibility but a unified approach going forward.

I believe that bringing in NodeMaterial as a first-class citizen is probably a better idea than staying stuck on #8278. It is a better long-term solution.

Whenever there is a major change to three.js, you need to do it at the beginning of a release cycle. That is what happened when we replaced the animation system, for example (#6934). There is always a bunch of things that break, and you need a few weeks so that everyone notices what broke and fixes it before a release; otherwise it isn't fair to the three.js community.

@pailhead
Contributor

pailhead commented May 30, 2018

I think I'm still missing something. Why did you have to maintain a fork, and what would change for your fork when NodeMaterial becomes core?

Would #8278 just work out of the box, or would it be easier for you to add some third-party code that allows this to work? What is the change between now and some point in the future that makes life easier?

Why couldn't your fork have turned into something like #14174? Is it easy to switch to your fork for this feature? Is it the only feature in the fork? Why couldn't it be not a fork, but a file/class that one imports and then it just works?

It seems that this PR is a perfect example of this.

You have a fork, and you had a PR, and you had it working, but the GLTFLoader completely stalled. I think it's unfair to gun for NodeMaterial for this reason alone. One thing that's already third-party and in /examples (GLTFLoader) may need another thing from /examples (NodeMaterial); there's no need for /src anywhere here.

@bhouston

I appreciate your input. I'm really giving it my best shot to be open-minded, and am still convinced that there is something I'm missing. But without a good argument (or someone pointing out where my confusion is), I can't take this at face value :(

If #14174 gets refactored into its own file, Multi_UV_Material_Extension.js, and if it's less than 100 lines of code, what would be the arguments against having it as an example or optional utility? Is there some other approach to make the stuff from your fork available to GLTFLoader and other classes?

I'm also on Slack a lot, if anyone wants to discuss this offline (or over a beer in SF).

@donmccurdy
Collaborator Author

donmccurdy commented May 31, 2018

It's a shame that onBeforeCompile is being deprecated before it even had a chance to shine.

I am not suggesting we deprecate onBeforeCompile — it is a useful escape hatch, and will likely remain so with NodeMaterials. But if that is the only way of extending three.js materials, I don't think we end up with an API that will shine. Compare the complexity of implementing per-map UVs with NodeMaterial:

var material = new StandardNodeMaterial();
material.color = new THREE.TextureNode( texture, new THREE.UVNode( 1 ) );

If #14174 gets refactored to be it's own file Multi_UV_Material_Extension.js, if it's less than 100 lines of code, what would be the arguments against having it as an example or optional utility?

Having #14174 as an example would be fine, and it's good to see, but it does not fully solve the per-map UV issue. For example, I'd be hesitant to add it to GLTFLoader, since it produces materials that cannot be easily inspected or modified by the user. Suppose the user wants to add instancing to their glTF model; do they need recursive onBeforeCompile callbacks?

while not preventing hotpatching or use of ShaderMaterial for those who want it.

This unfortunately doesn't ring true to me, historically, various shader injection apis were rejected in favor of NodeMaterial. It feels more like alternatives are being blocked on account of this.

Personally I'd still consider shader injection APIs an escape hatch for special-case features, and would want something more idiomatic for important features, even if we weren't considering NodeMaterial. But in any case, it would be reassuring to confirm that shader injection still works with node materials — after material.build() has been called, you can modify material.vertexShader, material.fragmentShader, and material.uniforms. And material.onBeforeCompile is still there. So a reasonable sanity-check here would be to implement something that NodeMaterial does not support (say, instancing) on top of it.
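A minimal sketch of that sanity check (the chunk marker and the instanceOffset attribute are invented for illustration, and it assumes the generated vertex shader still contains the standard includes):

// After build(), the node material behaves like a ShaderMaterial and can be hotpatched.
var material = new THREE.StandardNodeMaterial();
material.build();
material.vertexShader = 'attribute vec3 instanceOffset;\n' + material.vertexShader.replace(
	'#include <begin_vertex>',
	'#include <begin_vertex>\n\ttransformed += instanceOffset;'
);
material.needsUpdate = true;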

@donmccurdy
Collaborator Author

Also, to put my vote in writing explicitly:

  • We should do a few more feasibility tests like this one, including verifying that shaders can still be hotpatched and extended happily. 😅
  • Assuming no major issues, NodeMaterial should either go in src/ or become a new build output, e.g. build/three-nodematerial.js, that can be added to projects atomically.
  • Eventually (perhaps not in the same release as the steps above) we simplify Material classes to assemble backward-compatible node graphs, keeping ShaderMaterial and onBeforeCompile around.

@pailhead
Contributor

pailhead commented May 31, 2018

Suppose the user wants to add instancing to their glTF model; do they need recursive onBeforeCompile callbacks?

I don't really like that part about onBeforeCompile (amongst some other things), but in this particular case, the channel thing would benefit from onBeforeParse and onBeforeCompile being part of the same API.

#13198, though, could be made into something that is easily combined with the rest of the stuff. This would be an incremental change, compared to NodeMaterial.

If the onBeforeRender removal gets merged, I think the next thing could be to try to refactor and combine various things with #13198.

@pailhead
Contributor

I think I have an issue with this:

var material = new StandardNodeMaterial();
material.color = new THREE.TextureNode( texture, new THREE.UVNode( 1 ) );

It shouldn't be compared to the code in onBeforeCompile and regular expressions.

It should be compared to this, I think, if that makes sense:

myMaterial.specularMap       //Texture
myMaterial.specularMapOffset //Vector2
myMaterial.specularMapRepeat //Vector2
myMaterial.specularMapRotate //Number
myMaterial.specularMapUpdateMatrix // function

I think the pattern of instantiating a lot of specific Nodes is something an average user might understand about as well as they understand GLSL. Even the example you gave could benefit from some simplified input. I found this, which seems to be a transform, but without the rotation; it's already a lot more code than the simple, familiar interface we already have (offset, repeat, rotation).

var offset = new THREE.FloatNode( 0 );
var scale = new THREE.FloatNode( 1 );
var uv = new THREE.UVNode();

var uvOffset = new THREE.OperatorNode(
	offset,
	uv,
	THREE.OperatorNode.ADD
);

var uvScale = new THREE.OperatorNode(
	uvOffset,
	scale,
	THREE.OperatorNode.MUL
);

var mask = new THREE.TextureNode( decalDiffuse, uvScale );

@donmccurdy mentioned this pull request May 31, 2018
@pailhead
Contributor

I think I'm wrapping my head around NodeMaterial a bit more, granted I haven't yet played with the code itself.

var material = new StandardNodeMaterial();
material.color = new THREE.TextureNode( texture, new THREE.UVNode( 1 ) );

In a visual editor, StandardNodeMaterial would be my root, the UVNode probably a leaf, the texture too, while TextureNode would have two inputs and one output into the material's color?

I think this part overlaps with the "hooks" mentioned with shader injection (both #11475 and #13198). If there were a way to easily insert a string containing GLSL whose only contract is to output something to some variable with some name, that would mimic the entire TextureNode branch from the example above. Looking at the source code, this makes more sense to me. As a programmer, I think how these nodes are assembled is much more obfuscated. It's a whole different story with a visual editor, though.

It took me a minute to figure out that normal maps exist here (but I'm still not sure how it works):
https://github.com/mrdoob/three.js/blob/dev/examples/js/nodes/materials/StandardNode.js
I think with this I need to know both a framework (NodeMaterial) and a bit of GLSL.

While, taking one look at this, it's immediately obvious what I need to do to change normal maps:
https://github.com/mrdoob/three.js/blob/dev/src/renderers/shaders/ShaderChunk/normal_fragment_maps.glsl
http://192.241.199.119:8080/dev/testwp/
With this I just need to know how GLSL works; it's fewer dependencies.

This code might make more sense:

var offset = new THREE.FloatNode( 0 );
var scale = new THREE.FloatNode( 1 );

var uv = new THREE.OperatorNode(
	new THREE.OperatorNode(
		offset,
		new THREE.UVNode(),
		THREE.OperatorNode.ADD
	),
	scale,
	THREE.OperatorNode.MUL
);

var mask = new THREE.TextureNode( decalDiffuse, uv );

Object.defineProperty( material, 'textureOffset', { get: () => offset.value, set: ( v ) => ( offset.value = v ) } )

How you do this as a dev is up to you, but the interface ends up being

material.textureOffset = new THREE.Vector2()
//vs
material.textureOffset.value = new THREE.Vector2()

@pailhead
Contributor

I think I missed my main point with the hooks up there.

If one could do this:

var myGLSLChunk = new THREE.TextureNode( texture, new THREE.UVNode( 1 ) ).build()

three.js could still use includes and just GLSL, and not care how it's generated?

@pailhead
Contributor

pailhead commented May 31, 2018

class GenericMaterial extends THREE.ShaderMaterial {
  constructor( params ) {
    super( params )
    // get/set for each uniform, based on its type
    Object.keys( params.uniforms ).forEach( name => this._wireUniformToParam( name ) )
  }
}

const OLD_LIB = require( 'three-oldschool-shader-templates' )
const NODE_LIB = require( '/examples/NodeMaterial' )

class StandardMaterial extends GenericMaterial {
  constructor( params ) {
    const template = StandardMaterial.TEMPLATE
    super( {
      uniforms: template.uniforms,
      vertexShader: template.vs,
      fragmentShader: template.fs
    } )
    this._setParams( params )
  }
}
StandardMaterial.setTemplate = template => ( StandardMaterial.TEMPLATE = template )
StandardMaterial.TEMPLATE = `???`

// templates for old conservatives
StandardMaterial.setTemplate( OLD_LIB.StandardMaterial )

// templates for young hip people :)
StandardMaterial.setTemplate( NODE_LIB.StandardNode.build() )

const myMaterial = new StandardMaterial()

myMaterial.roughness = 1
myMaterial.myCustomRoughnessSpatialNoiseSeed = 12345 // no .value etc.

The point is that myCustomRoughnessSpatialNoiseSeed would be the preferred interface and would automagically be available (see the sketch below). I think I've seen this in ShaderFX or some such package for 3D Studio Max: if I add some kind of float node, I get it automatically in the GUI.
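A minimal sketch of that auto-wiring idea (the helper is hypothetical, not part of three.js):

// Expose each uniform as a plain property, so `material.foo = 1`
// replaces `material.uniforms.foo.value = 1`.
function wireUniforms( material ) {
	Object.keys( material.uniforms ).forEach( function ( name ) {
		Object.defineProperty( material, name, {
			get: function () { return material.uniforms[ name ].value; },
			set: function ( v ) { material.uniforms[ name ].value = v; }
		} );
	} );
}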

This, I think, is an interesting point: does your StandardSGMaterial or StandardSGMaterialWithUVChannels need to go through another transformation step? WebGL compiles it, three.js parses it; does it really need to handle the JS -> string transformation at runtime too?

Ideally this would be

var m = new StandardMaterial({
   specGloss: true,
   perMapUvTransform: true,
   ...
})
m.glossMapOffset.set( 0.5, 0.5 )
m.glossMapRepeat.set( 2, 2 )
MyApp.extendMaterialWithResolutionEffect( m, GLOBAL_RESOLUTION )

var m = new MyCustomEffectMaterial( params )
m.inputFoo = 5
m.inputBar.set( 1, 2 )
m.setGlobalResolution( GLOBAL_RESOLUTION )

Both are ShaderMaterials with no opinions if the shader is hand written, or generated with some tool.

@pailhead
Contributor

pailhead commented Jun 3, 2018

@donmccurdy
I've made an example to tackle this:

Having #14174 as an example would be fine, and it's good to see, but it does not fully solve the per-map UV issue. For example, I'd be hesitant to add it to GLTFLoader, since it produces materials that cannot be easily inspected or modified by the user. Suppose the user wants *to add* instancing to their glTF model; do they need recursive onBeforeCompile callbacks?

If you don't want #14174

if you do want it.

User wants to add

This is a bit ambiguous. Someone who doesn't know GLSL and such can play with the example and just do this:

var mesh = new THREE.Mesh( new THREE.BufferGeometry(), new THREE.MeshStandardMaterial() )

decorateMaterialWithSpecGloss( mesh.material )
decorateMaterialWithPerMapTransforms( mesh.material )

decorateMaterialWithSimpleInstancing( mesh.material )
mesh.customDepthMaterial = decorateMaterialWithSimpleInstancing( new THREE.MeshDepthMaterial( { depthPacking: THREE.RGBADepthPacking } ) )

If the user wants to add their own custom effect, they would consult this.

@bhouston
Contributor

bhouston commented Jun 4, 2018

@mrdoob has made a great new Three.js release. If you want, now is the time to ask @mrdoob's permission to promote NodeMaterial and see what you can do with it. :)

@bhouston
Contributor

bhouston commented Jun 4, 2018

@donmccurdy see the above comment; now is the time to promote NodeMaterial to core, /cc @sunag

@pailhead
Contributor

pailhead commented Jun 4, 2018

@bhouston what makes now specifically better than any moment between now and 2015?

@bhouston
Contributor

bhouston commented Jun 4, 2018

@pailhead: that there appears to be critical mass shifting towards it. I think everything has its time, and that now would be a good time for NodeMaterials to move towards core.

@pailhead
Contributor

pailhead commented Jun 4, 2018

@bhouston

With all due respect, I think that is a subjective opinion and not an objective one :)

The critical mass seems to be formed by GLTFLoader stakeholders, and my (subjective) observation is that the need arose because of the sparsely documented onBeforeRender (few examples) and the even less documented onBeforeCompile (one example).

I'm confused why such a powerful feature as onBeforeCompile was introduced a year ago but not used for this. It could have solved this a long time ago. Why is it there in the first place?

You have insight into what's involved in keeping this in core vs. a fork. Do you have a thread like this where some findings are documented?

"This sucks when keeping the fork in sync, but it's better than using onBeforeCompile because of FOO."

So it feels more like a small, elite circle than real critical mass :) In the end, I'm not trying to derail this.

The important thing for me is to understand why this is happening, and currently I don't have a good understanding. Why are other proposals blocked because of this, and is it really worth trading the GLSL language for a JavaScript pattern (composition, objects)?

In the end I learned a bit of regular expressions, so all is not lost :)

@pailhead
Contributor

pailhead commented Jun 4, 2018

@bhouston

Either way, I think the critical-mass discussion should be had in #7522; this PR/thread is specific to GLTFLoader. It would be good to discuss the technical implications of having something like GLTFLoader coupled with /nodes, whether this is the only way it can be done, etc.

@donmccurdy
Collaborator Author

Replied in #7522; I'll close this PR, as it shouldn't be merged so long as NodeMaterial requires individual file inclusion this way.

@donmccurdy closed this Jun 4, 2018
@mrdoob
Owner

mrdoob commented Jun 16, 2018

@bhouston

@mrdoob has made a great new Three.js release. If you want, now is the time to ask @mrdoob's permission to promote NodeMaterial and see what you can do with it. :)

We're getting closer! 😀
