TSL nodes Composability / Stacking / Extends ? #29995
Comments
String substitution is counterintuitive with Node systems; one of the goals of its creation is to avoid these types of hacks. Unlike a fixed code/pipeline, the Node system creates variables and code dynamically, so this code will be different with each modification and will not always match the strings you would try to substitute.
@sunag I think the fundamental use case is augmenting and adding new surface qualities on top of an existing material - eg a user provided one, one loaded through a library, one instantiated through three.js, etc. In my visualization work it's been common to take pre-made materials and add different "layers" of effects on them such as topographic lines, clipping (alphatest / transparency / discard), vertex displacement, and so on, blending them on top of the effects already provided by the pre-existing material. Previously you could do this by augmenting fragment or vertex strings, but as you say this wouldn't be viable any more.

I'm still not so familiar with the node material system and perhaps there's already a way, but intuitively I would expect to be able to "unhook" whatever is feeding into the color node (or any other), hook the old and new color effect into a blend node, and then hook that blend node back up to the color node. I'm also curious to hear how this might be done, though.
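To make the idea concrete, a rough sketch of that "unhook / blend / rehook" flow, assuming imports from 'three/tsl'; `material` is an existing NodeMaterial, `addTopoLines` is a placeholder effect, and falling back to `materialColor` when no colorNode is set is an assumption rather than an established pattern:

```js
import { materialColor, mix, float, color } from 'three/tsl';

// placeholder "new effect" - in practice this would be topographic lines, clipping, etc.
const addTopoLines = ( inputColor ) => mix( inputColor, color( 0x222222 ), float( 0.5 ) );

// "unhook" whatever currently feeds the color slot (fall back to the material's own color)
const previousColor = material.colorNode ?? materialColor;

// blend the old and new effects, then hook the result back up to the color slot
material.colorNode = mix( previousColor, addTopoLines( previousColor ), float( 0.5 ) );
```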
@gkjohnson said it all!
There are currently two main ways to extend a Material/Node. The first is through its inputs. Previously, to modify existing material code we needed to modify the shader by injecting strings; now all we need to do is add a node to the desired input using TSL. @brunosimon recently made a great explanation about it.

The management of the declarations and the code sequence is generated dynamically, which allows graphs to be easily moved and reused in different material processes. For example, a topographic lines effect could be created in a node function like Fn(). The TSL functions allow access to material and geometry properties, uniforms and attributes, including native code. Each function can generate its own variables and uniforms.

**inline displace**

```js
const displacementMap = texture( map/*, uv()*/ );
const displacementScale = uniform( 1 );

// custom displace
material.positionNode = positionLocal.add( normalLocal.mul( displacementMap.r.mul( displacementScale ) ) );
```

**using Fn**

```js
const displaceIt = Fn( ( [ displacement, scale = float( 1 ) ] ) => {

	return positionLocal.add( normalLocal.mul( displacement.mul( scale ) ) );

} );

const displacementMap = texture( map/*, uv()*/ );
const displacementScale = uniform( 1 );

// custom displace
material.positionNode = displaceIt( displacementMap.r, displacementScale );
```

**using Fn with embedded uniforms**

```js
const displaceIt = Fn( () => {

	const displacementMap = texture( map/*, uv()*/ );
	const displacementScale = uniform( 1 );

	// some other logic here

	return positionLocal.add( normalLocal.mul( displacementMap.r.mul( displacementScale ) ) );

} );

// custom displace
material.positionNode = displaceIt();
```

Another way would be to extend the Material classes. If you want to create a Material with a different lighting model, this is also possible by extending the classes and their setup methods. Nodes can also be extended to create more effects that require rendering manipulation.
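As an illustration of that second route, a minimal sketch (not from the original comment) that packages the displacement above into a reusable material subclass; the import paths 'three/webgpu' and 'three/tsl' assume a recent release:

```js
import { MeshStandardNodeMaterial } from 'three/webgpu';
import { texture, uniform, positionLocal, normalLocal } from 'three/tsl';

class DisplacedStandardMaterial extends MeshStandardNodeMaterial {

	constructor( displacementTexture, parameters ) {

		super( parameters );

		// expose the scale as a uniform so it can be tweaked at runtime
		this.displacementScaleUniform = uniform( 1 );

		const displacement = texture( displacementTexture ).r.mul( this.displacementScaleUniform );

		// feed the displaced position into the built-in vertex pipeline
		this.positionNode = positionLocal.add( normalLocal.mul( displacement ) );

	}

}
```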
Hey @sunag, thanks for this answer and the explanations, all of this is understood. However, as I tried to explain, I think there is one crucial case missing here that @gkjohnson explained better than I did previously. But I guess an example is better. In the legacy system, we could achieve this with a simple material extension:
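(The original snippet is not shown here; below is a minimal sketch of the kind of legacy onBeforeCompile string injection being described, with `tintColor` as a purely illustrative uniform.)

```js
const material = new THREE.MeshStandardMaterial();

material.onBeforeCompile = ( shader ) => {

	shader.uniforms.tintColor = { value: new THREE.Color( 0xff00ff ) };

	shader.fragmentShader = shader.fragmentShader
		// declare the extra uniform next to the built-in ones
		.replace( '#include <common>', '#include <common>\nuniform vec3 tintColor;' )
		// patch the final color after all lighting, right before the fog chunk
		.replace(
			'#include <fog_fragment>',
			'gl_FragColor.rgb = mix( gl_FragColor.rgb, tintColor, 0.5 );\n#include <fog_fragment>'
		);

};
```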
This approach had several benefits: it could modify the final color output after ALL material calculations were done (before the FOG calcs).

How to achieve this effect in TSL?

There are multiple things here that I can think of, but:

- colorNode isn't viable because it executes before the lighting calculations and would need knowledge of the material properties (textures etc.)
- fragmentNode isn't suitable because it requires reimplementing all material calculations, which defeats the purpose of a lightweight color modifier, and it is also tied to the actual material properties
- in case a colorSpaceFragmentNode exists (which I am not aware of, and I'm not even sure this still matters in WebGPU), I'd still need to get access to the current node value of the colorSpaceFragmentNode to add the plugin before it, write the result into the computed value, and pass it on to the next node

Which approach would best align with TSL's architecture while maintaining the extend simplicity of the legacy system?

I bumped into this problem while porting an entire code-base of 200+ shaders to TSL, and ran into trouble on 90% of the materials that extend built-in three.js materials. Most of the materials use simple extends like this example or complex ones, but also, most of the time, multiple extends stacked on top of each other to achieve complex shading while not being aware of the previous vertex or surface calculations.

I wish there were a way to queue nodes before / after other nodes and pass the output value from one to the next, so that each sequence could be modified by multiple nodes instead of one. For a true extend of a built-in material, it is not necessary to re-write a colorNode; what is needed is to extend a colorNode before or after it, without being aware of the actual content of the built-in colorNode and the parameters it uses (like maps etc.).

Another example: let's say you need to introduce an opacity fade with a discard when an object is too close to the camera, how do you proceed? By writing an opacityNode, but even writing an opacityNode causes trouble because it loses all the built-in opacity calculations. Again, this needs knowledge of the actual material and won't work straight away regardless of the actual material property, whereas a stacked opacityNode that retrieves the output value of the built-in opacity node could then process the new output.

Possible solution (or not): each node would accept an array of node functions, in which you could select e.g. a node.defaultColorNode that is filled properly per material, and to which I could add a new node before or after it to extend the behavior. Each stacked node would pass the output of its own function into the next one for chaining the computation. A tiny sketch of that idea follows below.
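Purely illustrative sketch of the proposed stacking idea; nothing here exists in three.js today, and `defaultColorNode`, `addTopoLines` and `fadeNearCamera` are hypothetical names:

```js
// chain an array of node functions so each one receives the previous output
function stackNodes( initialNode, nodeFns ) {

	return nodeFns.reduce( ( previous, fn ) => fn( previous ), initialNode );

}

// hypothetical usage, if the material exposed its built-in color graph:
// material.colorNode = stackNodes( material.defaultColorNode, [ addTopoLines, fadeNearCamera ] );
```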
Having used TSL intensively in production for 4 months, I have also felt this need. I think a good approach / quick fix at the moment would be:

This way we can extend them in an easier way and get the values previously returned.
The replacement is done for optimization reasons; once the user sets a custom node, the built-in code for that slot is no longer generated. You can add a discard inside a Fn().

Example:

```js
const myOpacity = Fn( () => {

	If( a.lessThan( b ), () => {

		Discard();

	} );

	return c;

} );

material.opacityNode = myOpacity();
```

To control the output of materials, we have the outputNode.

Example:

```js
material.outputNode = hue_shift( output, time );
```

Color space is always applied in post-processing, whether internally by the renderer or explicitly through a post-processing setup.

Thanks again to @Mugen87 for starting the documentation, and I will look into improving the TSL documentation as well; most of the issues currently are related to that.
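For a self-contained variant of the outputNode example above (hue_shift there is presumably user-defined), a sketch assuming imports from 'three/tsl'; the pulsing tint is only illustrative:

```js
import { Fn, mix, vec3, vec4, output, time } from 'three/tsl';

// receives the fully lit/composited color and returns a modified one
const pulseTint = Fn( ( [ inColor, t ] ) => {

	const pulse = t.sin().mul( 0.5 ).add( 0.5 ); // oscillates between 0 and 1

	const tinted = mix( inColor.rgb, vec3( 1.0, 0.0, 1.0 ), pulse.mul( 0.25 ) );

	return vec4( tinted, inColor.a );

} );

material.outputNode = pulseTint( output, time );
```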
Hey @sunag, thanks a lot for the answer.

Before setting the nodes: [screenshot]
After setting the nodes: [screenshot]

The left mesh material in the fiddle has a map property, but it is overridden when setting the colorNode.
Building a generic plugin for any material, regardless of the material's properties, is not possible at the moment. The only way to do that right now would be to go through all the properties of the material and manually re-write the built-in node code in order to finally extend it by hand, or to have a way to get access to the current built-in node's output value so it can be chained with another node for computation.

100% of the advanced users I know are using legacy code injections into built-in three.js materials and have made hundreds or thousands of materials that we could not port to TSL.
In this case, the injection could still be used with the material accessor nodes (materialColor, materialOpacity, etc.).

Example:

```js
material.colorNode = materialColor;
material.opacityNode = materialOpacity;
material.metalnessNode = materialMetalness;
// ..
```

This allows you to inject the properties defined in the Material at any point in the node graph, for example:

```js
material.colorNode = hue( materialColor.rgb, time );
```

Basically the node name uses the corresponding material property name as a suffix (material + color → materialColor). List of properties below:

three.js/src/nodes/accessors/MaterialNode.js, lines 394 to 433 in 0c45156
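Tying this back to the earlier near-camera fade example, a minimal sketch (assumed imports from 'three/tsl'; the fade distances are illustrative) that layers a distance fade and discard on top of whatever opacity the material already defines:

```js
import { Fn, If, Discard, materialOpacity, cameraPosition, positionWorld, distance, smoothstep } from 'three/tsl';

material.transparent = true;

material.opacityNode = Fn( () => {

	const dist = distance( positionWorld, cameraPosition );

	// 0 when closer than 2 units, 1 beyond 5 units
	const fade = smoothstep( 2.0, 5.0, dist );

	If( fade.lessThan( 0.01 ), () => {

		Discard();

	} );

	// keep the built-in opacity (map, .opacity, etc.) and modulate it
	return materialOpacity.mul( fade );

} )();
```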
That looks like the solution! Thanks a lot @sunag
Hey @sunag, in this material:

Observations:

- The normalNode rotates the normal, which works because we can see the shading rotating correctly
- The colorNode does not read the rotated normal (if you uncomment line 99 to rotate the colors, that would be the expected effect)

What is the relationship between materialNormal and normalNode?

For an extend of a built-in material, I guess we should expect to modify the normalNode, and that this would then propagate to the other built-in nodes, unless the normalNode runs in the fragment stage, which becomes expensive. Is there a way to transform the normals in the vertex stage, at geometry level, so they are then injected into the built-in nodes in the fragment stage?
Like here, an example of rotating cubes where a directional light affects the coloring output, but the normals are wrong since they are not rotated as well. Many thanks
In this case, you would have to rotate the normalLocal as well.

https://jsfiddle.net/qbck6Lg1/1/

```js
material.positionNode = Fn( () => {

	const pos = attribute( 'position', 'vec3' ).toVar();
	const offset = attribute( 'offset', 'vec3' ).toVar();

	const rotMtx = rotateY( time.add( hash( offset.x.add( offset.y.add( offset.z ) ) ) ) );

	normalLocal.assign( rotMtx.mul( normalLocal ) );

	return rotMtx.mul( pos ).add( offset );

} )();
```
Hey @sunag, thanks a lot for the replies, the normalLocal trick is working.

I bumped into another extend problem here: this is an override of the fog_pars_fragment chunk on the legacy renderer. It is a fog effect that first turns the current color into a 'fadeColor' and then transitions to the fogColor. The problem here is that it uses the current gl_FragColor.rgb value.

The approach to achieve this would definitely be a fogNode; the trouble is that (unless I missed something) I cannot get access to the "current color" written into the output variable at the point where it enters the fogNode. Is there any way to achieve this in the current system?
Description
Using the legacy system for shaders, we could replace parts of the shading with other code.
Example:
With TSL (Three.js Shading Language), the challenge is (unless I missed something, or could not find a solution in the wiki or examples):
I cannot find a way to extend/modify an existing node after it has been processed by the material's built-in computations (lighting, textures, etc.); the only way now is to override the colorNode entirely.
Two key technical limitations:
- The node isn't available immediately on material creation (demonstrated in the sketch below)
- No built-in mechanism to stack/chain node operations
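A small sketch of the first limitation, assuming a recent release exporting MeshStandardNodeMaterial from 'three/webgpu'; `someTexture` stands for any THREE.Texture:

```js
import { MeshStandardNodeMaterial } from 'three/webgpu';

const material = new MeshStandardNodeMaterial( { map: someTexture } );

// null - the map-driven color graph is only built internally later,
// so there is nothing on the material to wrap or extend at this point
console.log( material.colorNode );
```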
Solution
Potential solution approaches needed:
Additional context
No response