[WebGL2] Texture Internal Format discussion #14625
I'm thinking of adding
That works the same, so I am fine with it! If you want, I can split my code into a separate commit that handles that; I already changed WebGLTextures.
Nice, PR is welcome.
Continuing the texture discussion from this comment in #13717 ... @yoshikiohshima the motivation for the R16UI and R16I texture formats is that most medical images (CT, MR, X-Ray, etc.) are signed/unsigned 16-bit images (actually they are anywhere from 10 to 16 bits, but they require 2 bytes of data per pixel). Hence I would like to pass the images to my shader for rendering using these texture formats. Currently I encode the 2 bytes of data into 2 of the channels of an RGB texture and then reconstruct the 16-bit values in the shader. This works fine, but it would be nice to cut my texture memory footprint by a third by using these 2-byte texture formats directly.
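To make the workaround concrete, here is a minimal sketch (not code from the thread) of packing a 16-bit sample into two 8-bit channels and reconstructing it, mirroring the arithmetic a shader would perform; the function names are illustrative:

```javascript
// Illustrative sketch: split a 16-bit sample into two 8-bit channel values
// (e.g. R holds the high byte, G the low byte of an RGB texture texel).
function packUint16(value) {
  return [ (value >> 8) & 0xff, value & 0xff ];
}

// Reconstruction, as the fragment shader would do it: hi * 256 + lo.
function unpackUint16(hi, lo) {
  return hi * 256 + lo;
}

// For signed (two's-complement) 16-bit data, reinterpret the unsigned result.
function toInt16(u16) {
  return u16 >= 0x8000 ? u16 - 0x10000 : u16;
}
```

Using the formats directly (R16UI/R16I) removes this round-trip entirely and drops the third, unused channel.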
Did not have time to work on it, sorry; I was attending SIGGRAPH. I have already implemented this on my own fork, I just have to separate the work to make a PR against three.js. Not the most fun work, but I have to do it.
Instead of adding another parameter to the constructor, I would add a new method: texture.setInternalFormat( THREE.RGBA8UIFormat );
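The proposed setter pattern could look like the following sketch. This is illustrative only, not actual three.js source; the class and defaults are simplified assumptions:

```javascript
// Minimal sketch of the proposed setter pattern (not three.js source).
// internalFormat defaults to null, meaning "let the renderer derive it
// from format/type", so WebGL1 behavior is unchanged unless the user opts in.
class Texture {
  constructor(format) {
    this.format = format;
    this.internalFormat = null; // null => renderer derives it
  }
  setInternalFormat(internalFormat) {
    this.internalFormat = internalFormat;
    return this; // chainable, matching common three.js setter conventions
  }
}

const texture = new Texture('RGBA');
texture.setInternalFormat('RGBA8UI');
```

The advantage over a constructor parameter is that existing call sites are untouched and new per-texture state can be added later without growing the constructor signature.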
Because we lean toward
Yeah. That kind of API tends to be more resilient to API changes.
I see. Until we move to that style of API, only a custom internal format needs to be set up via
Any updates? @DavidPeicho
Sorry, I was on a long holiday. It will be done this weekend. @takahirox
Hey guys, just wondering if anything is happening with this issue ... we would really like to start taking advantage of this capability once it is implemented ... thanks!
Hey, I added the internal format support in a PR. You can follow the status here: #15121
I would like to open a discussion about internal formats, and how we could integrate them without breaking the current Three.js API.
So, right now, the internal format is derived from the format, like this.
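WebGL2 requires the sized internal format to be compatible with the (format, type) pair, so a default can be looked up. The sketch below shows such a derivation for a few combinations taken from the WebGL2 specification's table of valid format/type/internal-format triples; it is illustrative, not the actual three.js implementation:

```javascript
// Illustrative derivation of a sized internal format from (format, type),
// mirroring a few valid combinations from the WebGL2 spec. Not three.js code.
function deriveInternalFormat(format, type) {
  const table = {
    'RGBA:UNSIGNED_BYTE': 'RGBA8',
    'RGB:UNSIGNED_BYTE': 'RGB8',
    'RED:FLOAT': 'R32F',
    'RED_INTEGER:UNSIGNED_SHORT': 'R16UI', // the medical-imaging case above
    'RED_INTEGER:SHORT': 'R16I',
  };
  const key = format + ':' + type;
  if (!(key in table)) {
    throw new Error('No default internal format for ' + key);
  }
  return table[key];
}
```

An explicit internal-format attribute would simply bypass this lookup when the user has set one.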
What I did on my side is to add an attribute inside the Texture class which represents the internal format, but it is not given through the constructor; rather, it is set explicitly. I also added support for unsigned integer samplers, etc., and everything seems to work great.
What do you think about this change? I agree it's not the best regarding the API, but at least it is backward compatible with WebGL1.