
GLTFLoader: Remove dependency on 'THREE.DRACOLoader' global #15943

Merged · 2 commits into dev on Mar 11, 2019

Conversation

donmccurdy (Collaborator) commented Mar 11, 2019

Rolls back #15177.

I think it's better to avoid a dependency on the global THREE.DRACOLoader namespace. It will be an issue for our ES module refactoring (GLTFLoader can accept a DRACOLoader instance as a parameter, but doesn't depend on it directly) and it prevents users from providing a custom DRACOLoader instance, per #15249 and google/model-viewer#72.
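The shape of that change can be sketched generically. This is an illustrative sketch of the dependency-injection pattern the PR restores, not the actual three.js API; the class and method names below are stand-ins. The point is that the loader receives a decoder instance from the caller instead of reaching for a global.

```javascript
// Stand-in for a Draco decoder; real decoding is far more involved.
class FakeDracoDecoder {
  decode(buffer) {
    return { vertexCount: buffer.length };
  }
}

// Stand-in for GLTFLoader: the caller supplies the decoder via a
// setter, so the loader never references a global namespace.
class LoaderSketch {
  setDRACOLoader(dracoLoader) {
    this.dracoLoader = dracoLoader;
    return this;
  }

  parse(isCompressed, buffer) {
    if (!isCompressed) return { vertexCount: buffer.length };
    if (!this.dracoLoader) {
      throw new Error('Compressed asset but no DRACOLoader was provided');
    }
    return this.dracoLoader.decode(buffer);
  }
}
```

Because the decoder arrives through a setter rather than a global lookup, an ES-module build has no implicit global to resolve, and users are free to swap in a customized decoder instance.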

Fortunately, an easy workaround for users is to call THREE.DRACOLoader.getDecoderModule() in their own code when they know Draco is needed, which actually performs even better (it doesn't have to wait for the JSON header). I've documented this accordingly.
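In application code, the workaround looks roughly like the sketch below. The `THREE` stub exists only to make the snippet self-contained; in a real page `THREE.DRACOLoader` comes from the three.js example scripts, and `getDecoderModule()` actually downloads and caches the decoder.

```javascript
// Stub standing in for the real global, which a page would get from
// three.js plus examples/js/loaders/DRACOLoader.js (r103-era static API).
const THREE = {
  DRACOLoader: class {
    static setDecoderPath(path) { this.decoderPath = path; }
    static getDecoderModule() {
      // The real method fetches the WASM/JS decoder and caches it;
      // here we only record that the pre-fetch was kicked off.
      this.prefetched = true;
      return Promise.resolve({});
    }
  },
};

// Application code: when you know your assets use Draco, warm the
// decoder before (or in parallel with) the model request.
THREE.DRACOLoader.setDecoderPath('/examples/js/libs/draco/');
THREE.DRACOLoader.getDecoderModule();
```

Kicking off the decoder download this early is why the workaround can outperform the rolled-back behavior: there is no need to wait for the glTF JSON header before fetching the decoder.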

/cc @TyLindberg

donmccurdy changed the title from "Donmccurdy gltfloader draco global" to "GLTFLoader: Remove dependency on 'THREE.DRACOLoader' global" on Mar 11, 2019
mrdoob added this to the r103 milestone on Mar 11, 2019
mrdoob merged commit bee2b8a into dev on Mar 11, 2019
mrdoob (Owner) commented Mar 11, 2019

Thanks!

donmccurdy deleted the donmccurdy-gltfloader-draco-global branch on March 11, 2019 at 23:02
TyLindberg (Contributor) commented

Makes sense to me!

My initial thought when writing the PR was that it could improve loading behavior both for large models that use DRACO and for small models that don't. My concern was for scenarios where it is unknown whether loaded models will use DRACO compression or not. In those environments, downloading the decoder module for a small model that doesn't use DRACO might consume more network bandwidth than desired.

I imagine it's a smaller use case, so allowing for ES modules and custom DRACO loaders definitely takes priority. However, I'd be interested if you had any thoughts on how we could potentially still get that improved loading behavior without having to indiscriminately preload the decoding module?
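One possibility, sketched below as a hypothetical helper rather than an existing three.js API: when the app already fetches a .gltf JSON manifest, it can check the asset's extension lists and pre-fetch the decoder only when Draco is actually declared. `KHR_draco_mesh_compression` is the extension name that Draco-compressed glTF assets list.

```javascript
// Hypothetical helper: inspect a parsed .gltf JSON document and report
// whether it declares the Draco compression extension. (A binary .glb
// would first need its embedded JSON chunk extracted.)
function usesDraco(gltfJson) {
  const EXT = 'KHR_draco_mesh_compression';
  const required = gltfJson.extensionsRequired || [];
  const used = gltfJson.extensionsUsed || [];
  return required.includes(EXT) || used.includes(EXT);
}
```

An app could run this check right after fetching the JSON and trigger the decoder download only on a positive result, avoiding an indiscriminate preload.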

donmccurdy (Collaborator, Author) commented

My concern was for scenarios where it is unknown whether loaded models will use DRACO compression or not.

The main scenario like that, that I can think of anyway, is where the user provides a model interactively like in a drag-and-drop viewer. In that case I don't think latency is so sensitive that downloading the Draco library the first time is a problem.

However, I'd be interested if you had any thoughts on how we could potentially still get that improved loading behavior without having to indiscriminately preload the decoding module?

Once #15249 or something like it gets merged, we may be able to again. I think we'd need getDecoderModule to be an instance method, which isn't compatible with the current DRACOLoader but works with the proposal there.

TyLindberg (Contributor) commented

The main scenario like that, that I can think of anyway, is where the user provides a model interactively like in a drag-and-drop viewer. In that case I don't think latency is so sensitive that downloading the Draco library the first time is a problem.

Yeah, that makes sense. The other scenario I've experienced is a general 3D model viewer that determines the model to load based on a configuration JSON file. In that case, there are other methods for determining the compression on the model beforehand, but it will generally be a bit more involved than a solution that lives within the GLTFLoader.

Once #15249 or something like it gets merged, we may be able to again. I think we'd need getDecoderModule to be an instance method, which isn't compatible with the current DRACOLoader but works with the proposal there.

Cool, that sounds like a good plan to me! I'll make sure to keep an eye on that.
