Added staticSingleSample flag, and optimize mesh attributes by default #989

Merged 1 commit into Autodesk:dev on Dec 16, 2020

Conversation

@dgovil (Collaborator) commented Dec 9, 2020

This is a PR that adds an export optimization to convert attributes with a single time sample to static values.

This runs by default for common mesh attributes, as it reduces the complexity of the USD, especially for USD readers. For example, it helps when round-tripping USD files into Maya, since point-cached meshes will be able to import their UVs if they're static.

For all other prim attributes, this is gated behind a new flag called staticSingleSamples. This is off by default to prevent any unwanted surprises if someone was expecting a single keyframe to persist.

We find that, even with the files generated by the included test, a single attribute results in a 32-byte savings for a .usda file and a 67-byte savings for a .usdc file (due to empty padding in usdc). This scales roughly linearly with the complexity of the file, so it could result in reasonable size reductions overall. As we generate lots of .usdz files for distribution over the web, this flag is beneficial to us, and we think it would benefit third parties as well.
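
To illustrate the core idea, here is a minimal sketch using the USD Python API. This is not the maya-usd exporter code; collapse_single_samples is a hypothetical helper showing the effect on an already-authored prim.

```python
from pxr import Usd

def collapse_single_samples(prim):
    """Hypothetical helper (not this PR's implementation): convert attributes
    that carry exactly one time sample into a static, default-time value."""
    for attr in prim.GetAttributes():
        times = attr.GetTimeSamples()
        if len(times) == 1:
            value = attr.Get(times[0])
            attr.Set(value)              # author the value at the default time
            attr.ClearAtTime(times[0])   # drop the now-redundant time sample
```

In the exporter this happens as part of export rather than as a post-process, but the net effect on the resulting file is the same: readers see a plain static value instead of a one-sample animation.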

@kxl-adsk added the import-export (Related to Import and/or Export) label Dec 10, 2020
@kxl-adsk left a comment

The idea of converting into default time values makes sense to me. Keeping this as an option is a good call as well. Thank you.

@kxl-adsk commented Dec 10, 2020

@dgovil I looked at the failures; it may be something on our end. I saw the same test (testUsdExportAsClip) fail yesterday for another PR. Let's ignore it for now while we investigate.

@kxl-adsk commented

Actually, I just recalled that we have this test (testUsdExportAsClip) filtered out on other platforms. @mattyjams you mentioned it failing randomly for you guys too?

@mattyjams (Contributor) commented

> Actually, I just recalled that we have this test (testUsdExportAsClip) filtered out on other platforms. @mattyjams you mentioned it failing randomly for you guys too?

Hmm. I know that there was a fix made in core USD that addressed this for us, and I haven't seen it crop up again when running tests. That change should be in core USD 20.11:
PixarAnimationStudios/OpenUSD@5226235

@kxl-adsk commented

> Hmm. I know that there was a fix made in core USD that addressed this for us, and I haven't seen it crop up again when running tests. That change should be in core USD 20.11:
> PixarAnimationStudios/USD@5226235

We adopted 20.11 some time ago. Something to monitor.

@kxl-adsk left a comment

@dgovil I did more testing with this change, and it looks like it actually causes testUsdExportAsClip to fail every time. Without your change, this test never failed in my local environment; with it, it fails every time. Please investigate.

@dgovil commented Dec 15, 2020

@kxl-adsk I'll take a look... do you have any failure details beyond what the build logs provide that might help me debug? It's odd that Windows and macOS are unaffected.

I'll see if I can spin up a Linux VM to test it on. What Linux distro is used here?

For reference (so people don't need to go log spelunking), this is the traceback from the logs:

 Traceback (most recent call last):
      File "/home/S/jenkins/workspace/ecg-mayausd-branch-preflight-python3-linux/build/RelWithDebInfo/test/lib/usd/translators/testUsdExportAsClip.py", line 85, in testExportAsClip
        self._ValidateNumSamples(stage,'/world/pCube1', 'points',  1)
      File "/home/S/jenkins/workspace/ecg-mayausd-branch-preflight-python3-linux/build/RelWithDebInfo/test/lib/usd/translators/testUsdExportAsClip.py", line 45, in _ValidateNumSamples
        self.assertTrue(Gf.IsClose(attr.GetNumTimeSamples(), expectedNumSamples, 1e-6))
    AssertionError: False is not true

@kxl-adsk commented

I was actually testing it on Windows... with different Maya and USD versions. It failed for all of them.

Don't have the log at hand now...will post ASAP.

@dgovil commented Dec 15, 2020

Thanks. That might be something to be concerned about for the CI as well then, since the CI logs say Windows passed.

@kxl-adsk commented

We have this particular test filtered out on CI for Windows and macOS. It was failing randomly when we switched to running tests in parallel, so we kept it off there. Linux runs tests serially, which is why we never turned it off on Linux.

Given Matt's comments (#989 (comment)), it looks like we could try to re-enable this test now.

@kxl-adsk commented

Verbose output of the test doesn't add any more data than what you already posted above, but just for the sake of completeness, here is what I get on Windows:

33: F
33: ======================================================================
33: FAIL: testExportAsClip (testUsdExportAsClip.testUsdExportAsClip)
33: ----------------------------------------------------------------------
33: Traceback (most recent call last):
33:   File "testUsdExportAsClip.py", line 85, in testExportAsClip
33:   File "testUsdExportAsClip.py", line 45, in _ValidateNumSamples
33: AssertionError: False is not true
33:
33: ----------------------------------------------------------------------
33: Ran 1 test in 1.113s
33:
33: FAILED (failures=1)
1/1 Test #33: testUsdExportAsClip ..............***Failed    5.92 sec

@dgovil commented Dec 15, 2020

Thanks. I can reproduce here now as well, when running the test individually.
I think it should be easy enough to update the test to accommodate the changes in this PR while keeping the behavior checks of the test itself.

I'll update this PR later this week with the changes.

@dgovil force-pushed the static_single_samples branch from d90f4fd to e0c5140 on December 15, 2020 at 05:40
@dgovil commented Dec 15, 2020

Hi @mattyjams, I'm just wondering if you can shed some light on what's going on here.
I've updated the test to expect 0 samples, as is the result of this PR's changes (since single time samples are converted to static), but the resulting clips file is tripping up the test in a way that doesn't make sense to me.

testUsdExportAsClipOutput.zip

The specific error it's giving is for /world/pCube3 :: points:

# AssertionError: different values found on frame: 15
#                 non clip: [(-0.5, -0.5, 0.5), (0.5, -0.5, 0.5), (-0.5, 0.90590537, 0.5), (0.5, 0.5, 0.5), (-0.5, 0.5, -0.5), (0.5, 0.5, -0.5), (-0.5, -0.5, -0.5), (0.5, -0.5, -0.5)]
#                 clips:    [(-0.5, -0.5, 0.5), (0.5, -0.5, 0.5), (-0.5, 0.5, 0.5), (0.5, 0.5, 0.5), (-0.5, 0.5, -0.5), (0.5, 0.5, -0.5), (-0.5, -0.5, -0.5), (0.5, -0.5, -0.5)]

If I look at result.usda, it shows that it switches from UsdExportAsClip_cube.010.usda to UsdExportAsClip_cube.015.usda at frame 15.

Both files should have identical values on that frame for pCube3 :: points:

[(-0.5, -0.5, 0.5), (0.5, -0.5, 0.5), (-0.5, 0.90590537, 0.5), (0.5, 0.5, 0.5), (-0.5, 0.5, -0.5), (0.5, 0.5, -0.5), (-0.5, -0.5, -0.5), (0.5, -0.5, -0.5)]

However, if you look at the flattened.usda file, you can see that it introduces a frame 14.999999995559108 with the right value, then on frame 15 it seems to use the value from result.topology.usda instead of UsdExportAsClip_cube.015.usda. This is what seems to be causing the test to trip up.

As far as I can tell, the result.usda looks correct, and this looks like a libUSD bug to me? Would you be able to double-check the attached files for me before I file an issue against USD?
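
(For anyone who wants to poke at the attached files, a hypothetical snippet along these lines, using file names from the archive, shows the divergence at frame 15; it is an illustration only, not part of the test.)

```python
from pxr import Usd

# Hypothetical check (not part of the test): open the clip-based result and
# the individual clip layer, then sample pCube3's points at frame 15.
clip_stage = Usd.Stage.Open('result.usda')
clip_layer = Usd.Stage.Open('UsdExportAsClip_cube.015.usda')

for stage in (clip_stage, clip_layer):
    points = stage.GetPrimAtPath('/world/pCube3').GetAttribute('points')
    print(stage.GetRootLayer().identifier, points.Get(15))
```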

@dgovil commented Dec 15, 2020

Okay, I received some clarifying information:

> The default value you have for pCube3 in UsdExportAsClip_cube.015.usda is meaningless. Defaults are never read from clips: only time-sampled data is read. That's why the result is falling back to result.topology.usda.
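
In USD API terms, the distinction being described is between a default value and a time sample. A minimal sketch (not code from this PR or from maya-usd):

```python
from pxr import Gf, Sdf, Usd, Vt

# Minimal sketch of default value vs. time sample (illustrative only).
stage = Usd.Stage.CreateInMemory()
prim = stage.DefinePrim('/world/pCube3', 'Mesh')
points = prim.CreateAttribute('points', Sdf.ValueTypeNames.Point3fArray)

pts = Vt.Vec3fArray([Gf.Vec3f(-0.5, -0.5, 0.5)])
points.Set(pts)        # default value: never consulted when this layer is a value clip
points.Set(pts, 15.0)  # time sample at frame 15: this is what clips actually read
```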

So what I think I'll do is remove the automatic optimization of mesh attributes from this PR and gate it behind the flag like everything else. That way, people who want it on can turn it on, but things like clips don't break, since they expect the time-sampled behaviour.

I'll update the PR in a bit.

@dgovil force-pushed the static_single_samples branch from e0c5140 to ceca05e on December 15, 2020 at 19:16
@dgovil commented Dec 15, 2020

Okay, updated the code and tests accordingly. Both the new test and the existing clips test pass locally (along with the other tests that weren't affected). 🤞🏽 that the CI passes too.

@dgovil commented Dec 15, 2020

@kxl-adsk sorry, I just saw the CI clang-format issue on the comment line. I pushed a change to fix that now.

Adjust testUsdExportAsClip to handle single timesamples being converted to static values
@dgovil force-pushed the static_single_samples branch from c7e78e9 to cd75434 on December 15, 2020 at 19:28
@kxl-adsk added the ready-for-merge (Development process is finished, PR is ready for merge) label Dec 16, 2020
@kxl-adsk merged commit e087b8d into Autodesk:dev on Dec 16, 2020
@@ -0,0 +1,64 @@
#!/pxrpythonsubst
Contributor

Sorry @dgovil, one more thing for you. :)

I think this newly added test got left out of the CMakeLists.txt?

Collaborator Author

Yes, you're right, and thanks for catching that. I'd missed cherry-picking that out of our internal repo.
Fixed in: #1012


I would like us to have more visibility on code coverage. That way we could automatically detect if a PR is lowering coverage.

Of course, this should have been caught easily by checking the logs, but I would like to eliminate the possibility of errors and have some metrics to indicate how well a pull request's changes are tested.

Collaborator Author

Yeah, that's a good idea. It would have to be twofold though, because in this case the test existed, so coverage tools might erroneously consider it covered?

Perhaps a check that new test files are listed in the CMakeLists would help, since it's an easy mistake to make? I assume the project prefers not to just glob all the tests in the directory, right?


Have a look, for example, at lcov (and gcov) to get a better sense of what type of information you can acquire from your test suite. And maybe it wasn't explicit, but tests actually have to run in order to report coverage for new code blocks/methods.

Collaborator Author

Ah, I misunderstood what you were saying. Yes, that makes sense.
