[IR] Evaluate at #2017
I have a question about learnable parameters. If the processed net is supposed to be trainable, such an optimization can change its training behaviour. Here is a speculative example of what a subgraph will become after folding.
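A hypothetical illustration of the concern (the node names and values are invented for this sketch, not taken from the original snippet):

```python
# Before folding: y depends on the trainable initializer w,
# so gradients flow into w during training.
#
#   w = Initializer(shape=[4])     # trainable weight
#   c = Constant(value=2.0)
#   y = Mul(w, c)
#
# After folding, if w is (wrongly) treated as a constant:
#
#   y = Constant(value=w_snapshot * 2.0)
#
# y no longer references w, so w receives no gradient updates.
```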
This is not the same from the perspective of training.
Yes. Constant folding is currently only used for inference models. For trainable models, it should be possible to mark parts of the model influenced by the trainable weights as not foldable.
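A minimal sketch of that idea, written against the plain onnx protobuf API rather than the IR (and ignoring subgraph attributes of If/Loop/Scan for brevity): taint every tensor reachable from a trainable initializer, and exclude the producing nodes from folding.

```python
import onnx

def non_foldable_nodes(model: onnx.ModelProto, trainable: set[str]) -> set[str]:
    """Names of nodes whose outputs depend on a trainable initializer."""
    tainted = set(trainable)   # tensor names influenced by trainable weights
    skip: set[str] = set()     # nodes the constant folder must leave alone
    for node in model.graph.node:  # ONNX graphs are topologically sorted
        if any(name in tainted for name in node.input):
            skip.add(node.name)
            tainted.update(node.output)
    return skip
```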
Is there a way to create initializers explicitly now? (I have searched as far as ir._core.Graph.initializers.) EDIT:
@justinchuby, is this code ok?
Only the rewriter supports this, if that's what you meant.
Well, I meant both ways, but for now it is about the rewriter.
Yes. You can use the numpy array semantics for it.
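A sketch of what that could look like. The exact API here is an assumption: it presumes ir.tensor, ir.Value.const_value, and Graph.register_initializer exist roughly as shown, which may differ between versions.

```python
import numpy as np
import onnxscript.ir as ir

def add_initializer(graph: ir.Graph, name: str, array: np.ndarray) -> ir.Value:
    """Hypothetical helper: register a numpy array as a graph initializer."""
    value = ir.Value(name=name)
    value.const_value = ir.tensor(array, name=name)  # wraps the numpy array
    graph.register_initializer(value)
    return value

# Numpy array semantics: the stored tensor reads back as an array.
#   w = add_initializer(graph, "w", np.ones((4, 4), dtype=np.float32))
#   np.asarray(w.const_value)
```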
Given an ir.Value, it would be nice to have a method to evaluate the constant value for it.
We can build the constant folder based on this, where we:
cc @gramalingam
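As a sketch of what such an evaluation hook could do (not a proposed API; it routes through onnx.reference.ReferenceEvaluator on a copy of the model, and assumes the value of interest can simply be re-exposed as a graph output):

```python
import onnx
from onnx import helper
from onnx.reference import ReferenceEvaluator

def try_evaluate_constant(model: onnx.ModelProto, value_name: str):
    """Hypothetical: return value_name as a numpy array if it is
    computable from constants alone, else None."""
    if model.graph.input:
        # Coarse check: any runtime input means the value may not be constant.
        return None
    folded = onnx.ModelProto()
    folded.CopyFrom(model)
    # Re-expose the intermediate value as the sole graph output so the
    # reference evaluator returns it.
    del folded.graph.output[:]
    folded.graph.output.append(helper.make_empty_tensor_value_info(value_name))
    return ReferenceEvaluator(folded).run(None, {})[0]
```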