Bug fix: Include forward_kwargs in self.model function calls in integrated_gradients.py #525
Conversation
Bug report + fix: forward_kwargs not used in all self.model calls in integrated_gradients.py (SeldonIO#524)
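For context, a minimal sketch of the reported bug, using hypothetical names rather than alibi's actual internals: the explainer accepts a forward_kwargs dict for extra model arguments (e.g. attention masks for transformer models), but some internal self.model calls drop it.

```python
# Minimal sketch of the bug; class and method names are hypothetical,
# not alibi's real API.
class ExplainerSketch:
    def __init__(self, model):
        self.model = model

    def predict_buggy(self, x, forward_kwargs=None):
        # Bug: forward_kwargs is accepted but silently dropped here.
        return self.model(x)

    def predict_fixed(self, x, forward_kwargs=None):
        # Fix: default None to an empty dict and forward it on every call.
        if forward_kwargs is None:
            forward_kwargs = {}
        return self.model(x, **forward_kwargs)
```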
@sanjass the tests are failing because we also pass forward_kwargs in alibi/alibi/explainers/integrated_gradients.py, lines 143 to 146 (at commit 6100135).
Given the repetition I think it would be good to split this out into a small utility function (or actually, it would be good to check if we could just substitute forward_kwargs with {} when it's None).
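A sketch of the utility-function idea, with a hypothetical helper name (the PR ultimately went with substituting {} in place, but a helper like this would remove the repetition):

```python
def _run_forward(model, x, forward_kwargs=None):
    # Hypothetical helper consolidating the repeated pattern:
    # normalize a None forward_kwargs to an empty dict, then call the model.
    forward_kwargs = {} if forward_kwargs is None else forward_kwargs
    return model(x, **forward_kwargs)
```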
replaced forward_kwargs with {} when it's None
I see, I pushed a change as follows: I replaced forward_kwargs with {} when it's None in all the functions. Didn't get to run the checks though.
replacing forward_kwargs None with {} in all functions
Thanks! Changes look good, I will rerun the IG notebooks (especially the transformers one) before merging.
added missing check "if forward_kwargs is None: forward_kwargs = {}"
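For reference, a sketch of the added guard written out over two lines; the one-line form quoted in the commit message would itself be flagged by flake8 (E701, multiple statements on one line):

```python
# The None-to-{} guard, split across two lines so flake8 (E701) is satisfied.
if forward_kwargs is None:
    forward_kwargs = {}
```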
Codecov Report

@@            Coverage Diff             @@
##           master     #525      +/-   ##
==========================================
- Coverage   82.34%   82.32%   -0.03%
==========================================
  Files          76       76
  Lines       10334    10350      +16
==========================================
+ Hits         8510     8521      +11
- Misses       1824     1829       +5
@sanjass just a suggestion, if you also do …
Yeah, I did something like that, but my previous fix introduced a different flake8 error and I didn't double-check, lol. Also, I'm editing directly in the GitHub editor. Anyway, hopefully this finally works.
@sanjass thank you for your contribution! And thanks for the extra review @RobertSamoilescu!