fix bug in validate_inputs
#793
Conversation
Code Coverage Summary
Diff against main
Results for commit: e9b0ee2
Minimum allowed coverage is
♻️ This comment has been updated with latest results
Can we add a test or two please :)
I'd adapt the test so that, as well as checking for warnings, it checks that you don't get a shiny validate error when everything is good in all non-disabled validators.
But yup, looks good.
R/validate_inputs.R
Outdated
@@ -108,6 +108,11 @@ validate_inputs <- function(..., header = "Some inputs require attention") {
   lapply(vals, checkmate::assert_class, "InputValidator")
   checkmate::assert_string(header, null.ok = TRUE)

+  if (!all(vapply(vals, validator_enabled, logical(1L)))) {
+    warning("Some validators are disabled and will be omitted.", call. = TRUE)
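A simplified, self-contained sketch of the guard added in this hunk (the plain lists below are stand-ins for `InputValidator` objects, and the anonymous `function(v) v$enabled` stands in for the package's `validator_enabled()` helper):

```r
# Hypothetical stand-ins for InputValidator objects: each carries an
# 'enabled' flag, mimicking what validator_enabled() would report.
validators <- list(
  list(id = "a", enabled = TRUE),
  list(id = "b", enabled = FALSE)
)

# Warn once if any validator is disabled, then drop the disabled ones
# so only enabled validators take part in validation.
enabled <- vapply(validators, function(v) v$enabled, logical(1L))
if (!all(enabled)) {
  warning("Some validators are disabled and will be omitted.", call. = TRUE)
}
validators <- validators[enabled]
```

Only the enabled validator (`id = "a"`) survives the filter, which is the behavior the diff introduces for the real objects.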
Should we use a WARN from teal.logger here? An R warning may be a bit too much, but I'll let you decide.
You probably should stick to just one logging channel. Either use the built-in R messages and warnings or use the logger solution.
The reasoning is that logger output can be piped to a file, whereas R warnings cannot. If someone were to do that, would they care about this warning?
`teal.logger` it is then. Any advice on how to write unit tests for it?
I've checked this issue here and the situation is not that simple for me, so let me know if you have some solution or a strong opinion about how to handle it:
- It's not straightforward to test modules which return warnings through the logger. `options(TEAL.LOG_LEVEL)` + `capture_output` in tests doesn't seem to be an optimal solution, and I've checked that there is no function in `logger` to "convert" logger conditions into exceptions.
- With @chlebowa we tried to find a way to temporarily (for tests) turn logger entries into R errors, warnings, and messages, and that's not so easy either. Currently, it's possible to access `level` and `msg` in `layout_teal_glue_generator` via `register_logger`, which would have to be called once again especially for the tests.
- An alternative solution (not for today) is to use `logger::log_errors` and `logger::log_warnings` and change all `logger::log_warn` calls in teal packages to generic `warning`. How would it work then? `log_errors` uses `globalCallingHandlers` to intercept errors (their messages) and append them to the logger. Our modules could then work normally with `warning` and `stop`, and their messages would be appended to the logger if one adds `logger::log_errors` at the top of the `app.R` file.
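The interception mechanism described in the last bullet can be sketched in base R without the `logger` package. This is only an illustration of the handler idea (`log_lines` and `with_warning_logging` are made-up names; the real `logger::log_warnings()` registers a handler globally via `globalCallingHandlers`, while this sketch scopes it with `withCallingHandlers` so it is easy to run in a test):

```r
# A vector standing in for the logger's sink.
log_lines <- character(0)

# Run an expression with a calling handler that appends each warning's
# message to the log sink and muffles the warning, while the expression
# itself continues to run normally.
with_warning_logging <- function(expr) {
  withCallingHandlers(
    expr,
    warning = function(w) {
      log_lines <<- c(log_lines, paste("WARN", conditionMessage(w)))
      invokeRestart("muffleWarning")
    }
  )
}

result <- with_warning_logging({
  warning("Some validators are disabled and will be omitted.")
  "validation continued"
})
```

The point is that code keeps calling `warning()` as usual; the handler decides where the message ends up, which is exactly what makes the `log_warnings`/`log_errors` route attractive.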
As this is not a trivial issue, I have modified the tests to test console output as a temporary measure.
So the test for … I should probably add this to the former, too.
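The temporary measure (asserting on captured console output) might look roughly like this; `emit_warning` is a hypothetical stand-in for the logger call that writes a formatted line to the console:

```r
# Hypothetical stand-in for a teal.logger WARN call that prints to stdout.
emit_warning <- function(msg) {
  cat(sprintf("WARN [app] %s\n", msg))
}

# Capture the printed line so a test can assert on its contents.
out <- capture.output(
  emit_warning("Some validators are disabled and will be omitted.")
)
```

As noted above this is not ideal (it tests text formatting rather than a condition), but it keeps the behavior covered until a better hook into the logger exists.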
Rewrites `validate_inputs*` functions into a single function. `validate_inputs` will accept an arbitrary number of validators passed directly or as a nested list. Lists are processed recursively.
Signed-off-by: Aleksander Chlebowski <[email protected]>
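The recursive list processing described above can be sketched as follows (`flatten_validators` is an illustrative name, not the function from this PR, and plain strings stand in for validator objects):

```r
# Accept an arbitrary number of items passed directly or inside nested
# lists, and flatten them into a single flat list by recursing into
# any list encountered.
flatten_validators <- function(...) {
  vals <- list(...)
  unlist(
    lapply(vals, function(x) {
      if (is.list(x)) do.call(flatten_validators, x) else list(x)
    }),
    recursive = FALSE
  )
}

# Mixed direct and nested arguments flatten to one level.
flat <- flatten_validators("a", list("b", list("c", "d")))
```

After flattening, the function body can treat all inputs uniformly, e.g. run the `checkmate::assert_class` check over the flat list.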
A few small comments and a bit more work needed on the tests I think
`validate_inputs` captures messages even if a validator is not enabled; this change rectifies it.