
fix bug in validate_inputs #793

Merged
merged 15 commits into main on Jan 25, 2023

Conversation

chlebowa
Contributor

@chlebowa chlebowa commented Jan 2, 2023

validate_inputs captures validator messages even when a validator is not enabled;
this change rectifies that.

@chlebowa chlebowa marked this pull request as draft January 2, 2023 16:17
@github-actions
Contributor

github-actions bot commented Jan 2, 2023

Unit Tests Summary

  1 files   13 suites   11s ⏱️
150 tests  150 ✔️ passed  0 💤 skipped  0 ❌ failed
295 runs   295 ✔️ passed  0 💤 skipped  0 ❌ failed

Results for commit c523feb.

♻️ This comment has been updated with latest results.

@github-actions
Contributor

github-actions bot commented Jan 2, 2023


Code Coverage Summary

Filename                         Stmts    Miss  Cover    Missing
-----------------------------  -------  ------  -------  ------------------------------
R/dummy_functions.R                 74      61  17.57%   12-95
R/example_module.R                  18       9  50.00%   23-26, 29-33
R/get_rcode_utils.R                 52       2  96.15%   94, 99
R/get_rcode.R                      137      54  60.58%   74, 81, 86, 211-277
R/include_css_js.R                  24       0  100.00%
R/init.R                            22       2  90.91%   188-189
R/module_nested_tabs.R             130       7  94.62%   57, 96, 101-102, 148, 198, 227
R/module_tabs_with_filters.R        68       1  98.53%   162
R/module_teal_with_splash.R         33       2  93.94%   65, 77
R/module_teal.R                    111       5  95.50%   49, 52, 155-156, 180
R/modules_debugging.R               18      18  0.00%    37-56
R/modules.R                        101       8  92.08%   341-366
R/reporter_previewer_module.R       12       2  83.33%   18, 22
R/show_rcode_modal.R                20      20  0.00%    17-38
R/tdata.R                           41       2  95.12%   146, 172
R/utils.R                           13       0  100.00%
R/validate_inputs.R                 32       0  100.00%
R/validations.R                     62      39  37.10%   107-368
R/zzz.R                             11       7  36.36%   3-14
TOTAL                              979     239  75.59%

Diff against main

Filename               Stmts    Miss  Cover
-------------------  -------  ------  --------
R/validate_inputs.R      +11       0  +100.00%
TOTAL                    +11       0  +0.28%

Results for commit: e9b0ee2

Minimum allowed coverage is 80%

♻️ This comment has been updated with latest results

Contributor

@nikolas-burkoff nikolas-burkoff left a comment


Can we add a test or two please :)

@nikolas-burkoff nikolas-burkoff self-assigned this Jan 3, 2023
Aleksander Chlebowski added 2 commits January 3, 2023 13:08
Contributor

@nikolas-burkoff nikolas-burkoff left a comment


I'd adapt the test so that, as well as checking for warnings, it checks that you don't get a shiny validate error when everything is good in all non-disabled validators

But yup looks good

@@ -108,6 +108,11 @@ validate_inputs <- function(..., header = "Some inputs require attention") {
lapply(vals, checkmate::assert_class, "InputValidator")
checkmate::assert_string(header, null.ok = TRUE)

if (!all(vapply(vals, validator_enabled, logical(1L)))) {
warning("Some validators are disabled and will be omitted.", call. = TRUE)
Contributor

@nikolas-burkoff nikolas-burkoff Jan 3, 2023


Should we use a WARN from teal.logger here? An R warning may be a bit too much - but I'll let you decide

Contributor


You probably should stick to just one logging channel. Either use the built-in R messages and warnings or use the logger solution.

Contributor

@kpagacz kpagacz Jan 6, 2023


The reasoning is that logger output can be piped to a file, while R warnings cannot. If someone were to do that, would they care about this warning?

Contributor Author


teal.logger it is then. Any advice on how to write unit tests for it?

Contributor

@gogonzo gogonzo Jan 10, 2023


I've looked into this issue and the situation is not that simple, so let me know if you have a solution or a strong opinion about how to handle it:

It's not straightforward to test modules that emit warnings through the logger. options(TEAL.LOG_LEVEL) + capture_output in tests doesn't seem to be an optimal solution, and I've checked that logger has no function to "convert" logger conditions into R exceptions.

With @chlebowa we tried to find a way to temporarily (for tests) turn logger entries into R errors, warnings, and messages, and that is not easy either. Currently it is possible to access level and msg in layout_teal_glue_generator via register_logger, which would have to be called once again specifically for tests.

An alternative solution (not for today) is to use logger::log_errors and logger::log_warnings and change all logger::log_warn calls in teal packages to generic warning(). How would that work? log_errors uses globalCallingHandlers to intercept errors (their messages) and append them to the logger. Our modules could then use warning() and stop() normally, and their messages would be appended to the logger if one adds logger::log_errors at the top of the app.R file.
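An editor's sketch of that alternative, hedged: it assumes the logger package's log_warnings() helper and is not code from this PR.

```r
# Hedged sketch: route base R warnings into logger globally, so module code
# can keep using plain warning()/stop() while still feeding the log.
library(logger)

log_warnings()  # registers a global calling handler for warnings (R >= 4.0)

# raised as a normal R warning AND appended to the logger output:
warning("Some validators are disabled and will be omitted.")
```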

Contributor Author


As this is not a trivial issue, I have modified the tests to test console output as a temporary measure.
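A minimal sketch of what such a console-output test could look like (hedged: this is not the PR's actual test code, and it assumes logger's default console appender, which writes to stderr).

```r
# Hedged sketch of a console-output test for a logger warning.
library(logger)
library(testthat)

test_that("disabled validators produce a logged warning", {
  out <- capture.output(
    log_warn("Some validators are disabled and will be omitted."),
    type = "message"  # logger's console appender writes to stderr
  )
  expect_true(any(grepl("Some validators are disabled", out)))
})
```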

@chlebowa
Contributor Author

chlebowa commented Jan 3, 2023

I'd adapt the test so that as well as there being warnings, it checks that you don't get a shiny validate error in the case that everything is good in all non-disabled validators

But yup looks good

So the test for validate_inputs has only one validator, but the one for validate_inputs_segregated has one validator enabled and another one disabled, both passing. Is this what you mean?

I should probably add this to the former, too.

@chlebowa chlebowa marked this pull request as ready for review January 5, 2023 14:25
Aleksander Chlebowski added 2 commits January 9, 2023 11:09
Rewrites `validate_inputs*` functions into a single function.
`validate_inputs` will accept an arbitrary number of validators passed
directly or as a nested list. Lists are processed recursively.
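A hedged usage sketch of the rewritten interface described in this commit message; the validator names and rules below are hypothetical, and shinyvalidate's InputValidator is assumed.

```r
# Hedged sketch: validators may be passed directly or in a nested list,
# which validate_inputs flattens recursively before checking.
library(shiny)
library(shinyvalidate)

server <- function(input, output, session) {
  iv <- InputValidator$new()
  iv$add_rule("name", sv_required())
  iv$enable()

  iv_extra <- InputValidator$new()
  iv_extra$add_rule("count", sv_integer())
  iv_extra$enable()

  output$table <- renderTable({
    validate_inputs(iv, list(iv_extra), header = "Some inputs require attention")
    # ... build the table once all enabled validators pass ...
  })
}
```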
@chlebowa chlebowa removed the blocked label Jan 25, 2023
Contributor

@nikolas-burkoff nikolas-burkoff left a comment


A few small comments and a bit more work needed on the tests I think

@chlebowa chlebowa merged commit 3188157 into main Jan 25, 2023
@chlebowa chlebowa deleted the fix_validate_inputs@main branch January 25, 2023 15:00