
Testthat #40

Closed · wants to merge 71 commits
873a23e
import {tibble}, {pillar} and {vctrs}
GregorDeCillia Sep 27, 2022
bc98e1d
+ custom vector class for schema uris
GregorDeCillia Sep 27, 2022
fd22f68
Merge branch 'master' into tibble_pkg
GregorDeCillia Sep 27, 2022
1ca4d78
sc_table_saved: normalize uri
GregorDeCillia Sep 27, 2022
88d0173
update namespaces
GregorDeCillia Sep 27, 2022
a348d58
update language param to sc_headers(), sc_schema_catalogue()
GregorDeCillia Sep 27, 2022
3b539eb
update links in docs
GregorDeCillia Sep 27, 2022
ff8ad8a
prep NEWS for v0.5.1
GregorDeCillia Sep 27, 2022
37d8460
don't use ide:run in docs
GregorDeCillia Sep 27, 2022
0147dc5
add clickable links to print.sc_schema()
GregorDeCillia Sep 27, 2022
cc318c9
mention COUNTs in docs for sc_table_custom()
GregorDeCillia Sep 30, 2022
2a7a963
document error: cell limit exceeded (400)
GregorDeCillia Sep 30, 2022
876450d
typo: sc_table_ciustom() -> sc_table_custom()
GregorDeCillia Sep 30, 2022
c530abf
add gallery of german example datasets
GregorDeCillia Sep 30, 2022
5a08065
add helper function for user agent
GregorDeCillia Sep 30, 2022
3eee18f
OGD: import de_desc and en_desc
GregorDeCillia Nov 21, 2022
ff13176
v0.5.0.1, update NEWS
GregorDeCillia Nov 23, 2022
e2b7c74
update STATcube links
GregorDeCillia Nov 24, 2022
e93560b
no @internal in sc_table_custom()
GregorDeCillia Nov 24, 2022
341254d
allow recodes in sc_table_custom()
GregorDeCillia Nov 25, 2022
0dac8e8
extend custom tables article with recodes
GregorDeCillia Dec 9, 2022
8d59925
add typechecks to sc_table_custom()
GregorDeCillia Dec 9, 2022
f157a40
require pillar 1.5.0
GregorDeCillia Dec 9, 2022
a0fbe4c
import tibble generics via @import
GregorDeCillia Dec 9, 2022
dc009c9
prep NEWS and README for upcoming release
GregorDeCillia Dec 9, 2022
6b63a60
allow json strings in sc_table()
GregorDeCillia Dec 16, 2022
d7e0833
improve print method for OGD resouces
GregorDeCillia Dec 16, 2022
38db405
cistomize print() for sc_schema_flatten()
GregorDeCillia Dec 16, 2022
5aa847a
devtools::document()
GregorDeCillia Dec 16, 2022
3fb82be
sc_table_custom(dry_run)
GregorDeCillia Dec 17, 2022
7ce363f
update custom tables article
GregorDeCillia Dec 17, 2022
4da0e91
fix usuage example
GregorDeCillia Dec 17, 2022
8568b1e
add "further reading" too sc_schema
GregorDeCillia Dec 17, 2022
079159d
+ crossreferences last_error, table_custom
GregorDeCillia Dec 18, 2022
694bc10
sc_table_custom.Rmd: show json for all requests
GregorDeCillia Dec 18, 2022
e0a1993
add more links to sc_table_custom.Rmd
GregorDeCillia Dec 18, 2022
00312de
update <h2> ids in sc_last_error.Rmd
GregorDeCillia Dec 18, 2022
adc3a83
mention #35 in NEWS
GregorDeCillia Dec 18, 2022
8701f0f
add support for iso dates
GregorDeCillia Dec 30, 2022
89755b4
v0.5.1
GregorDeCillia Jan 9, 2023
3432712
CI/CD: deploy dev branch
GregorDeCillia Jan 30, 2023
ea6bf24
upodate jenkins config
GregorDeCillia Jan 30, 2023
c49a649
devtools::spell_check()
GregorDeCillia Feb 20, 2023
33f3cf2
Merge branch 'tibble_pkg' of github.com:statistikat/STATcubeR into ti…
GregorDeCillia Feb 20, 2023
bff2954
Merge pull request #38 from statistikat/master
GregorDeCillia Feb 20, 2023
8b32c88
don't use data.frame()
GregorDeCillia Feb 23, 2023
93e74a1
forward server argument to sc_key()
GregorDeCillia Feb 27, 2023
d2d9992
fix printing of NAs in timestamps
GregorDeCillia Feb 27, 2023
bed0b8e
skip example if no key available
GregorDeCillia Feb 27, 2023
908ce3b
drop names in sc_recode
GregorDeCillia Feb 27, 2023
a629296
don't drop unused levels in ogd fields
GregorDeCillia Feb 27, 2023
d3fbd98
add print method fror ogd_id
GregorDeCillia Feb 27, 2023
b2fbe08
use "ogd_id" class in od_list()
GregorDeCillia Feb 27, 2023
47ddfcc
show progress in od_catalogue()
GregorDeCillia Feb 27, 2023
dcdd0c0
use ogd_id class in catalogue
GregorDeCillia Feb 27, 2023
cc90ae3
drop unused parameter
GregorDeCillia Feb 27, 2023
b09b4cd
simplify pkgdown setup
GregorDeCillia Feb 27, 2023
e23601c
add custom color palette for @examples
GregorDeCillia Feb 27, 2023
e95f5e4
use target=_blank for statcube links
GregorDeCillia Feb 27, 2023
737c1b4
build external links with devtools::document()
GregorDeCillia Feb 27, 2023
ae80404
allow node-types in sc_schema_flatten()
GregorDeCillia Feb 28, 2023
1567f61
check arg 'type' in sc_schema_flatten()
GregorDeCillia Feb 28, 2023
f7bdba4
+ sentence on empty folders in sc_schema()
GregorDeCillia Feb 28, 2023
72091bc
update sc_example("foregirn_trade.json")
GregorDeCillia Mar 1, 2023
a3f03a9
Merge branch 'tibble_pkg'
GregorDeCillia Mar 1, 2023
a6cef70
use output, not message in print method
GregorDeCillia Mar 1, 2023
99659dc
add url for bug reports
GregorDeCillia Mar 1, 2023
676507d
add unit tests
GregorDeCillia Mar 1, 2023
02e13f0
pass language to sc_table costructor
GregorDeCillia Mar 2, 2023
19c5bfa
add httptest to check error handling
GregorDeCillia Mar 2, 2023
b451ba6
fix: avoid positional argument
GregorDeCillia Mar 2, 2023
19 changes: 13 additions & 6 deletions DESCRIPTION
@@ -1,7 +1,7 @@
Type: Package
Package: STATcubeR
Title: R interface for the STATcube REST API and Open Government Data
Version: 0.5.0
Version: 0.5.1
Authors@R: c(
person("Gregor", "de Cillia", , "[email protected]", role = c("aut", "cre")),
person("Bernhard", "Meindl", , "[email protected]", role = "ctb"),
@@ -14,17 +14,24 @@ Description: Import data from the STATcube REST API or from the open data
License: GPL (>= 2)
URL: https://statistikat.github.io/STATcubeR,
https://github.com/statistikat/STATcubeR
BugReports: https://github.com/statistikat/STATcubeR/issues
Imports:
cli (>= 3.4.1),
httr,
jsonlite,
magrittr
Suggests:
magrittr,
pillar (>= 1.5.0),
vctrs
Suggests:
spelling,
data.tree,
pillar,
rappdirs,
xml2
xml2,
testthat (>= 3.0.0),
httptest
Encoding: UTF-8
LazyData: true
Roxygen: list(markdown = TRUE)
RoxygenNote: 7.2.1
RoxygenNote: 7.2.3
Language: en-US
Config/testthat/edition: 3
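The additions of `testthat (>= 3.0.0)` and `httptest` to `Suggests`, together with `Config/testthat/edition: 3`, wire the package up for third-edition testthat unit tests (hence the PR title). A minimal sketch of what such a test file could look like — `normalize_uri()` is a stand-in for package internals, not an actual STATcubeR function:

```r
# Hypothetical tests/testthat/test-normalize.R
library(testthat)

# Stand-in helper: lowercase a schema URI and strip a trailing slash.
normalize_uri <- function(uri) sub("/$", "", tolower(uri))

test_that("normalize_uri() lowercases and strips a trailing slash", {
  expect_equal(normalize_uri("HTTPS://Example.com/API/"), "https://example.com/api")
  expect_equal(normalize_uri("https://example.com/api"), "https://example.com/api")
})
```

The `httptest` dependency (see commit "add httptest to check error handling") serves the same goal: it lets API error paths be tested against recorded responses instead of live network calls.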
15 changes: 15 additions & 0 deletions NAMESPACE
@@ -2,16 +2,27 @@

S3method(as.character,od_json)
S3method(as.character,sc_json)
S3method(as.character,sc_schema_uri)
S3method(as.data.frame,sc_data)
S3method(format,pillar_shaft_ogd_file)
S3method(format,sc_schema_uri)
S3method(pillar_shaft,ogd_file)
S3method(pillar_shaft,sc_dttm)
S3method(pillar_shaft,sc_schema_type)
S3method(pillar_shaft,sc_schema_uri)
S3method(print,od_cache_file)
S3method(print,od_json)
S3method(print,od_revisions)
S3method(print,od_table)
S3method(print,sc_rate_limit_table)
S3method(print,sc_schema)
S3method(print,sc_schema_flatten)
S3method(print,sc_table)
S3method(print,sc_tibble_meta)
S3method(print,sc_url)
S3method(tbl_format_footer,sc_meta)
S3method(tbl_sum,sc_meta)
S3method(tbl_sum,sc_tibble)
export("%>%")
export(od_cache_clear)
export(od_cache_dir)
@@ -53,6 +64,7 @@ export(sc_last_error_parsed)
export(sc_rate_limit_schema)
export(sc_rate_limit_table)
export(sc_rate_limits)
export(sc_recode)
export(sc_schema)
export(sc_schema_catalogue)
export(sc_schema_db)
@@ -65,3 +77,6 @@ export(sc_tabulate)
importFrom(magrittr,"%<>%")
importFrom(magrittr,"%>%")
importFrom(magrittr,"%T>%")
importFrom(pillar,pillar_shaft)
importFrom(pillar,tbl_format_footer)
importFrom(pillar,tbl_sum)
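The new `pillar_shaft()`, `tbl_sum()` and `tbl_format_footer()` registrations follow {pillar}'s extension mechanism: one S3 method per vector class controls how that column renders inside a tibble. A simplified sketch of the pattern — the formatting here is illustrative, not the package's actual implementation:

```r
library(pillar)

# A timestamp vector tagged with an extra class, as the package does for "sc_dttm".
x <- structure(Sys.time() + c(0, NA, 60), class = c("sc_dttm", "POSIXct", "POSIXt"))

# pillar calls this method when rendering an "sc_dttm" column in a tibble.
pillar_shaft.sc_dttm <- function(x, ...) {
  out <- format(x, "%Y-%m-%d %H:%M")
  out[is.na(x)] <- "?"  # cf. commit "fix printing of NAs in timestamps"
  new_pillar_shaft_simple(out, align = "right")
}
```

The `pillar (>= 1.5.0)` requirement in DESCRIPTION matches this: `tbl_format_footer()` only became a documented extension point in later pillar releases.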
13 changes: 11 additions & 2 deletions NEWS.md
@@ -1,3 +1,12 @@
# STATcubeR 0.6.0

* Update print methods with the `{tibble}` package (#32)
* Add filters and other recodes to `sc_table_custom()` (#33)
* Add global option `STATcubeR.language` to override the default language
* `od_table()`: Add descriptions to `x$header` and `x$field(i)`
* Depend on cli >= 3.4.1 (@matmo, #35)
* Allow json strings in `sc_table()` (@matmo, #36)

# STATcubeR 0.5.0

* adapt `od_list()` to data.statistik.at update ([`2249b66`](https://github.com/statistikat/STATcubeR/commit/2249b6607cb822a4aac56c6258cbe967832171f1))
@@ -38,7 +47,7 @@
* Allow recodes of `sc_data` objects (#17)
* Better parsing of time variables (#15, #16)
* Use bootstrap 5 and `{pkgdown}` 2.0.0 for the website
* Allow export and import of open data using tar archves (#20)
* Allow export and import of open data using tar archives (#20)

# STATcubeR 0.2.4

@@ -85,7 +94,7 @@ This version finalizes #11
https://data.statistik.gv.at/

* new class `od_table` to get OGD data
* methods to tabulate reponses
* methods to tabulate responses
* caching
* four new pkgdown articles for `od_table()`, `od_list()`, `od_resource()` and `sc_data`

4 changes: 2 additions & 2 deletions R/browse.R
@@ -14,7 +14,7 @@ sc_browse <- function(server = "ext") {
sc_url(sc_url_gui(server), "home")
}

#' @describeIn sc_browse opens the preference menu with the api key
#' @describeIn sc_browse opens the preference menu with the API key
#' @examples
#' sc_browse_preferences()
#' @export
@@ -75,7 +75,7 @@ in_stat <- function() {
}

sc_url_gui <- function(server = "ext") {
if (server == "ext" && !in_stat())
if (server == "ext" && (!in_stat() || Sys.getenv("NOT_CRAN") != ""))
return("https://portal.statistik.at/statistik.at/ext/statcube/")
if (server == "test")
return("http://sdbtest:8081/statistik.at/wdev/statcube/")
8 changes: 4 additions & 4 deletions R/cache.R
@@ -10,7 +10,7 @@
#' Caching can be set up using environment variables. To set up a persistent cache
#' for both Open Data and the REST API, the following lines in `.Renviron` can
#' be used.
#' The paths in this example are only applicalble for UNIX-based operating systems.
#' The paths in this example are only applicable for UNIX-based operating systems.
#'
#' ```sh
#' STATCUBE_KEY_EXT = YOUR_API_KEY_GOES_HERE
@@ -23,7 +23,7 @@
#' Caching is not implemented for the
#' endpoints [sc_info()] and [sc_rate_limit_table()].
#' @rdname sc_cache
#' @param verbose print instuctions on how to set up caching persistently
#' @param verbose print instructions on how to set up caching persistently
#' via environment variables?
#' @name sc_cache
NULL
@@ -49,14 +49,14 @@ sc_cache_disable <- function() {
Sys.unsetenv("STATCUBE_CACHE")
}

#' @describeIn sc_cache informs wether the cache is currently enabled
#' @describeIn sc_cache informs whether the cache is currently enabled
#' @export
sc_cache_enabled <- function() {
Sys.getenv("STATCUBE_CACHE") != ""
}

#' @export
#' @param dir a chace directory
#' @param dir a cache directory
#' @describeIn sc_cache get/set the directory used for caching
sc_cache_dir <- function(dir = NULL) {
if (is.null(dir))
6 changes: 3 additions & 3 deletions R/error.R
@@ -1,15 +1,15 @@
#' Error handling for the STATcube REST API
#'
#' @description
#' In case API requests are unsuccessfull, `STATcubeR` will throw errors
#' In case API requests are unsuccessful, `STATcubeR` will throw errors
#' to summarize the httr error type and its meaning.
#' Requests are considered unsuccessfull if one of the following applies
#' Requests are considered unsuccessful if one of the following applies
#' * The response returns `TRUE` for `httr::http_error()`.
#' * The response is not of type `"application/json"`
#'
#' In some cases it is useful to get direct access to a faulty response object.
#' For that purpose, it is possible to use [sc_last_error()] which will provide
#' the httr response object for the last unsuccessfull request.
#' the httr response object for the last unsuccessful request.
#' @return The return value from `httr::GET()` or `httr::POST()`.
#' @examplesIf sc_key_exists()
#' try(sc_table_saved("invalid_id"))
2 changes: 1 addition & 1 deletion R/key.R
@@ -48,7 +48,7 @@ sc_key_set <- function(key, server = "ext", test = TRUE) {
#' an error is thrown.
#' @export
sc_key_get <- function(server = "ext") {
if (!sc_key_exists())
if (!sc_key_exists(server))
stop("No STATcube key available. Set key with sc_key_set()")
invisible(Sys.getenv(sc_key_env_var(server)))
}
15 changes: 8 additions & 7 deletions R/od_cache.R
@@ -15,7 +15,7 @@
#' od_downloads()
#' @details
#' [od_cache_summary()] provides an overview of all contents of the cache through
#' a data.frame. It hasone row for each dataset and the following columns.
#' a data.frame. It has one row for each dataset and the following columns.
#' All file sizes are given in bytes
#' - **`id`** the dataset id
#' - **`updated`** the last modified time for `${id}.json`
@@ -39,7 +39,7 @@ od_cache_summary <- function(server = "ext") {
field <- substr(files[is_field], 1 + pos_underscore[is_field], nchar(files[is_field]) - 4)
id <- substr(files[is_field], 1, pos_underscore[is_field] - 1)
sizes_fields <- file.size(file.path(od_cache_dir(), files[is_field])) %>% split(id) %>% sapply(sum)
fields <- data.frame(id, field, stringsAsFactors = FALSE)
fields <- list(id = id, field = field)

files <- files[!is_field]
pos_underscore <- as.integer(gregexpr("_HEADER", files))
@@ -48,17 +48,18 @@
files <- files[!is_header]
id_data <- substr(files, 1, nchar(files) - 4)
all_ids <- unique(c(id_data, id_header, fields$id))
data.frame(
id = all_ids,
res <- data_frame(
id = all_ids %>% `class<-`(c("ogd_id", "character")),
updated = file.mtime(paste0(cache_dir, all_ids, ".json")),
json = file.size(paste0(cache_dir, all_ids, ".json")),
data = file.size(paste0(cache_dir, all_ids, ".csv")),
header = file.size(paste0(cache_dir, all_ids, "_HEADER.csv")),
fields = sizes_fields[match(unique(fields$id), all_ids)],
n_fields = match(fields$id, all_ids) %>% factor(seq_along(all_ids)) %>%
table() %>% as.integer(),
row.names = NULL, stringsAsFactors = FALSE
) %>% `class<-`(c("tbl", "data.frame"))
table() %>% as.integer()
)
class(res$updated) <- c("sc_dttm", class(res$updated))
res
}


32 changes: 20 additions & 12 deletions R/od_list.R
@@ -3,7 +3,7 @@
#' [od_list()] returns a `data.frame ` containing all datasets published at
#' [data.statistik.gv.at](https://data.statistik.gv.at)
#'
#' @param unique some datasets are pulbished under multiple groups.
#' @param unique some datasets are published under multiple groups.
#' They will only be listed once with the first group they appear in unless
#' this parameter is set to `FALSE`.
#' @param server the open data server to use. Either `ext` for the external
@@ -43,11 +43,10 @@ od_list <- function(unique = TRUE, server = c("ext", "red")) {
xml2::xml_find_all(".//a")

# ids
df <- data.frame(
category = "NA",
df <- data_frame(
category = rep("NA", length(el)),
id = el %>% xml2::xml_attr("aria-label"),
label = el %>% xml2::xml_text(),
stringsAsFactors = FALSE
label = el %>% xml2::xml_text()
)

ignored_labels <- c("[Alle \u00f6ffnen]", "[Alle schlie\u00dfen]",
@@ -67,7 +66,9 @@ od_list <- function(unique = TRUE, server = c("ext", "red")) {
df <- df[!(df$id %in% od_resource_blacklist), ]
rownames(df) <- NULL
attr(df, "od") <- r$times[["total"]]
df %>% `class<-`(c("tbl", "data.frame"))
class(df$id) <- c("ogd_id", "character")
class(df) <- c("tbl_df", class(df))
df
}

#' Get a catalogue for OGD datasets
@@ -95,7 +96,7 @@ od_list <- function(unique = TRUE, server = c("ext", "red")) {
#' |json |`list<od_json>`| Full json metadata
#'
#' The type `datetime` refers to the `POSIXct` format as returned by [Sys.time()].
#' The last column `"json"` containes the full json metadata as returned by
#' The last column `"json"` contains the full json metadata as returned by
#' [od_json()].
#'
#' @inheritParams od_table
Expand All @@ -120,7 +121,15 @@ od_catalogue <- function(server = "ext", local = TRUE) {
ids <- od_revisions(server = server)
}
timestamp <- switch(as.character(local), "TRUE" = NULL, "FALSE" = Sys.time())
jsons <- lapply(ids, od_json, timestamp, server)
jsons <- lapply(
cli::cli_progress_along(
ids, type = "tasks", "downloading json metadata files"),
function(i) {
od_json(ids[i], timestamp, server)
}
)
if (!local)
cli::cli_text("\rDownloaded {.field {length(ids)}} metadata files with {.fn od_json}")
as_df_jsons(jsons)
}

Expand All @@ -130,7 +139,7 @@ as_df_jsons <- function(jsons) {
}

descs <- sapply(jsons, function(x) x$extras$attribute_description) %>% paste0(";", .)
out <- data.frame(
out <- data_frame(
title = sapply(jsons, function(x) x$title),
measures = gregexpr(";F-", descs) %>% sapply(length),
fields = gregexpr(";C-", descs) %>% sapply(length),
@@ -145,12 +154,11 @@ as_df_jsons <- function(jsons) {
update_frequency = sapply(jsons, function(x) x$extras$update_frequency),
tags = I(lapply(jsons, function(x) unlist(x$tags))),
categorization = sapply(jsons, function(x) unlist(x$extras$categorization[1])),
json = I(jsons),
stringsAsFactors = FALSE
json = I(jsons)
)
out$modified <- parse_time(out$modified)
out$created <- parse_time(out$created)
class(out) <- c("tbl", class(out))
class(out$id) <- c("ogd_id", "character")
out
}

14 changes: 8 additions & 6 deletions R/od_resource.R
@@ -141,15 +141,14 @@ od_resource_parse_all <- function(resources, server = "ext") {
})
od <- lapply(parsed, attr, "od")

data.frame(
data_frame(
name = sapply(resources, function(x) x$name),
last_modified = lapply(od, function(x) x$last_modified) %>% do.call(c, .),
cached = lapply(od, function(x) x$cached) %>% do.call(c, .),
size = sapply(od, function(x) x$size),
download = vapply(od, function(x) x$download, 1.0),
parsed = sapply(od, function(x) x$parsed),
data = I(parsed %>% lapply(`attr<-`, "od", NULL)),
stringsAsFactors = FALSE
data = I(parsed %>% lapply(`attr<-`, "od", NULL))
)
}

@@ -171,9 +170,9 @@ od_resources_check <- function(json) {

od_normalize_columns <- function(x, suffix) {
if (!is.null(suffix)) {
col_indices <- c(1, 2, 2, switch(suffix, HEADER = 3, c(4, 3)))
col_indices <- c(1, 2, 2, switch(suffix, HEADER = 3, c(4, 3)), 5, 7)
col_names <- c("code", "label", "label_de", "label_en",
switch(suffix, HEADER = NULL, "parent"))
switch(suffix, HEADER = NULL, "parent"), "de_desc", "en_desc")
x <- x[, col_indices] %>% `names<-`(col_names)
x$label <- NA_character_
x$label_en <- as.character(x$label_en)
@@ -224,5 +223,8 @@ od_resource_all <- function(id, json = od_json(id), server = "ext") {
check_header(out$data[[2]])
out$data[[2]] %<>% od_normalize_columns("HEADER")
out$data[seq(3, nrow(out))] %<>% lapply(od_normalize_columns, "FIELD")
out %>% `class<-`(c("tbl", "data.frame"))
class(out$name) <- c("ogd_file", "character")
class(out$last_modified) <- c("sc_dttm", class(out$last_modified))
class(out$cached) <- c("sc_dttm", class(out$cached))
out
}
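A pattern that recurs throughout this diff: prepending a marker class (`ogd_id`, `ogd_file`, `sc_dttm`) onto a plain vector so that S3 dispatch picks up custom `print()`/`format()`/pillar methods, while the underlying data stays an ordinary character or time vector. A self-contained sketch with a made-up class name (`demo_id` is hypothetical; STATcubeR uses classes like `"ogd_id"` this way):

```r
ids <- c("OGD_krebs_ext_KREBS_1", "OGD_veste309_Veste309_1")
class(ids) <- c("demo_id", "character")

# Dispatch target for the marker class: a friendlier console rendering.
print.demo_id <- function(x, ...) {
  cat("<demo_id> vector with", length(x), "dataset ids\n")
  cat(paste0("- ", unclass(x)), sep = "\n")
  invisible(x)
}

print(ids)
# The underlying data is still a plain character vector:
unclass(ids)
```

Because only the class attribute changes, existing code that treats these columns as character vectors keeps working unchanged.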
14 changes: 7 additions & 7 deletions R/od_revisions.R
@@ -3,15 +3,15 @@
#' Use the `/revision` endpoint of the OGD server to get a list
#' of all datasets that have changed since a certain timestamp.
#' @param since (optional) A timestamp. If supplied, only datasets updated
#' later will be returned. Otherwise, all datasets are retured.
#' later will be returned. Otherwise, all datasets are returned.
#' Can be in either one of the following formats
#' * a native R time type that is compatible with `strftime()`
#' such as the return values of `Sys.Date()`, `Sys.time()` and `file.mtime()`.
#' * a string of the form `YYYY-MM-DD` to specify a day.
#' * a string of the form `YYYY-MM-DDThh:mm:ss` to specify a day and a time.
#' @param exclude_ext If `TRUE` (default) exclude all results that have
#' `OGDEXT_` as a prefix
#' @return a character verctor with dataset ids
#' @return a character vector with dataset ids
#' @inheritParams od_list
#' @examples
#' # get all datasets (including OGDEXT_*)
@@ -41,15 +41,15 @@ print.od_revisions <- function(x, ...) {
since <- attr(x, "since")
response <- attr(x, "response")
if (!is.null(since))
cli::cli_text("{.strong {length(x)}} changes between
cli::format_inline("{.strong {length(x)}} changes between
{.timestamp {attr(x, 'since')}} and
{.timestamp {response$date}}")
{.timestamp {response$date}}") %>% cat()
else
cli::cli_text("{.strong {length(x)}} datasets are available
({.timestamp {response$date}})")
cli::format_inline("{.strong {length(x)}} datasets are available ",
"({.timestamp {response$date}})\n") %>% cat()
if (length(x) > 0) {
y <- cli::cli_vec(x, list("vec-trunc" = 3))
cli::cli_text("{.strong ids}: {.emph {y}}")
cli::format_inline("{.strong ids}: {.emph {y}}") %>% cat()
}
invisible(x)
}
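The switch above from `cli::cli_text()` to `cli::format_inline() %>% cat()` (commit "use output, not message in print method") matters because cli's semantic helpers emit via the message stream, whereas `print()` methods are conventionally expected to write to stdout so that `capture.output()`, sinks, and knitr can see the text. A minimal illustration, assuming {cli} is installed:

```r
library(cli)

via_message <- function() cli_text("{.strong hello}")          # message stream
via_stdout  <- function() cat(format_inline("{.strong hello}"), "\n")  # stdout

# cli_text() output bypasses stdout, so capturing it yields nothing:
capture.output(via_message())
# format_inline() + cat() writes to stdout, so the text is capturable:
capture.output(via_stdout())
```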