
For a 0.17.0 release #508

Merged — 20 commits, Jan 26, 2021
Conversation

@ablaom (Member) commented Jan 26, 2021

Some code re-organization:

  • Import all model and measure traits from the new base package StatisticalTraits.jl, from which they are programmatically re-exported. This resolves "Decouple model traits from measure traits?" #495
  • Move some utilities, previously defined locally, into StatisticalTraits.jl
  • (breaking for users) The measures(matching(...)) facility is removed; the private info_dic method and the matching method move to MLJModels
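A programmatic re-export of this kind can be sketched as follows (illustrative only — not necessarily the exact mechanism MLJBase uses):

```julia
# Hypothetical sketch: re-export every public name of StatisticalTraits.
using StatisticalTraits

for trait in names(StatisticalTraits)
    trait === :StatisticalTraits && continue  # skip the module's own name
    @eval begin
        import StatisticalTraits: $trait
        export $trait
    end
end
```

This way, new traits added to StatisticalTraits.jl become available from the importing package without any change to its source.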

Also added: optimisations for models buying into the new data front-end #501. Note that this may increase the default memory footprint of MLJBase operations.
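The user-facing side of this change can be sketched as follows (a minimal sketch assuming the usual MLJ workflow; `model`, `X`, and `y` stand for any MLJ model instance and compatible data):

```julia
using MLJBase

# By default, machines now cache the reformatted/resampled user data:
mach = machine(model, X, y)              # cache=true is the default
fit!(mach)

# Opt out of caching (trading speed for a smaller memory footprint):
mach = machine(model, X, y, cache=false)
fit!(mach)

# evaluate accepts the same option, passed on to its internal machines:
evaluate(model, X, y,
         resampling=CV(nfolds=5),
         measure=rms,
         cache=false)
```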

ablaom and others added 20 commits, starting December 28, 2020 14:30:

  • Better error for bad call of transform on machines for Static models
  • add fit_only! logic that caches reformatted/resampled user-data
  • add test
  • minor comment change
  • "Bump version to 0.16.3"
  • resolve #499
  • add test
  • add tests for reformat front-end
  • oops
  • add model api tests for reformat front-end
  • tidy up
  • add test for reformat logic in learning network context
  • do not update state of a Machine{<:Static} on fit! unless hyperparams change
  • add tests
  • allow speedup buy-out with machine(model, args..., cache=false)
  • oops
  • have KNN models buy into reformat data front-end for better testing
  • introduce "back-end" resampling in evaluate!
  • implement data front-end on prediction/transformation side
  • update machine show
  • more tests; add cache=... to evaluate(model, ...)
  • more tests
  • make `cache` hyperparam of Resampler for passing onto wrapped machines
  • more tests
  • Composite models do not cache data by default (no benefit)
  • correct comment
  • bump [compat] MLJModelInterface = "^0.3.7" (essential)
  • Realize performance improvements for models implementing new data front-end
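For model implementers, "buying into" the data front-end means overloading reformat and selectrows from MLJModelInterface, so that user data is converted to the model's internal representation once, and resampling in evaluate! acts on the converted data. A sketch, assuming the MLJModelInterface 0.3.7 API referenced above (`SomeRegressor` and the transposed-matrix representation are illustrative):

```julia
import MLJModelInterface
const MMI = MLJModelInterface

# Hypothetical model type, for illustration only:
struct SomeRegressor <: MMI.Deterministic end

# Convert user data once into the model-specific representation;
# machines cache the result instead of re-converting on every fit!:
MMI.reformat(::SomeRegressor, X, y) = (MMI.matrix(X)', y)
MMI.reformat(::SomeRegressor, X) = (MMI.matrix(X)',)

# Row-subsample the *reformatted* data; this is what "back-end"
# resampling in evaluate! calls on each train/test fold:
MMI.selectrows(::SomeRegressor, I, Xmatrix, y) = (view(Xmatrix, :, I), y[I])
MMI.selectrows(::SomeRegressor, I, Xmatrix) = (view(Xmatrix, :, I),)
```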
@codecov-io commented:
Codecov Report

Merging #508 (a2eee18) into master (6328ebc) will increase coverage by 0.32%.
The diff coverage is 80.55%.


@@            Coverage Diff             @@
##           master     #508      +/-   ##
==========================================
+ Coverage   82.71%   83.04%   +0.32%     
==========================================
  Files          39       37       -2     
  Lines        2934     2914      -20     
==========================================
- Hits         2427     2420       -7     
+ Misses        507      494      -13     
Impacted Files                                  Coverage Δ
src/composition/learning_networks/machines.jl    92.24% <ø> (ø)
src/composition/models/pipelines.jl              98.52% <ø> (-0.09%) ⬇️
src/measures/measure_search.jl                   46.15% <ø> (+16.74%) ⬆️
src/machines.jl                                  83.22% <75.00%> (-0.80%) ⬇️
src/operations.jl                                76.19% <81.81%> (+1.19%) ⬆️
src/resampling.jl                                85.91% <81.81%> (-0.11%) ⬇️
src/MLJBase.jl                                  100.00% <100.00%> (ø)
src/composition/models/methods.jl                53.48% <100.00%> (+1.10%) ⬆️
src/measures/confusion_matrix.jl                 92.55% <100.00%> (ø)
src/measures/measures.jl                         89.47% <100.00%> (+1.71%) ⬆️
... and 3 more

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 6328ebc...a2eee18.

@ablaom ablaom merged commit 4ccfa15 into master Jan 26, 2021
@ablaom (Member, Author) commented Jan 26, 2021

@JuliaRegistrator register

@JuliaRegistrator replied:

Comments on pull requests will not trigger Registrator, as it is disabled. Please try commenting on a commit or issue.
