
2x performance regression due to 5e80211c3302b5e7b79b4f670498f5a68af6659b #2399

Closed · lassepe opened this issue Mar 18, 2024 · 2 comments · Fixed by #2400

lassepe commented Mar 18, 2024

I noticed a 2x performance regression in one of my projects when updating from 0.14.12 to 0.14.13, and git bisect points to 5e80211.

The project in which I noticed this is difficult to share, but it doesn't do anything crazy: it basically trains small MLPs (8 inputs, 3 hidden layers with hidden dimension 64, 10 outputs) with tanh activations; a comparable setup is sketched below.
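
A minimal sketch of a comparable workload (only the architecture follows the description above; the loss, optimiser, and batch size are assumptions):

using Flux

# Small MLP as described: 8 inputs, 3 hidden layers of width 64, 10 outputs, tanh.
model = Chain(
    Dense(8 => 64, tanh),
    Dense(64 => 64, tanh),
    Dense(64 => 64, tanh),
    Dense(64 => 10),
)

x = rand(Float32, 8, 128)   # batch of 128 dummy inputs
y = rand(Float32, 10, 128)  # dummy targets
opt_state = Flux.setup(Adam(), model)

# One training step; the regression shows up across many such steps.
loss, grads = Flux.withgradient(m -> Flux.mse(m(x), y), model)
Flux.update!(opt_state, model, grads[1])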

lassepe commented Mar 18, 2024

The offender seems to be this deprecation warning:

function _check_new_macro(x::T) where T
    Functors.isleaf(x) && return
    Base.depwarn("This type should probably now use `Flux.@layer` instead of `@functor`: $T", Symbol("@functor"))
end
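
For context, the message asks downstream code to switch macros. A minimal sketch with a hypothetical layer type:

using Flux

struct MyLayer              # hypothetical user-defined layer
    dense::Dense
end
(m::MyLayer)(x) = m.dense(x)

# Old style, which hits the warning path above:
# Functors.@functor MyLayer
# Replacement recommended by the message:
Flux.@layer MyLayer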

lassepe commented Mar 18, 2024

This is entirely due to the interpolation of `$T` into the string: the message is constructed eagerly on every call, even though the deprecation warning itself is suppressed by default.
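
A minimal sketch of why the eager interpolation is costly (hypothetical function names, not the actual fix):

using BenchmarkTools

# Arguments are evaluated eagerly, so the interpolated message is built on
# every call, even while depwarn output is disabled (the Julia default).
warn_interp(x::T) where T = Base.depwarn("use `Flux.@layer` instead of `@functor`: $T", Symbol("@functor"))
warn_const(x) = Base.depwarn("use `Flux.@layer` instead of `@functor`", Symbol("@functor"))

@btime warn_interp(rand(3))   # allocates a fresh string per call
@btime warn_const(rand(3))    # constant message, no per-call construction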

mcabbott added the bug label Mar 18, 2024
aviatesk pushed a commit to JuliaLang/julia that referenced this issue Mar 21, 2024
Does what it says. Doc-changes only.

Recent motivating example: FluxML/Flux.jl#2399