
AutoDiff? #736

Closed
cscherrer opened this issue Feb 26, 2021 · 3 comments

@cscherrer

Hi,

I'd like to use ApproxFun for Bayesian modeling. So I have this function:

function smoothness(f::Fun)
	s = f.space
	∫ = DefiniteIntegral(s)
	∂² = Derivative(s,2)
	-∫ * (∂² * f)^2
end

This seems to work well, for example I can do

s = Chebyshev()
ℓ(nt::NamedTuple) = smoothness(Fun(s,nt.x))

julia> ℓ((x=randn(20),))
-6.885525125826199e6 on ApproxFunBase.AnyDomain()

But most samplers rely on autodiff, usually ForwardDiff by default, and I get this error:

julia> chain = initialize!(DynamicHMCChain, ℓ, tr)
ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#28#29"{LogDensityProblems.TransformedLogDensity{TransformVariables.TransformTuple{NamedTuple{(:x,), Tuple{TransformVariables.ArrayTransform{TransformVariables.Identity, 1}}}}, typeof(ℓ)}}, Float64}, Float64, 10})
Closest candidates are:
  (::Type{T})(::Real, ::RoundingMode) where T<:AbstractFloat at rounding.jl:200
  (::Type{T})(::T) where T<:Number at boot.jl:760
  (::Type{T})(::AbstractChar) where T<:Union{AbstractChar, Number} at char.jl:50
  ...
Stacktrace:
  [1] convert(#unused#::Type{Float64}, x::ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#28#29"{LogDensityProblems.TransformedLogDensity{TransformVariables.TransformTuple{NamedTuple{(:x,), Tuple{TransformVariables.ArrayTransform{TransformVariables.Identity, 1}}}}, typeof(ℓ)}}, Float64}, Float64, 10})
    @ Base ./number.jl:7
  [2] convert(#unused#::Type{Float64}, f::Fun{ConstantSpace{ApproxFunBase.AnyDomain, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#28#29"{LogDensityProblems.TransformedLogDensity{TransformVariables.TransformTuple{NamedTuple{(:x,), Tuple{TransformVariables.ArrayTransform{TransformVariables.Identity, 1}}}}, typeof(ℓ)}}, Float64}, Float64, 10}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{LogDensityProblems.var"#28#29"{LogDensityProblems.TransformedLogDensity{TransformVariables.TransformTuple{NamedTuple{(:x,), Tuple{TransformVariables.ArrayTransform{TransformVariables.Identity, 1}}}}, typeof(ℓ)}}, Float64}, Float64, 10}}})
    @ ApproxFunBase ~/.julia/packages/ApproxFunBase/WeoCn/src/Spaces/ConstantSpace.jl:70
  [3] setindex!(A::Vector{Float64}, x::Function, i1::Int64)
    @ Base ./array.jl:839

Do you know of a way to make this work with autodiff? Could the Float64 type constraint possibly be removed, or would this cause trouble elsewhere?
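
To make that concrete, here is a minimal sketch of what frame [3] in the trace appears to boil down to (buf is an illustrative stand-in, not a variable from LogDensityProblems): the log-density value is stored into a Vector{Float64}, so the constant Fun returned by smoothness has to be converted to a Float64. That conversion appears to go through when the coefficients are Float64, but under ForwardDiff they are Dual numbers and convert(Float64, ::Dual) has no method, which is the MethodError above.

using ApproxFun

s = Chebyshev()
val = smoothness(Fun(s, randn(20)))  # a constant Fun, not a Float64
buf = Vector{Float64}(undef, 1)      # illustrative stand-in for the sampler's storage
buf[1] = val                         # same setindex!/convert path as frame [3]; this is
                                     # what breaks once the coefficients are Dual numbers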

@dlfivefifty
Member

Your stacktrace is cut off, so I have no idea what's calling setindex!. Likely the array type is wrong here.

@cscherrer
Author

Sorry, I didn't think earlier calls were relevant. I think I found the problem, though. I expected smoothness to return a float, but instead it returns a Fun that happens to be constant. For example, this works:

julia> using ApproxFun

julia> function smoothness(f::Fun)
               s = f.space
               ∫ = DefiniteIntegral(s)
               ∂² = Derivative(s,2)
               result = -∫ * (∂² * f)^2
               result.coefficients[1]
       end
smoothness (generic function with 1 method)

julia> s = Chebyshev()
Chebyshev()

julia> ℓ(x) = smoothness(Fun(s,x))
ℓ (generic function with 2 methods)

julia> using ForwardDiff

julia> ForwardDiff.gradient(ℓ, randn(10))
10-element Vector{Float64}:
      0.0
      0.0
    -37.724699060965875
  11184.992849921739
  -1902.2097186821084
  87554.55322301129
  -5927.948750101238
 268579.1170432344
 -21127.338515818305
 630881.0238039208

But result.coefficients[1] is probably very brittle. What's the right way to turn a constant Fun into a <: Real?

@dlfivefifty
Member

I think Number(result)
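
A minimal sketch of that suggestion, assuming Number(::Fun) turns a constant Fun into its scalar value (under ForwardDiff that scalar is a Dual, which is still <: Real, so the gradient goes through):

using ApproxFun, ForwardDiff

function smoothness(f::Fun)
    s = f.space
    ∫ = DefiniteIntegral(s)
    ∂² = Derivative(s, 2)
    Number(-∫ * (∂² * f)^2)  # scalar result instead of a constant Fun
end

s = Chebyshev()
ℓ(x) = smoothness(Fun(s, x))
ForwardDiff.gradient(ℓ, randn(10))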
