AutoDiff? #736
Comments
Your stacktrace is cut off, so I have no idea what's calling …
Sorry, I didn't think earlier calls were relevant. I think I found the problem, though. I expected

```julia
julia> using ApproxFun

julia> function smoothness(f::Fun)
           s = f.space
           ∫ = DefiniteIntegral(s)
           ∂² = Derivative(s,2)
           result = -∫ * (∂² * f)^2
           result.coefficients[1]
       end
smoothness (generic function with 1 method)

julia> s = Chebyshev()
Chebyshev()

julia> ℓ(x) = smoothness(Fun(s,x))
ℓ (generic function with 2 methods)

julia> using ForwardDiff

julia> ForwardDiff.gradient(ℓ, randn(10))
10-element Vector{Float64}:
      0.0
      0.0
    -37.724699060965875
  11184.992849921739
  -1902.2097186821084
  87554.55322301129
  -5927.948750101238
 268579.1170432344
 -21127.338515818305
 630881.0238039208
```

But …
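For what it's worth, one way to sanity-check those gradient values is to compare them against a central finite-difference approximation. This is only a sketch: it assumes the `ℓ` from the session above is in scope, and `fd_gradient` is just an illustrative helper, not part of ApproxFun or ForwardDiff.

```julia
# Sketch: compare the ForwardDiff gradient of ℓ with central finite differences.
# Assumes `ℓ` from the REPL session above is already defined.
using ForwardDiff

function fd_gradient(f, x; h = 1e-6)
    g = similar(x)
    for i in eachindex(x)
        e = zeros(length(x))
        e[i] = h                               # perturb only coordinate i
        g[i] = (f(x .+ e) - f(x .- e)) / (2h)  # central difference
    end
    return g
end

x  = randn(10)
ad = ForwardDiff.gradient(ℓ, x)
fd = fd_gradient(ℓ, x)
maximum(abs, ad .- fd) / maximum(abs, fd)      # small relative error ⇒ AD and FD agree
```

If the relative error is small, the numbers ForwardDiff returns are at least consistent with the function being differentiated.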
I think …
Hi,
I'd like to use ApproxFun for Bayesian modeling. So I have this function:
This seems to work well; for example, I can do

But most samplers work in terms of autodiff, usually `ForwardDiff` by default. And I get this error:

Do you know of a way to make this work with autodiff? Could the `Float64` type constraint possibly be removed, or would this cause trouble elsewhere?
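Regarding the `Float64` question: `ForwardDiff.gradient` evaluates the target function with `ForwardDiff.Dual` numbers, which are subtypes of `Real` but not of `Float64`, so any method on the call path whose arguments are restricted to `Float64` will throw a `MethodError`. Below is a minimal, generic illustration of that pattern; it deliberately does not touch ApproxFun internals, and the function names are made up for the example.

```julia
# Toy illustration: a Float64-only signature rejects the Dual numbers that
# ForwardDiff passes in, while a Real-typed signature accepts them.
using ForwardDiff

f_strict(x::Vector{Float64})         = sum(abs2, x)   # too restrictive for autodiff
f_generic(x::AbstractVector{<:Real}) = sum(abs2, x)   # also accepts Vector{<:ForwardDiff.Dual}

x = randn(3)
ForwardDiff.gradient(f_generic, x)    # works, returns 2 .* x
# ForwardDiff.gradient(f_strict, x)   # MethodError: no method matching f_strict(::Vector{<:Dual})
```

If the `Float64` constraint in question can be relaxed to `Real` in the same way (or the element type made a type parameter), ForwardDiff's dual numbers should be able to flow through; whether that causes trouble elsewhere is, as the issue asks, the open question.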