@measure combinator - macros, TypeVars, and binops #9
Why is it necessary to include a free variable in the surface form, anyway? Is it supposed to be universally quantified in …? Also, I'd check hygiene once more. This looks a bit suspicious.
The problem is in this line: …
But you were right, and I wasn't 🤷
I suppose you should introduce an equivalent of `const $(gensym("basemeasuretype(Normal)")) = typeof((1 / sqrt2π) * Lebesgue(X))` and use that everywhere instead.
But how does it know what …?
The best way I see for solving this would probably be some …
I'd expect `((::Normal{P, X} ≪ ::M) where {P, X, M}) = M <: typeof((1 / sqrt2π) * Lebesgue(X))` to compile to a constant as well. It might even work if we define all three cases of …
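As a side note on that expectation: a subtype check whose operands are fixed by the method's static parameters does fold to a constant in Julia. A minimal Base-only sketch (`issub` is an ad-hoc name for illustration, not part of the package):

```julia
# A subtype check between types known from static parameters is
# resolved during compilation, so the body folds to a constant.
issub(::Type{M}, ::Type{T}) where {M, T} = M <: T

issub(Float64, Real)   # true
issub(String, Real)    # false
```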
Yep, that's it. For a while I had `@measure Normal(μ,σ) ≃ Lebesgue`, but that assumes the base measure is "primitive", and doesn't give a way to do e.g. scaled base measures.

It's intentionally unhygienic; gensyms gum up the works in this case.

Ah, right. Maybe the syntax should replace …

That's really interesting. Seems like it would work? And it just adds one method. I've been kind of assuming it will be really hard to get things fully transitive. I think I'd rather the user have the potential for a missing method here and there than a consistently large overhead while we generate lots of methods. Assuming it's easy enough to add more methods, anyway. But auto-adding an O(1) number of methods for a given case seems sensible.
This seems to work:

```julia
using MeasureTheory
using StatsFuns

##########################################
# Macro builds this part

struct Normal{P, X} <: MeasureTheory.AbstractMeasure{X}
    par::P
end

function Normal(nt::NamedTuple)
    P = typeof(nt)
    return Normal{P, eltype(Normal{P})}(nt)
end

Normal(; kwargs...) = Normal((; kwargs...))

(baseMeasure(μ::Normal{P, X}) where {P, X}) = (1 / sqrt2π) * Lebesgue(eltype(Normal{P}))

Normal(μ, σ) = Normal(; μ, σ)

((::Normal{P, X} ≪ ::typeof((1 / sqrt2π) * Lebesgue(eltype(Normal{P})))) where {P, X}) = true
((::typeof((1 / sqrt2π) * Lebesgue(eltype(Normal{P}))) ≪ ::Normal{P, X}) where {P, X}) = true

##########################################
# User adds this method

import Base
Base.eltype(::Type{Normal{P}}) where {P} = Real
```

Trying it out:

```julia
julia> Normal()
Normal{NamedTuple{(),Tuple{}},Real}(NamedTuple())

julia> baseMeasure(Normal())
MeasureTheory.ScaledMeasure{Float64,Lebesgue{Real},Real}(-0.9189385332046728, Lebesgue{Real}())

julia> Normal(0.1, 0.5)
Normal{NamedTuple{(:μ, :σ),Tuple{Float64,Float64}},Real}((μ = 0.1, σ = 0.5))

julia> Normal(μ=0.1, σ=0.5)
Normal{NamedTuple{(:μ, :σ),Tuple{Float64,Float64}},Real}((μ = 0.1, σ = 0.5))
```
Getting close!

```julia
julia> using MeasureTheory

julia> using StatsFuns

julia> @measure Normal(μ,σ) ≃ (1/sqrt2π) * Lebesgue(X)
≪ (generic function with 2 methods)

julia> import Base

julia> Base.eltype(::Type{Normal{P}}) where {P} = Real

julia> Normal()
Normal{NamedTuple{(),Tuple{}},Real}(NamedTuple())

julia> baseMeasure(Normal())
MeasureTheory.ScaledMeasure{Float64,Lebesgue{Real},Real}(-0.9189385332046728, Lebesgue{Real}())

julia> Normal(0.1, 0.5)
Normal{NamedTuple{(:μ, :σ),Tuple{Float64,Float64}},Real}((μ = 0.1, σ = 0.5))

julia> Normal(μ=0.1, σ=0.5)
Normal{NamedTuple{(:μ, :σ),Tuple{Float64,Float64}},Real}((μ = 0.1, σ = 0.5))
```

but

```julia
julia> Normal() ≪ baseMeasure(Normal())
ERROR: MethodError: no method matching ≪(::Normal{NamedTuple{(),Tuple{}},Real}, ::MeasureTheory.ScaledMeasure{Float64,Lebesgue{Real},Real})
Closest candidates are:
  ≪(::Normal{P,X}, ::MeasureTheory.ScaledMeasure{Float64,Lebesgue{Any},Any}) where {P, X} at /home/chad/git/Measures.jl/src/macros.jl:61
Stacktrace:
 [1] top-level scope at REPL[10]:1
```

Here's what it's generating:

```julia
julia> using MacroTools

julia> (@macroexpand @measure Normal(μ,σ) ≃ (1/sqrt2π) * Lebesgue(X)) |> MacroTools.prettify
quote
    struct Normal{P, X} <: MeasureTheory.AbstractMeasure{X}
        par::P
    end
    function Normal(nt::NamedTuple)
        P = typeof(nt)
        return Normal{P, eltype(Normal{P})}(nt)
    end
    Normal(; kwargs...) = Normal((; kwargs...))
    (baseMeasure(μ::Normal{P, X}) where {P, X}) = (1 / sqrt2π) * Lebesgue(eltype(Normal{P}))
    Normal(μ, σ) = Normal(; μ, σ)
    ((::Normal{P, X} ≪ ::typeof((1 / sqrt2π) * Lebesgue(eltype(Normal{P})))) where {P, X}) = true
    ((::typeof((1 / sqrt2π) * Lebesgue(eltype(Normal{P}))) ≪ ::Normal{P, X}) where {P, X}) = true
end
```

resulting in

```julia
julia> methods(≪)
# 2 methods for generic function "≪":
[1] ≪(::Normal{P,X}, ::MeasureTheory.ScaledMeasure{Float64,Lebesgue{Any},Any}) where {P, X} in Main at /home/chad/git/Measures.jl/src/macros.jl:61
[2] ≪(::MeasureTheory.ScaledMeasure{Float64,Lebesgue{Any},Any}, ::Normal{P,X}) where {P, X} in Main at /home/chad/git/Measures.jl/src/macros.jl:65
```
I'm baffled that this is accepted, but I fear it doesn't work as expected:

```julia
≪(::Normal{P,X}, ::MeasureTheory.ScaledMeasure{Float64,Lebesgue{Any},Any})
```

There's too many …

```julia
julia> g(x::X, ::typeof(Lebesgue(eltype(Vector{X})))) where {X<:Real} = ()
g (generic function with 1 method)

julia> methods(g)
# 1 method for generic function "g":
[1] g(x::X, ::Lebesgue{Any}) where X<:Real in Main at REPL[70]:1

julia> g(1, Lebesgue(Int))
ERROR: MethodError: no method matching g(::Int64, ::Lebesgue{Int64})
Closest candidates are:
  g(::X, ::Lebesgue{Any}) where X<:Real at REPL[70]:1
Stacktrace:
 [1] top-level scope at REPL[74]:1

julia> g(1, Lebesgue(String))
ERROR: MethodError: no method matching g(::Int64, ::Lebesgue{String})
Closest candidates are:
  g(::X, ::Lebesgue{Any}) where X<:Real at REPL[70]:1
Stacktrace:
 [1] top-level scope at REPL[75]:1

julia> g(1, Lebesgue(Any))
()
```

probably due to it treating the inner … :

```julia
julia> XX = TypeVar(:XX)
XX

julia> Vector{XX}
Array{XX,1}

julia> eltype(Vector{XX})
Any
```

I think this is not how Julia is supposed to work; I filed an issue.
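The collapse to `Any` is consistent with how `eltype` behaves whenever its element-type parameter is unbound. A Base-only comparison:

```julia
# eltype can only report an element type when the parameter is bound;
# with a free parameter it falls back to Any:
eltype(Vector{Int})   # Int64
eltype(Vector)        # Any -- the element type is a free variable
# The macro evaluated eltype eagerly on a type still containing the
# free variable, which is why the generated signatures dispatch on
# Lebesgue{Any} instead of the intended Lebesgue{X}.
```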
Anyway, I find that the syntax is doing too much at the same time, and I don't like the free variable. What about this:

```julia
# @measure Normal{X} ≃ (1/sqrt2π) * Lebesgue(X)
struct Normal{X, P} <: MeasureTheory.AbstractMeasure{X}
    par::P
end

function Normal(nt::NamedTuple)
    P = typeof(nt)
    return Normal{peltype(Normal, P), P}(nt)
end

basemeasure(::Normal{X}) where {X} = (1/sqrt2π) * Lebesgue(X)

peltype(::Type{Normal}, ::Type{P}) where {P} = error("No eltype defined for parametrization with $P")

# @parametrization Normal(μ, σ)::Normal{Real}
Normal(μ, σ) = Normal(; μ, σ)
peltype(::Type{Normal}, ::Type{<:NamedTuple{(:μ, :σ)}}) = Real

# we can also have complex constraints for some parametrizations
# @parametrization Normal(a::T, b::T)::Normal{T} where {T<:Real}
Normal(a::T, b::T) where {T<:Real} = Normal(; a, b)
peltype(::Type{Normal}, ::Type{<:NamedTuple{(:a, :b), Tuple{T, T}}}) where {T<:Real} = T
```

In essence, we have to define, for each measure, a function that gives us the …
Hm, I like to avoid the …
I think I like it. I mean, it looks good, but there seem to be so many corner cases with this stuff. Let's try it and see how it goes. The … I notice you leave out the …
Yes, that's right. But having it statically is important, since then we can dispatch on it. This is a big pain point in Distributions.jl.
Dispatching on …

It would be nice to have a bit of parallel infrastructure for those who are Measures but cannot inherit …
@cscherrer Yes, I left them out for now, because they seem to be the complicated part. But I think the following is really getting readable:

```julia
(mu::Normal{X} ≪ nu::AbstractMeasure) where {X} = typeof(nu) <: typeof((1/sqrt2π) * Lebesgue(X))
```

Although I lack intuition whether …

```julia
let X = Any
    @assert Core.Compiler.isconcretetype(typeof((1/sqrt2π) * Lebesgue(X)))
end
```

And I was thinking whether a separate type would be appropriate for name-order canonicalization and the design of the trait for sample types:

```julia
Normal(; kwargs...) = Normal(Parametrization(; kwargs...))

sampletype(::Normal, p::Parametrization) = error("No sample type defined for parametrization with $(typeof(p))")

# @parametrization Normal(μ, σ)::Normal{Real}
Normal(μ, σ) = Normal(; μ, σ)
sampletype(::Normal, ::Parametrization{(:μ, :σ), Tuple{Any, Any}}) = Real

# @parametrization Normal(; a::T, b::T)::Normal{T} where {T<:Real}
sampletype(::Normal{T}, ::Parametrization{(:a, :b), Tuple{T, T}}) where {T<:Real} = T
```

where the …
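Name-order canonicalization for such a `Parametrization` type could be as simple as sorting the keys. A sketch on a plain `NamedTuple` (`canonicalize` is a hypothetical helper, not part of MeasureTheory):

```julia
# Sort the NamedTuple keys so that keyword order doesn't change the type:
# Normal(σ=0.5, μ=0.1) and Normal(μ=0.1, σ=0.5) would then share a type.
function canonicalize(nt::NamedTuple)
    ks = Tuple(sort(collect(keys(nt))))
    return NamedTuple{ks}(nt)
end

canonicalize((σ = 0.5, μ = 0.1))  # (μ = 0.1, σ = 0.5)
```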
This does look really nice, but I think it's missing a lot. The `typeof(nu) <: typeof((1/sqrt2π) * Lebesgue(X))` implies `(mu::Normal{X} ≪ nu::AbstractMeasure)` … In general, `(mu::Normal{X} ≪ nu::AbstractMeasure) where {X} = Lebesgue(X) ≪ nu` … I guess we would get to …
Is a …?
How about

```julia
function representative(μ)
    # Check if we're done
    isprimitive(μ) && return μ
    ν = baseMeasure(μ)
    # Make sure not to leave the equivalence class
    ν ≪ μ || return μ
    # Fall back on a recursive call
    return representative(ν)
end

≪(μ, ν) = representative(μ) ≪ representative(ν)
```

We could have some measures like Lebesgue considered "primitive" (not defined in terms of another measure).
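The recursion can be exercised standalone with toy stand-ins for primitive and scaled measures (all names below are hypothetical; the `ν ≪ μ` guard is dropped since the toy types define no `≪`):

```julia
# Toy types: a "primitive" measure and one defined via a base measure.
struct PrimitiveMeasure
    name::Symbol
end
struct ScaledMeasure
    logweight::Float64
    base
end

isprimitive(::PrimitiveMeasure) = true
isprimitive(::ScaledMeasure) = false
baseMeasure(m::ScaledMeasure) = m.base

# Walk the base-measure chain until we hit a primitive measure.
function representative(μ)
    isprimitive(μ) && return μ
    return representative(baseMeasure(μ))
end

leb = PrimitiveMeasure(:Lebesgue)
representative(ScaledMeasure(-0.92, leb)) === leb  # true
```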
Yes, I thought it might be a good idea to distinguish a "canonicalized parametrization" from an arbitrary parametrization, allowing certain special operations to be provided by default, and stronger assumptions to be made. E.g., one could have

```julia
# in the struct, var"#par"::P
const PAR_FIELD = Symbol("#par")

function getproperty(m::Normal{<:Any, <:Parametrization}, name)
    par = getfield(m, PAR_FIELD)
    name == PAR_FIELD ? par : getproperty(par, name)
end
```

which I wouldn't do for arbitrary types, but only if we know that the parametrization was set up in the special way given through the macro. And if we do this, I wouldn't assume …
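The forwarding trick can be tried standalone with a toy wrapper (hypothetical names, a plain `NamedTuple` standing in for `Parametrization`):

```julia
# A wrapper whose only field has the non-identifier name "#par",
# so user code can't collide with it accidentally.
struct Wrapper{P}
    var"#par"::P
end
const PAR_FIELD = Symbol("#par")

# Forward property access to the wrapped parametrization.
function Base.getproperty(w::Wrapper, name::Symbol)
    par = getfield(w, PAR_FIELD)
    name === PAR_FIELD ? par : getproperty(par, name)
end

w = Wrapper((μ = 0.1, σ = 0.5))
w.μ  # 0.1
```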
Yes, kind of --

```julia
function Normal(p::Parametrization)
    P = typeof(p)
    return Normal{sampletype(Normal, P), P}(p)
end
```

which I forgot to include above. This constructor prevents the user from constructing arbitrary combinations between … Maybe it should rather be written in trait style; I was just copying the approach as written by you before, because I liked it, but thought that …
Thanks @phipsgabler, I think I understand now. One possible consideration in the … I agree …
In …
I think it's close to working, but...
[1] and [2] here are …
Any suggestions on getting this to work properly? Here's the code the macro currently generates: