SMC arguments #426
Hi,

As far as I can tell, I can obtain a matrix containing all particles from the resulting chain, but it seems that the particle weights are not returned along with the samples. Is this currently possible? (Sorry if the answer is obvious.)

My other question concerns the resampling method and resampling threshold: is it possible to use any resampling method other than the default (`resampleSystematic`), and to change the threshold used for resampling?

Cheers,
Hi Andrew!

Indeed, the weights are not currently returned. Concerning your 2nd question, the `SMC` algorithm carries `resampler` and `resampler_threshold` fields, so both can be changed.

Cheers,

PS: This should change in the future to be easier to use, cf #413.
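[For later readers: the `sample` code quoted further down in this thread reads `spl.alg.resampler` and `spl.alg.resampler_threshold`, so the `SMC` type carries both fields. The exact constructor signature in this Turing version is an assumption; something along these lines:]

```julia
# Assumed constructor layout (positional arguments are a guess, based only
# on the field names used elsewhere in this thread):
alg = SMC(1000, resampleSystematic, 0.5)   # particles, resampler, threshold
```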
Hi Emile,

thanks for your reply. I will try to modify the sampler accordingly.

I'm wondering if there might be a problem with resampling. I've been playing around with a state space model (using SMC), and it seems that no matter how low I set the resampling threshold, the particles are resampled at the first iteration, which results in all particles having the same value. I have created an example here. Is this intended behaviour?

Cheers,
Hi Andrew,

In `sample`, resampling is only triggered when the effective sample size drops below `resampler_threshold * n_particles`, so I'm not sure why that's happening. You should try to print the effective sample size at each iteration.

May be related to the fact that the particles' weights are all initialised to 0.

Cheers,
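[For reference, a common definition of the effective sample size, sketched here in plain Julia (not Turing's `effectiveSampleSize`): with normalised weights w, ESS = 1 / Σ wᵢ². Note that if all log weights start at 0, the normalised weights are uniform and the ESS equals the number of particles, so resampling should not trigger for any threshold below 1.]

```julia
# Effective sample size from unnormalised log weights (sketch).
function ess(logWs::Vector{Float64})
    w = exp.(logWs .- maximum(logWs))  # shift by the max for numerical stability
    w ./= sum(w)                       # normalise
    return 1 / sum(abs2, w)            # 1 / sum of squared normalised weights
end

ess(zeros(10))  # uniform weights give ESS ≈ N, here ≈ 10
```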
OK, thanks. It'll take me a while to figure out how to do that, I'm pretty new to Julia. Any hints on how to print the effective sample size?

But even if resampling is performed, this shouldn't result in the particles all being descended from one ancestor, should it?
Not necessarily indeed, but it could if you have too few particles and one of them therefore dominates the others (in terms of associated weights). You can use `println` to print values from inside the sampler.

PS: you need to reload the module after modifying it.
If you add a `println` call inside the `while` loop in `sample`, you should see the effective sample size at each iteration.
OK, thanks, that worked. I'll spend some time looking at this.
Hi Emile,

thanks for your help. When is the resampling step actually performed?

And how is it possible that there is only a subset of unique particles after the first iteration, if there is no resampling at the first iteration? I have tried to show what I mean here. This is obviously a silly example, I just don't understand how each of the 10 particles has one of only three unique values if there is no resampling.

Cheers,
The resampling step is performed whenever the effective sample size falls below the threshold.

This seems weird indeed, I'll look into it.

Cheers,
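[For context, the default `resampleSystematic` mentioned earlier can be sketched as follows, in plain Julia rather than Turing's implementation: a single uniform draw is stratified into N evenly spaced points and inverted through the CDF of the normalised weights.]

```julia
# Systematic resampling sketch: positions u/N, (u+1)/N, ..., (u+N-1)/N
# for a single uniform u, inverted through the weight CDF.
function resample_systematic(w::Vector{Float64}, n::Int)
    cdf = cumsum(w ./ sum(w))
    u = rand() / n
    indices = zeros(Int, n)
    i = 1
    for m in 1:n
        while i < length(cdf) && u > cdf[i]
            i += 1
        end
        indices[m] = i       # ancestor index for the m-th new particle
        u += 1 / n
    end
    return indices
end
```

Low-variance resamplers like this one are why a single dominant weight can still leave several copies of one ancestor even after a single resampling step.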
Hi Emile, I still think it looks like something is going wrong at the first iteration. Changing the initialisation of the weights doesn't seem to have any effect.

Cheers,
Hi Emile,

sorry to keep bothering you about this, but I'm having problems with modifying the SMC code so that the weights of all particles are returned for each iteration. Currently, the sampler only returns the samples themselves.

The reason for doing this is that I want to do a model comparison at each iteration during a run of a particle filter. I realise that this may not really be the intended usage of the SMC algorithm, but are there any guidelines on how to proceed?

Best wishes,
Hi Andrew,

Something seems weird indeed. In `ParticleContainer`, the current log weights are stored in the `logWs` field.

Then at each iteration the weights are computed by `Base.consume` via `increase_logweight`.

Please tell me if that does not help enough.

Cheers,
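[For reference, normalising log weights is usually done with the log-sum-exp trick, which also yields the log normalisation constant used for the evidence update (the `z1`/`z2` values in the `consume` code later in this thread). A sketch of the idea, not Turing's `weights(pc)`:]

```julia
# Log-sum-exp normalisation of log weights (sketch).
function lognormalise(logWs::Vector{Float64})
    m = maximum(logWs)
    z = m + log(sum(exp.(logWs .- m)))  # log ∑ exp(logWs), computed stably
    return logWs .- z, z                # normalised log weights, log-normaliser
end
```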
OK, thanks, I'll try it tonight and let you know.
OK, so far adding this hasn't worked for me.
You could update `ParticleContainer`, `consume` and `sample` with the code below.

In `src/core/container.jl`:

```julia
type ParticleContainer{T<:Particle}
  model :: Function
  num_particles :: Int
  vals  :: Array{T,1}
  logWs :: Array{Float64,1}            # log weights (Trace) or incremental likelihoods (ParticleContainer)
  logWst :: Array{Array{Float64,1},1}  # log weights saved BOTH in time and particle index
  logE  :: Float64                     # log model evidence
  # conditional :: Union{Void,Conditional} # storing parameters, helpful for implementing rejuvenation steps
  conditional :: Void                  # storing parameters, helpful for implementing rejuvenation steps
  n_consume :: Int                     # helpful for rejuvenation steps, e.g. in SMC2
  ParticleContainer{T}(m::Function, n::Int) where {T} =
    new(m, n, Array{Particle,1}(), Array{Float64,1}(), Array{Array{Float64,1},1}(), 0.0, nothing, 0)
end

function Base.consume(pc :: ParticleContainer)
  @assert pc.num_particles == length(pc)
  # normalisation factor: 1/N
  _, z1 = weights(pc)
  n = length(pc.vals)
  particles = collect(pc)
  num_done = 0
  logWst = zeros(Float64, n)
  for i = 1:n
    p = pc.vals[i]
    score = consume(p)
    score = isa(score, ForwardDiff.Dual) ? realpart(score) : score
    if isa(score, Real)
      score += isa(getlogp(p.vi), ForwardDiff.Dual) ? realpart(getlogp(p.vi)) : getlogp(p.vi)
      resetlogp!(p.vi)
      increase_logweight(pc, i, Float64(score))
      logWst[i] = Float64(score)
    elseif score == Val{:done}
      num_done += 1
    else
      error("[consume]: error in running particle filter.")
    end
  end
  push!(pc.logWst, logWst)
  if num_done == length(pc)
    res = Val{:done}
  elseif num_done != 0
    error("[consume]: mis-aligned execution traces, num_particles=$(n), num_done=$(num_done).")
  else
    # update incremental likelihoods
    _, z2 = weights(pc)
    res = increase_logevidence(pc, z2 - z1)
    pc.n_consume += 1
    # res = increase_loglikelihood(pc, z2 - z1)
  end
  res
end
```

In `src/samplers/smc.jl`:

```julia
function sample(model::Function, alg::SMC)
  spl = Sampler(alg)
  particles = ParticleContainer{Trace}(model)
  push!(particles, spl.alg.n_particles, spl, VarInfo())
  while consume(particles) != Val{:done}
    ess = effectiveSampleSize(particles)
    if ess <= spl.alg.resampler_threshold * length(particles)
      resample!(particles, spl.alg.resampler)
    end
  end
  w, samples = getsample(particles)
  res = Chain(w, samples)
  return res, particles.logWst
end
```

Then run the sampler and get the results:

```julia
chain, logWst = sample(StateSpaceModel_1(y, true), SMC(100))
```

Hope that this helps you!
Thank you! I guess these are the unnormalised weights, right? The output (`logWst`) is a vector of vectors, but its length is equal to `n_iterations + 1`, with the final weights vector consisting of zeros.
They are unnormalised indeed. Moving the `push!` into the `else` branch avoids storing that final vector of zeros:

```julia
function Base.consume(pc :: ParticleContainer)
  @assert pc.num_particles == length(pc)
  # normalisation factor: 1/N
  _, z1 = weights(pc)
  n = length(pc.vals)
  particles = collect(pc)
  num_done = 0
  logWst = zeros(Float64, n)
  for i = 1:n
    p = pc.vals[i]
    score = consume(p)
    score = isa(score, ForwardDiff.Dual) ? realpart(score) : score
    if isa(score, Real)
      score += isa(getlogp(p.vi), ForwardDiff.Dual) ? realpart(getlogp(p.vi)) : getlogp(p.vi)
      resetlogp!(p.vi)
      increase_logweight(pc, i, Float64(score))
      logWst[i] = Float64(score)
    elseif score == Val{:done}
      num_done += 1
    else
      error("[consume]: error in running particle filter.")
    end
  end
  if num_done == length(pc)
    res = Val{:done}
  elseif num_done != 0
    error("[consume]: mis-aligned execution traces, num_particles=$(n), num_done=$(num_done).")
  else
    # update incremental likelihoods
    push!(pc.logWst, logWst)
    _, z2 = weights(pc)
    res = increase_logevidence(pc, z2 - z1)
    pc.n_consume += 1
    # res = increase_loglikelihood(pc, z2 - z1)
  end
  res
end
```
Thank you very much!
Hey! Did you eventually succeed, @awellis?
@emilemathieu
@emilemathieu got it working in Julia 1.0 as well. I forked the Turing.jl repository. I guess there must be a more elegant way of saving the weights, though, without hacking the `ParticleContainer` internals.
@awellis would it be sufficient to store statistics on the weights? We could try to provide these in the resulting Chain object for particle-based methods by default. Storing all particle weights for each iteration, however, sounds rather costly.
@trappmartin could it be optional and set to `false` by default?
If it's not necessary, I would prefer to only store statistics on the particle weights. Are there cases, apart from debugging, where it would be necessary to return the particles and their weights for each iteration?
@trappmartin @emilemathieu I think the history of the log evidence (i.e. the sum of the log weights for each iteration) would actually be sufficient.
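[If only the evidence history is needed, it can be recovered from per-iteration log weights without keeping them around: the increment at each iteration is the log of the mean unnormalised weight, which is what the `z2 - z1` update in `consume` accumulates. A sketch under that assumption; `logWst_history` is a hypothetical name for the per-iteration log-weight vectors:]

```julia
# Sketch: per-iteration log-evidence increments from log-weight vectors.
increments = Float64[]
for logWs in logWst_history                    # hypothetical: one vector per iteration
    m = maximum(logWs)
    push!(increments, m + log(sum(exp.(logWs .- m)) / length(logWs)))  # log mean weight
end
logE_history = cumsum(increments)              # log evidence after each iteration
```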
What about @awellis's case?
I am using the weights so that I can visualize the SMC performance
I can see the benefit, but I'm still not convinced, as storing weights for a few thousand particles over a reasonable number of iterations would easily generate a large memory overhead, even with 16-bit precision. And if I understand it right, visualizing statistics on the weights would actually already be quite informative. @awellis is this correct? I'll make a new issue for this as it seems to be a feature many people could be interested in.
@emilemathieu @trappmartin visualizing statistics on the weights would be fine for me.
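[A sketch of the kind of per-iteration summary statistics discussed here, computed from one iteration's log weights; names are illustrative, not a Turing API:]

```julia
# Cheap, informative summaries of one iteration's log weights.
function weight_stats(logWs::Vector{Float64})
    m = maximum(logWs)
    w = exp.(logWs .- m)
    w ./= sum(w)                  # normalised weights
    (ess  = 1 / sum(abs2, w),     # effective sample size
     wmax = maximum(w))           # weight of the dominant particle
end
```

Storing one such tuple per iteration costs O(iterations) rather than O(iterations × particles).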
Sorry if the answer to this question is obvious, but can the SMC function take other arguments than the number of particles, e.g. resampling threshold/method? Also, is it possible to retrieve the weights from the SMC sampler, along with the samples?