
Particle Gibbs does not return log probability #643

Closed
colehurwitz opened this issue Jan 8, 2019 · 7 comments

Comments

@colehurwitz

Would it be possible to return the log probability when using PG?

@mohamed82008
Member

If you have usage questions, please post a minimal working example. These questions are also more suited to the Julia slack (https://julialang.slack.com) channel #turing.

@cpfiffer
Member

cpfiffer commented Jan 9, 2019

@colehurwitz31 You should be able to extract the log probability from the chain for each sample after you've run it using chain[:lp], with chain being the variable containing the value returned from sample. Does that work for you?

@xukai92
Member

xukai92 commented Jan 9, 2019

@mohamed82008 @cpfiffer Sorry for the confusion - I was helping @colehurwitz31 with Turing and found this issue, so I asked him to open an issue about it.

Yes, PG indeed doesn't report the log-joint in the chain correctly - all :lp values in the chain are 0s.

using Turing

@model gdemo(x, y) = begin
    s ~ InverseGamma(10,3)
    m ~ Normal(0,sqrt(s))
    x ~ Normal(m, sqrt(s))
    y ~ Normal(m, sqrt(s))
    return s, m
end

x, y = 1.5, 2.0

chn = sample(gdemo(x, y), PG(20, 500))
chn[:lp] # => all 0s

@xukai92
Member

xukai92 commented Jan 9, 2019

Because PG doesn't track the log-joint, there is no way to extract it directly - is my understanding correct? @yebai

I guess a simple solution is to evaluate the log-joint after each MCMC step, though it adds some more computation (of course we should have an interface to make it optional). Otherwise, we should provide some utility functions to re-evaluate the log-joint for each sample in the MCMC chain.
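The second option - re-evaluating the log-joint per stored sample - could look roughly like the sketch below. It is a hypothetical illustration for the gdemo model above, not Turing's API: the logpdf formulas are written out by hand (up to an additive constant for the InverseGamma term) so the sketch is self-contained; in practice one would call logpdf from Distributions.jl on the model's distributions instead.

```julia
# Hypothetical sketch: recompute the log-joint for each stored (s, m) sample
# after sampling, instead of relying on PG to track it during sampling.

# log density of Normal(mu, sigma) at x
normal_lpdf(x, mu, sigma) = -0.5 * log(2pi) - log(sigma) - 0.5 * ((x - mu) / sigma)^2

# log density of InverseGamma(alpha, beta) at s, up to the -loggamma(alpha)
# constant, which is identical for every sample and so cancels in comparisons
invgamma_lpdf_unnorm(s, alpha, beta) = alpha * log(beta) - (alpha + 1) * log(s) - beta / s

# log-joint of the gdemo model above, up to an additive constant
function logjoint(s, m, x, y)
    invgamma_lpdf_unnorm(s, 10.0, 3.0) +
        normal_lpdf(m, 0.0, sqrt(s)) +
        normal_lpdf(x, m, sqrt(s)) +
        normal_lpdf(y, m, sqrt(s))
end

x, y = 1.5, 2.0
# e.g. for hypothetical (s, m) samples pulled out of the chain:
samples = [(1.0, 0.5), (0.8, 1.0)]
lps = [logjoint(s, m, x, y) for (s, m) in samples]
```

This is exactly the "extra call to runmodel" cost mentioned below: one full model evaluation per stored sample.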

@yebai
Member

yebai commented Jan 9, 2019

Because PG doesn't track the log-joint, there is no way to extract it directly - is my understanding correct? @yebai

That's right - the weight associated with each particle is reset to 1 after each resampling step. I think this feature has been brought up in the past; see related issues: https://github.com/TuringLang/Turing.jl/issues/493 #426. It's actually really simple to support this feature using the API from https://github.com/TuringLang/Turing.jl/issues/634:

logp(model, vi)

However, if we accumulate log weights and store them somewhere, we can avoid an extra call to runmodel (or logp(model, vi) using the new API from https://github.com/TuringLang/Turing.jl/issues/634).

@xukai92
Member

xukai92 commented Jan 10, 2019

However, if we accumulate log weights and store them somewhere, ...

So it seems to me we'd add another field to the ParticleContainer, right?
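The idea of an extra accumulator field could be sketched like this. All names here (Particle, logweight_acc, reweight!, reset_weights!) are illustrative stand-ins, not Turing's actual ParticleContainer internals: the point is only that a second log-weight field keeps accumulating across resampling steps while the ordinary weight gets reset.

```julia
# Hypothetical sketch: a per-particle accumulated log-weight that survives
# resampling, alongside the usual weight that is reset to 1 (log-weight 0).

mutable struct Particle
    value::Float64          # stand-in for the particle's trace
    logweight::Float64      # reset after each resampling step
    logweight_acc::Float64  # running total, deliberately never reset
end

function reweight!(p::Particle, logw::Float64)
    p.logweight += logw
    p.logweight_acc += logw  # the extra field keeps the running total
end

function reset_weights!(particles::Vector{Particle})
    for p in particles
        p.logweight = 0.0    # the usual post-resampling reset
        # logweight_acc is left untouched
    end
end
```

With something like this, the accumulated value is available at the end of sampling without the extra per-sample model re-evaluation discussed above.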

@yebai
Member

yebai commented Nov 12, 2022

Likely out of date due to the effort of separating particle MCMC and SMC samplers into https://github.com/TuringLang/AdvancedPS.jl

@yebai yebai closed this as completed Nov 12, 2022