pymc3 multinomial mixture gets stuck #3069
You are probably better off reporting the NUTS issues, and getting that to work. Metropolis won't work as well, and is liable to get stuck, as it appears to be doing.
@fonnesbeck I get a "Bad initial energy" error. I also printed the logp values.
Just let it do the default initialization. So, don't manually assign a step method. Also, you don't need 10k iterations.
@fonnesbeck It still doesn't work. I have pushed a notebook. However, since we last chatted I was able to make it work using a Categorical distribution instead of a Multinomial distribution. Note, I was using Multinomial with n=1, so I could substitute it with a Categorical distribution. Any idea why one works and the other does not?
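The equivalence this switch relies on can be sketched with plain NumPy (the probabilities below are made up for illustration): a Multinomial draw with n=1 is a one-hot encoding of a Categorical draw, so the two parameterizations assign the same log-likelihood to the same event.

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.3, 0.2, 0.5])

# A Multinomial draw with n=1 is a one-hot vector ...
one_hot = rng.multinomial(1, p)
# ... whose argmax is the equivalent Categorical index.
category = int(np.argmax(one_hot))

# Both carry the same information, and the log-likelihoods agree:
# log P(one_hot | Multinomial(1, p)) == log p[category]
assert np.isclose(np.log(p[category]), np.sum(one_hot * np.log(p)))
```

So when all entries of p are strictly positive the two are interchangeable; as the later comments show, the difference in this issue came down to how a zero entry in p was handled.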
@antjoseme did you ever resolve this? I am working on something similar and any insight would be greatly valued. The switch from Multinomial to Categorical is interesting, but the notebooks linked are no longer there. Could you repost them?
Hi @breadbaron the notebooks should be available now
@antjoseme thanks!
This was accidentally fixed by #3059. It had to do with the last row of the `state_label_tran` matrix:

```python
import numpy as np
import pymc3 as pm

state_label_tran = np.array([
    [0.3, 0.2, 0.5],
    [0.1, 0.5, 0.4],
    [0.0, 0.05, 0.95],  # Would work fine without this row
])
shape = len(state_label_tran)

with pm.Model() as m:
    label_dist = [
        pm.Multinomial.dist(p=state_label_tran[i], n=1, shape=shape)
        for i in range(len(state_label_tran))
    ]
    label = pm.Mixture('label', w=np.ones(shape)/shape,
                       comp_dists=label_dist, observed=[0, 1, 0])
    print(label.logp())  # nan before PR #3059, and -1.386... afterwards
```

In fact it had nothing to do with the mixture, but with the Multinomial:

```python
with pm.Model() as m:
    label = pm.Multinomial('m', n=1, p=[0, 0.05, 0.95], observed=[0, 1, 0])
    print(label.logp())  # nan before, and -2.9957 afterwards
```

None of our current Multinomial tests check for a probability vector that includes zeros (except for
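The nan can be reproduced with plain NumPy/SciPy arithmetic (a sketch of the failure mode, not PyMC3's actual code path): the offending term is 0 * log(0), which floating-point arithmetic evaluates to nan, whereas the convention x*log(y) = 0 when x = 0 (`scipy.special.xlogy`) yields the finite answer quoted above.

```python
import numpy as np
from scipy.special import xlogy
from scipy.stats import multinomial

p = np.array([0.0, 0.05, 0.95])
x = np.array([0, 1, 0])  # the observation from the comment above

with np.errstate(divide='ignore', invalid='ignore'):
    naive = np.sum(x * np.log(p))  # 0 * log(0) -> 0 * -inf -> nan
fixed = np.sum(xlogy(x, p))        # xlogy(0, 0) == 0 by convention

print(naive)  # nan
print(fixed)  # -2.9957... == log(0.05), the value after the fix
print(multinomial.logpmf(x, n=1, p=p))  # -2.9957..., SciPy uses the safe form
```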
Definitely
Closing in favor of #4487
I am trying to use PyMC3 to implement an example where the data comes from a mixture of multinomials. The goal is to infer the underlying state_prob vector (see below). The code runs, but the Metropolis sampler gets stuck at the initial state_prior vector. Also, for some reason I have not been able to get NUTS to work.