Hi, have you tried your cross-entropy loss? #1
I'm not sure I understand you correctly. Do you mean the term H(Ps, Pt)? All I want is kl_loss(Ps, Pt) = H(Ps, Pt) - H(Ps), so I try to get it from two parts: H(Ps) = ln(s) + 2, the closed-form entropy of a logistic distribution with scale s; and for the H(Ps, Pt) term, the cross entropy between the teacher and student distributions, for an x sampled from the student I can get Ps, and feeding that x into the teacher gives y_hat, so log_Pt = discretized_mix_logistic_loss(y_hat, x) (maybe the original y) once we specify the parameters.
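If it helps, here is a minimal sketch of how I mean that computation (PyTorch). The import path, function signature, and tensor shapes are assumptions for illustration, not necessarily this repo's exact API:

```python
import torch
# Assumed import path: in r9y9/wavenet_vocoder this loss lives in
# wavenet_vocoder/mixture.py; adjust to wherever it is defined here.
from wavenet_vocoder.mixture import discretized_mix_logistic_loss


def student_kl_loss(log_s_student, x_sample, y_hat_teacher):
    """Monte Carlo estimate of KL(Ps || Pt) = H(Ps, Pt) - H(Ps).

    log_s_student: log-scales of the student's logistic output, shape (B, T)
    x_sample:      x drawn from the student distribution Ps, shape (B, T, 1)
    y_hat_teacher: teacher output for x_sample (mixture-of-logistics params)
    """
    # H(Ps): the entropy of a logistic distribution with scale s is
    # ln(s) + 2, averaged here over batch and time.
    h_ps = (log_s_student + 2.0).mean()

    # H(Ps, Pt) = -E_{x~Ps}[log Pt(x)]: assuming the repo's
    # discretized_mix_logistic_loss returns the (averaged) negative
    # log-likelihood of x_sample under the teacher's prediction,
    # it is the cross-entropy term directly.
    h_ps_pt = discretized_mix_logistic_loss(y_hat_teacher, x_sample)

    return h_ps_pt - h_ps
```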
@zhf459 OK, I will try it again. Have you got any results?
No, I got an inf loss. Can we chat on QQ or WeChat to discuss the details?
QQ: 3617 28654
Shall we work together? QQ: 29 50 424 15
I really do not understand your code. I think the cross-entropy (KL) loss should be:

loss = -P * log(Q/P) - (1 - P) * log((1 - Q)/(1 - P))

When P = Q, the loss is zero. If instead the loss were only

loss = -P * log(Q/P)

then at P = Q the loss would not be at its minimum.
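A quick standalone numeric check of this (a sketch, not code from this repo) shows the full expression is minimized exactly at Q = P, while the single-term version is not:

```python
import numpy as np

def full_kl(P, Q):
    # Bernoulli KL(P || Q): -P*log(Q/P) - (1-P)*log((1-Q)/(1-P)).
    # Equals zero exactly when Q == P.
    return -P * np.log(Q / P) - (1 - P) * np.log((1 - Q) / (1 - P))

def partial_term(P, Q):
    # Only the first term, -P*log(Q/P); NOT minimized at Q == P.
    return -P * np.log(Q / P)

P = 0.3
Q = np.linspace(0.01, 0.99, 99)

print(Q[np.argmin(full_kl(P, Q))])       # ~0.30 -> minimum at Q == P
print(Q[np.argmin(partial_term(P, Q))])  # 0.99  -> minimum pushed toward Q = 1
```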