
Hi, have you tried your cross-entropy loss? #1

Open
jiqizaisikao opened this issue Mar 10, 2018 · 5 comments

@jiqizaisikao

jiqizaisikao commented Mar 10, 2018

I really do not understand your code. I think the cross-entropy (KL) loss should be:

loss = -P*log(Q/P) - (1-P)*log((1-Q)/(1-P))

When P = Q this loss is zero. But if loss = -P*log(Q/P), then when P = Q the loss is not at its minimum.
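
A quick numerical check of this claim (a standalone sketch, not code from this repo; `kl` below is the KL divergence between two Bernoulli distributions with success probabilities P and Q):

```python
import numpy as np

def kl(P, Q):
    # Full expression: -P*log(Q/P) - (1-P)*log((1-Q)/(1-P))
    return -P * np.log(Q / P) - (1 - P) * np.log((1 - Q) / (1 - P))

def partial_term(P, Q):
    # Only the first term, as in the questioned loss
    return -P * np.log(Q / P)

P = 0.3
print(kl(P, P))              # 0.0: zero (the minimum) exactly when Q == P
print(kl(P, 0.5))            # > 0 for any Q != P
print(partial_term(P, P))    # 0.0, but ...
print(partial_term(P, 0.9))  # about -0.33: below zero, so Q == P is not the minimum
```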

@zhf459
Owner

zhf459 commented Mar 12, 2018

I'm not sure I understand you correctly. Do you mean the term H(Ps, Pt)? All I want to compute is kl_loss(Ps, Pt) = H(Ps, Pt) - H(Ps), so I try to get it from two parts. First, H(Ps) = ln(s) + 2. Second, for the H(Ps, Pt) term (the cross entropy between the teacher and student distributions): for an x sampled from the student I can get Ps, and feeding that x into the teacher gives y_hat, so log_Pt = discretized_mix_logistic_loss(y_hat, x) (maybe the original y) if we pass the parameter reduce=False. Finally H(Ps, Pt) = -sum(log_Pt * Ps) - H(Ps) (maybe wrong). The KL loss is the final loss.
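
A minimal PyTorch sketch of this computation, assuming the discretized_mix_logistic_loss from wavenet_vocoder (with reduce=False it returns the unreduced negative log-likelihood); the function kl_loss and its argument names are illustrative, not from this repo, and instead of the -sum(log_Pt * Ps) weighting above it uses a plain Monte-Carlo average over the samples x already drawn from Ps:

```python
# Assumed import: r9y9/wavenet_vocoder exposes this in wavenet_vocoder/mixture.py
from wavenet_vocoder.mixture import discretized_mix_logistic_loss

def kl_loss(x, log_s, teacher_y_hat):
    """Monte-Carlo estimate of KL(Ps || Pt) = H(Ps, Pt) - H(Ps).

    x             -- samples drawn from the student,              shape (B, T, 1)
    log_s         -- log-scales of the student's logistic output, shape (B, T, 1)
    teacher_y_hat -- teacher mixture parameters evaluated on x,   shape (B, C, T)
    """
    # H(Ps): differential entropy of a logistic distribution with scale s is ln(s) + 2
    h_ps = log_s + 2.0
    # reduce=False keeps the per-sample teacher NLL, i.e. -log Pt(x)
    neg_log_pt = discretized_mix_logistic_loss(teacher_y_hat, x, reduce=False)
    # H(Ps, Pt) = -E_{x~Ps}[log Pt(x)], estimated from the student's own samples
    h_ps_pt = neg_log_pt
    # KL(Ps || Pt) = H(Ps, Pt) - H(Ps), averaged over batch and time
    return (h_ps_pt - h_ps).mean()
```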

@jiqizaisikao
Author

@zhf459 OK, I will try it again. Have you gotten any results?

@zhf459
Owner

zhf459 commented Mar 12, 2018

No, I got an inf loss. Can we chat on QQ or WeChat to discuss the details?

@jiqizaisikao
Author

QQ:3617 28654

@neverjoe

Shall we work together? QQ: 29 50 424 15
