CROSS ENTROPY

My main work these days is backend development, but I want to keep my major fresh, so I plan to post about NLP topics as well.
That is why, somewhat out of the blue, I am putting up a post about cross entropy.

This post introduces the concept of cross entropy.
It also touches on perplexity, a closely related concept.

To begin with, cross entropy is a metric defined over two different probability distributions.
It measures how similar (or how far apart) the two distributions are.

Second, cross entropy is the expected value, under the true distribution P, of the information content assigned by a model distribution Q: H(P, Q) = -Σ_x P(x) log Q(x).
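
To make the definition concrete, here is a minimal sketch in Python. The distributions P, q_close, and q_far are made-up example values, not from the post; the point is that a model distribution closer to P yields a lower cross entropy.

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(P, Q) = -sum_x P(x) * log Q(x).

    p, q: lists of probabilities over the same discrete outcomes.
    Uses the natural log, so the result is in nats.
    """
    return -sum(p_x * math.log(q_x) for p_x, q_x in zip(p, q) if p_x > 0)

# Hypothetical true distribution P over three outcomes.
p = [0.7, 0.2, 0.1]

# A model Q close to P scores lower (better) than one far from P.
q_close = [0.6, 0.25, 0.15]
q_far = [0.1, 0.2, 0.7]

print(cross_entropy(p, p))        # ~0.80, the entropy H(P), the lower bound
print(cross_entropy(p, q_close))  # ~0.82, slightly above H(P)
print(cross_entropy(p, q_far))    # ~1.97, much larger
```

Note that H(P, Q) is minimized when Q equals P, at which point it reduces to the entropy H(P); this is why it works as a similarity measure between distributions.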
Perplexity is the exponentiated average negative log probability; equivalently, it is the N-th root of the inverse probability of an N-token test sequence: PPL = exp(-(1/N) Σ_i log Q(x_i)).
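
As a quick illustration, here is a sketch of the perplexity computation, again in Python. The per-token probabilities are hypothetical values, as if a language model had assigned them to a four-token test sequence.

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log probability.

    token_probs: probabilities Q(x_i) a model assigned to each
    token of a test sequence.
    """
    n = len(token_probs)
    avg_neg_log_prob = -sum(math.log(q) for q in token_probs) / n
    return math.exp(avg_neg_log_prob)

# Hypothetical per-token probabilities from a language model.
probs = [0.2, 0.5, 0.1, 0.4]
print(perplexity(probs))  # ~3.98; lower perplexity means a better model
```

Intuitively, a perplexity of about 4 means the model is, on average, as uncertain as if it were choosing uniformly among 4 tokens at each step.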