Contents
Information Theory, 1948
Entropy
Information
Entropy
Properties
Differential entropy
Joint Entropy and Conditional Entropy
Relative entropy / KL divergence / KL distance
Mutual information
Chain rules for entropy, relative entropy, and mutual information
Information inequality
Information Theory, 1948
- Shannon, Claude Elwood. "A Mathematical Theory of Communication." The Bell System Technical Journal 27.3 (1948): 379-423. Reprinted in ACM SIGMOBILE Mobile Computing and Communications Review 5.1 (2001): 3-55.
Entropy
Information
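A minimal sketch of the usual starting point, stated in my own notation (X, p(x), and the base-2 logarithm are assumptions, not taken from the post):

```latex
% Self-information of a single outcome x of a discrete random variable X
% (standard definition; base-2 logs give bits, natural logs give nats)
I(x) = \log_2 \frac{1}{p(x)} = -\log_2 p(x)
```

Rarer outcomes carry more information: an outcome with probability 1/8 contributes -log2(1/8) = 3 bits.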
Entropy
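Shannon entropy is the expected self-information; a sketch in the same assumed notation, with the usual convention that 0 log 0 = 0:

```latex
% Shannon entropy of a discrete random variable X with pmf p(x)
H(X) = -\sum_{x \in \mathcal{X}} p(x) \log_2 p(x) = \mathbb{E}\left[-\log_2 p(X)\right]
```

For a fair coin this gives H = 1 bit; for a coin with heads probability 0.9 it drops to roughly 0.47 bits.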
Properties
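A few properties that hold for discrete entropy; this particular selection is mine and may not match the list the post intended:

```latex
% Standard properties of discrete entropy (selection is mine)
\begin{aligned}
  & H(X) \ge 0, \\
  & H(X) \le \log_2 |\mathcal{X}|, \ \text{with equality iff } X \text{ is uniform on } \mathcal{X}, \\
  & H(X \mid Y) \le H(X) \quad \text{(conditioning never increases entropy)}.
\end{aligned}
```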
Differential entropy
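For a continuous random variable with density f, the standard definition replaces the sum with an integral (natural logs, i.e. nats, are the common convention here):

```latex
% Differential entropy of a continuous random variable with density f
h(X) = -\int f(x) \log f(x) \, dx
```

Unlike discrete entropy, h(X) can be negative: a Gaussian with variance \sigma^2 has h(X) = \tfrac{1}{2}\log(2\pi e \sigma^2), which is negative once \sigma^2 < 1/(2\pi e).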
Joint Entropy and Conditional Entropy
Conditional Entropy
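The usual definition averages the entropy of Y over the values of X (notation assumed, as above):

```latex
% Conditional entropy: remaining uncertainty about Y once X is known
H(Y \mid X) = \sum_{x} p(x)\, H(Y \mid X = x)
            = -\sum_{x}\sum_{y} p(x, y) \log_2 p(y \mid x)
```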
Joint Entropy
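Joint entropy treats the pair (X, Y) as a single random variable:

```latex
% Joint entropy of the pair (X, Y)
H(X, Y) = -\sum_{x}\sum_{y} p(x, y) \log_2 p(x, y)
```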
Chain Rule
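The chain rule ties the three quantities together; the n-variable form is the standard generalization:

```latex
% Chain rule for entropy
H(X, Y) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y)
% General form
H(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1})
```

A small numerical check of these definitions, sketched in Python; the joint distribution below is an arbitrary example of my own, not one from the post:

```python
import numpy as np

# Hypothetical 2x3 joint distribution p(x, y), for illustration only.
p_xy = np.array([[0.10, 0.25, 0.15],
                 [0.30, 0.05, 0.15]])

def H(p):
    """Shannon entropy in bits of a probability array (0 log 0 treated as 0)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_x = p_xy.sum(axis=1)          # marginal distribution of X (sum over y)
H_joint = H(p_xy)               # H(X, Y)
H_x = H(p_x)                    # H(X)
# H(Y | X) = sum_x p(x) * H(Y | X = x): average the entropy of each row.
H_y_given_x = float(sum(px * H(row / px) for px, row in zip(p_x, p_xy)))

# Chain rule: H(X, Y) = H(X) + H(Y | X)
print(H_joint, H_x + H_y_given_x)
assert np.isclose(H_joint, H_x + H_y_given_x)
```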