
# Kullback–Leibler divergence

2022-02-11

Tags: #Math/Probability #DeepLearning

Figure: KL divergence illustrated with two Gaussian distributions. [^1]

- Wikipedia: In mathematical statistics, the Kullback–Leibler divergence, $D_{KL}(P \parallel Q)$ (also called relative entropy), is a statistical distance: a measure of how one probability distribution $P$ is different from a second, reference probability distribution $Q$.

Video: Intuitively Understanding the KL Divergence. [^2]

# Formula
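
Following the definition in the Wikipedia article quoted above, for discrete distributions $P$ and $Q$ defined on the same sample space $\mathcal{X}$ (the symbols $\mathcal{X}$, $p$, $q$ are introduced here for exposition):

$$D_{KL}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}$$

For continuous distributions with densities $p$ and $q$, the sum becomes an integral:

$$D_{KL}(P \parallel Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx$$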

# Properties [^3]
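
The standard properties (see the relative entropy article cited below):

- Non-negativity: $D_{KL}(P \parallel Q) \ge 0$, with equality if and only if $P = Q$ almost everywhere (Gibbs' inequality).
- Asymmetry: in general $D_{KL}(P \parallel Q) \ne D_{KL}(Q \parallel P)$, and the triangle inequality does not hold, so the KL divergence is a statistical distance but not a metric.

A minimal numerical sketch of both properties (assuming NumPy; the distributions `p` and `q` below are arbitrary examples chosen for illustration):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(p || q) = sum_i p_i * log(p_i / q_i).

    Assumes p and q are probability vectors over the same support and that
    q_i > 0 wherever p_i > 0 (otherwise the divergence is infinite).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Two example distributions over the same three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # >= 0 by Gibbs' inequality (about 0.025 here)
print(kl_divergence(q, p))  # a different value: the divergence is not symmetric
print(kl_divergence(p, p))  # exactly 0 when the two distributions coincide
```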


[^1]: https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence#/media/File:KL-Gauss-Example.png

[^2]: Intuitively Understanding the KL Divergence - YouTube

[^3]: 相对熵 (Relative entropy) - Wikipedia, the free encyclopedia