KL divergence (Kullback-Leibler divergence), also known as relative entropy, measures how much one probability distribution differs from another.

Discrete case

$D_{KL}(P \parallel Q) = \sum_{i=1}^{n} P_i \log\left(\frac{P_i}{Q_i}\right)$

For example, suppose random variable $X \sim P$ takes the values $1, 2, 3$ with probabilities $[0.2, 0.4, 0.4]$, and random variable $Y \sim Q$ takes the values $1, 2, 3$ with probabilities $[0.4, 0.2, 0.4]$. Then:

$$\begin{aligned} D_{KL}(P \parallel Q) & = 0.2 \times \log\left(\frac{0.2}{0.4}\right) + 0.4 \times \log\left(\frac{0.4}{0.2}\right) + 0.4 \times \log\left(\frac{0.4}{0.4}\right) \\ & \approx 0.2 \times (-0.693) + 0.4 \times 0.693 + 0.4 \times 0 \\ & \approx 0.1386 \end{aligned}$$

Python implementation: for discrete distributions, the KL divergence can be computed with SciPy (scipy.stats.entropy returns the KL divergence, in nats, when a second distribution is passed):

from scipy import stats

# D(P || Q) for the worked example above
P = [0.2, 0.4, 0.4]
Q = [0.4, 0.2, 0.4]
stats.entropy(P, Q)  # 0.13862943611198905

# The further Q is from P, the larger the divergence
P = [0.2, 0.4, 0.4]
Q = [0.5, 0.1, 0.4]
stats.entropy(P, Q)  # 0.3712595980731252

P = [0.2, 0.4, 0.4]
Q = [0.3, 0.3, 0.4]
stats.entropy(P, Q)  # 0.03397980735907956
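As a cross-check, the same value can be reproduced directly from the discrete formula. The kl_div helper below is only an illustrative sketch (not a SciPy function), assuming all probabilities are strictly positive:

import numpy as np

def kl_div(p, q):
    # D(p || q) = sum_i p_i * log(p_i / q_i), in nats; assumes p_i > 0 and q_i > 0
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

kl_div([0.2, 0.4, 0.4], [0.4, 0.2, 0.4])  # ≈ 0.1386, matching stats.entropy(P, Q) above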

Properties of KL divergence:

1. $D_{KL}(P \parallel Q) \geq 0$ (non-negativity)
2. $D_{KL}(P \parallel Q) \ne D_{KL}(Q \parallel P)$ in general (asymmetry; a quick check follows below)
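A quick numerical check of the asymmetry, reusing scipy.stats.entropy and the second pair of distributions from the SciPy examples above:

from scipy import stats

P = [0.2, 0.4, 0.4]
Q = [0.5, 0.1, 0.4]
stats.entropy(P, Q)  # 0.3712595980731252 -> D(P || Q)
stats.entropy(Q, P)  # ≈ 0.3195           -> D(Q || P), a different value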

Continuous case

$D_{KL}(P \parallel Q) = \int_{-\infty}^{+\infty} p(x) \log \frac{p(x)}{q(x)} \, dx$

(I have not used the continuous form much; more to be added later.)
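In the meantime, here is a minimal numerical sketch of the continuous formula, assuming two Gaussians $P = N(0, 1)$ and $Q = N(1, 2^2)$ chosen purely for illustration. The integral is evaluated with scipy.integrate.quad and compared against the well-known closed form for the KL divergence between two Gaussians:

import numpy as np
from scipy import stats
from scipy.integrate import quad

# Illustrative choice: P = N(0, 1), Q = N(1, 2^2)
p = stats.norm(loc=0.0, scale=1.0)
q = stats.norm(loc=1.0, scale=2.0)

def integrand(x):
    # p(x) * log(p(x) / q(x)); using logpdf keeps the tails numerically stable
    return p.pdf(x) * (p.logpdf(x) - q.logpdf(x))

kl_numeric, _ = quad(integrand, -np.inf, np.inf)

# Closed form for Gaussians: log(s2/s1) + (s1^2 + (m1 - m2)^2) / (2 * s2^2) - 1/2
kl_closed = np.log(2.0 / 1.0) + (1.0**2 + (0.0 - 1.0)**2) / (2 * 2.0**2) - 0.5

print(kl_numeric, kl_closed)  # both ≈ 0.443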
