Calculate the Kullback-Leibler divergence between two probability distributions. D_KL(Q || P) measures how much information is lost when P is used to approximate Q. The result is always non-negative, with 0 indicating identical distributions.
Value
Numeric value representing D_KL(Q || P), always >= 0. Returns 0 when distributions are identical.
Details
The KL divergence is calculated as: D_KL(Q || P) = sum(q * log(q / p)), where q and p are the probabilities assigned to each outcome by Q and P, respectively.
Note that KL divergence is asymmetric: D_KL(Q || P) != D_KL(P || Q). When q[i] > 0 but p[i] = 0, the divergence is infinite. This implementation therefore requires every element of p to be positive wherever the corresponding element of q is positive; terms where q[i] = 0 contribute nothing to the sum.
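The formula and edge cases above can be sketched as follows. This is a minimal illustration of the definition, not the package's actual implementation; the function name `kl_divergence` is hypothetical.

```python
import math

def kl_divergence(q, p):
    """Sketch of D_KL(Q || P) = sum(q * log(q / p)).

    Hypothetical helper illustrating the Details section,
    not this package's actual code.
    """
    if len(q) != len(p):
        raise ValueError("q and p must have the same length")
    total = 0.0
    for qi, pi in zip(q, p):
        if qi > 0:
            if pi <= 0:
                # q[i] > 0 but p[i] = 0: divergence is infinite
                return math.inf
            total += qi * math.log(qi / pi)
        # terms with q[i] = 0 contribute nothing
    return total
```

For identical distributions the result is 0, and swapping the arguments generally changes the value, demonstrating the asymmetry noted above.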