In probability theory and information theory, the Kullback–Leibler divergence (also information divergence, information gain, relative entropy, KLIC, or KL divergence) is a non-symmetric measure of the difference between two probability distributions P and Q.
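For two discrete distributions P and Q over the same support, the standard definition is D_KL(P ∥ Q) = Σₓ P(x) log(P(x)/Q(x)). The short sketch below (a minimal illustration, not from the source; the function name `kl_divergence` and the example distributions are assumptions) computes this sum with NumPy and demonstrates the non-symmetry mentioned above: D_KL(P ∥ Q) generally differs from D_KL(Q ∥ P).

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(P || Q) for discrete distributions.

    p, q: probability vectors over the same finite support; both should sum
    to 1, and q must be strictly positive wherever p is positive.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Non-symmetry: swapping the arguments changes the value.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ~0.5108 nats
print(kl_divergence(q, p))  # ~0.3681 nats
```

Using the natural logarithm gives the divergence in nats; replacing `np.log` with `np.log2` would give it in bits.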