$H(X)$: Shannon entropy of $X$; i.e., the average information required to describe a random variable $X$.
$H(X, Y)$: Joint entropy of $X$ and $Y$; i.e., the average information required to describe both $X$ and $Y$.
$H(Y \mid X)$: Conditional entropy of $Y$ given $X$; i.e., the average information required to describe a random variable $Y$ given that we already know $X$.
$I(X; Y)$: Mutual information of $X$ and $Y$; i.e., the amount of information that each variable encodes about the other. (Mutual information is symmetric.)
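As a concrete illustration of how these four quantities fit together, here is a minimal Python sketch. The joint distribution `p_xy` is made up for the example; the conditional entropy and mutual information are computed via the identities $H(Y \mid X) = H(X, Y) - H(X)$ and $I(X; Y) = H(X) + H(Y) - H(X, Y)$.

```python
import numpy as np

# Hypothetical joint distribution of two binary variables X and Y,
# chosen only to illustrate the definitions above.
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])   # rows index X, columns index Y

p_x = p_xy.sum(axis=1)          # marginal distribution of X
p_y = p_xy.sum(axis=0)          # marginal distribution of Y

def entropy(p):
    """Shannon entropy in bits: -sum p log2 p, skipping zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X  = entropy(p_x)             # H(X)
H_Y  = entropy(p_y)             # H(Y)
H_XY = entropy(p_xy.ravel())    # H(X, Y)
H_Y_given_X = H_XY - H_X        # H(Y | X), via the chain rule
I_XY = H_X + H_Y - H_XY         # I(X; Y), symmetric in X and Y

print(f"H(X)     = {H_X:.3f} bits")
print(f"H(X, Y)  = {H_XY:.3f} bits")
print(f"H(Y | X) = {H_Y_given_X:.3f} bits")
print(f"I(X; Y)  = {I_XY:.3f} bits")
```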
Measures about distributions
$H(p, q)$: Cross-entropy of $p$ and $q$; i.e., the average information density of $q$ over the distribution of $p$.
$D_{\mathrm{KL}}(p \parallel q)$, and sometimes $D_{\mathrm{KL}}(p, q)$: The Kullback-Leibler divergence of $p$ and $q$; i.e., a (non-symmetric) measure of distance between the two distributions. The double-bar notation is preferred because it conveys this asymmetry.
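A similar sketch for the distribution-level measures, with two made-up distributions `p` and `q` over the same three outcomes. It also checks the identity $H(p, q) = H(p) + D_{\mathrm{KL}}(p \parallel q)$ and shows that swapping the arguments of the KL divergence changes its value.

```python
import numpy as np

# Hypothetical distributions, used only to illustrate the definitions above.
p = np.array([0.5, 0.3, 0.2])   # "true" distribution
q = np.array([0.4, 0.4, 0.2])   # approximating / model distribution

H_p  = -np.sum(p * np.log2(p))        # H(p)
H_pq = -np.sum(p * np.log2(q))        # cross-entropy H(p, q)
D_pq = np.sum(p * np.log2(p / q))     # D_KL(p || q)
D_qp = np.sum(q * np.log2(q / p))     # D_KL(q || p)

print(f"H(p)         = {H_p:.4f} bits")
print(f"H(p, q)      = {H_pq:.4f} bits")   # equals H(p) + D_KL(p || q)
print(f"D_KL(p || q) = {D_pq:.4f} bits")
print(f"D_KL(q || p) = {D_qp:.4f} bits")   # differs: KL is not symmetric
```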