Building an intuition for Shannon entropy

Recall that the information content of an event $x$, once it occurs, is $I(x) = -\log p(x)$.
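
For concreteness, here is a minimal Python sketch (using base-2 logarithms, so the unit is bits; the function name is just for illustration):

```python
import math

def information_content(p: float) -> float:
    """Information content (surprisal) of an event with probability p, in bits."""
    return -math.log2(p)

print(information_content(0.5))    # fair coin flip: 1.0 bit
print(information_content(1 / 6))  # one face of a fair die: ~2.585 bits
```

Rare events carry more information: an outcome with probability 1/6 is more surprising, and therefore more informative, than one with probability 1/2.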

Also recall that the expected value of a random variable $X$ is the average of its outcomes, weighted by the probability of each outcome:

$$\mathbb{E}[X] = \sum_{x} x \, p(x)$$
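
For instance, the expected value of a fair six-sided die roll, sketched in Python:

```python
# Each outcome weighted by its probability; for a fair die every p(x) = 1/6.
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6
expected_value = sum(x * p for x, p in zip(outcomes, probs))
print(expected_value)  # 3.5
```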

Entropy is the expected information content of a distribution $p$:

$$H(p) = \mathbb{E}_{x \sim p}\left[I(x)\right] = \mathbb{E}_{x \sim p}\left[-\log p(x)\right]$$
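
Putting the two together for a discrete distribution gives a direct implementation (a minimal sketch; the `p > 0` guard reflects the convention that $0 \log 0 = 0$):

```python
import math

def entropy(probs) -> float:
    """Shannon entropy in bits: the expected information content."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
print(entropy([0.25] * 4))   # uniform over 4 outcomes: 2.0 bits
```

Notice that the uniform distribution maximizes entropy for a given number of outcomes, while skewing the distribution toward one outcome lowers it.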

This is more typically written as an integral over the support of $p$:

$$H(p) = -\int_{\mathrm{supp}(p)} p(x) \log p(x) \, dx$$
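
For a continuous density, the integral can be checked numerically. Below is a sketch, assuming a standard Gaussian and natural logarithms (so the unit is nats); the closed-form value $\tfrac{1}{2}\log(2\pi e) \approx 1.4189$ provides a reference:

```python
import numpy as np

# Approximate -∫ p(x) log p(x) dx for a standard Gaussian via a Riemann sum.
x = np.linspace(-10.0, 10.0, 200_001)
dx = x[1] - x[0]
p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
print(np.sum(-p * np.log(p)) * dx)     # ≈ 1.4189
print(0.5 * np.log(2 * np.pi * np.e))  # closed form for comparison
```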