Entropy and, perhaps even more so, mutual information are invaluable tools for analyzing properties of probability distributions, especially in coding theory. While general definitions of both concepts exist for arbitrary probability distributions, these tend to be hard to work with, and the literature (e.g., [CT06]) focuses on either discrete or continuous random variables. In this ...
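For concreteness, we briefly recall the standard definitions in these two cases (as in [CT06]; the symbols $p$, $f$, $X$, $Y$ below are illustrative, not taken from the text). For discrete $X$, $Y$ with joint probability mass function $p$,
\[
H(X) = -\sum_{x} p(x)\log p(x), \qquad I(X;Y) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)},
\]
while for continuous $X$, $Y$ with joint density $f$, the analogous quantities are the differential entropy and mutual information
\[
h(X) = -\int f(x)\log f(x)\,dx, \qquad I(X;Y) = \int f(x,y)\log\frac{f(x,y)}{f(x)\,f(y)}\,dx\,dy.
\]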